
Monday, March 22, 2010

Q&A With Student from King's College

From time to time, I get questions from students who have an assignment to speak to someone in the field. I think those kinds of assignments are wonderful because they link pre-professionals and people learning about the field with people who have been in the field for a while. Those types of introductions and relationships keep the "old timers" fresh and introduce new ideas and ways of looking at things to the people asking the questions.

Here are some questions and my answers from a student at King's College.

1. As a professional in the field of Computer Mediated Communications, when did you first realize the massive potential of new technology? How did you initially go about utilizing the technology?

The first time I realized what technology could do for learning was during an internship at a place called "Applied Science Associates." I had gotten the internship at, unbeknownst to me at the time, an instructional design company. I was asked to do quality control on a computer-based learning program. The program was very elementary: a screen with one question in green text and multiple-choice responses. I knew, even at this crude level, that learning could take place. This was also the time when "virtual reality" was getting a lot of press. I thought that if the two were combined, the learning potential would be unlimited.

Then, a short time later in graduate school, we were introduced to an early version of HyperCard, a computer flip-card program that allowed for linking and branching. At that point I knew that online learning would be revolutionary and that it was just a matter of time. I learned as much as I could in graduate school about designing instruction.

A short time later, at a company called Telesis Computer Corporation, I was able to purchase an early authoring tool called IconAuthor. While the tool was complex and detailed, it held such promise for computer-based instruction. I realized back in the late 1990s that PowerPoint had the potential to be an effective authoring tool.

In fact, at my interview at Bloomsburg University, I mentioned that I thought PowerPoint was going to become a formidable authoring tool. Of course, at that time, the faculty laughed at me and thought I was crazy, but time proved that PowerPoint could be used as an authoring tool (see Articulate). You can learn more about how I got started in the field from an interview I did with Chip Ramsey at Intellum.

2. Your book, "Gadgets, Games and Gizmos for Learning" highlights an often overlooked fact about video games and other technological devices: their educational value. How do you explain such important values to "non-techies?"

Trying to convince non-techies of the learning potential and value of video games for learning is a daunting task, especially when professors at some universities are destroying computers and banning them from their classrooms, reinforcing the perception that traditional academic institutions are "out of touch" (as I mentioned in my post 20,000 US Hackers Wanted...Creating the Computer Elite (or Failing at it)).

So it is a real uphill battle. The best method I have found is to draw a person in with a demonstration and show them that learning is occurring, then let them play the video game and experience the learning for themselves. That is the best way to convince someone. Of course you can also point to numerous studies and to the fact that the military has been using games for centuries to teach. Remind them that the visual cues of games, the need to think quickly, and the immersive qualities all create an effective learning environment. But the bias against video games runs deep and is visceral. I just keep in mind that at one time calculators were regarded the same way: evil devices that robbed learners of their capacity to think. Educational institutions have almost gotten over that bias, so...in the future...they'll get over the video game bias as well, especially when organizations like the National Science Foundation are backing video games for education.

Additionally, I position video games as a piece of the learning process, not the entire learning process. We know from research that distributed practice, or learning a little bit at a time, is an effective method, and video games can play a role in that process. I never think one learning method is the only method; I really believe that multiple channels for learning are most effective.

3. You have blogged and spoken many times about the program Second Life, which we are currently in the process of registering for in class. One of your most recent blog posts talks about the importance of customizing your avatar. How do you view the relation between an avatar - someone's online personality - and their real-life perception of themselves? Is the "sense of self," as you refer to it, different in people utilizing 3-D environments such as Second Life and other video games?

Traditionally, the big difference between video games and a metaverse like Second Life is that in a metaverse you are interacting with other people who are controlling the avatars you encounter. Now, with many video games, you can play against others who are inhabiting the same "game space" as you, and so the line between playing a video game character and playing an avatar in a metaverse is blurring. In a game, you are typically playing a character in a "story"; in a metaverse, you are playing a version of yourself interacting with others. So the difference is that in a metaverse, more of you is in the avatar, since it is not a character.

As you customize your video game character, or the avatar through which you meet other people in a metaverse, you are vesting part of yourself in the environment. When you interact, you feel as if you are interacting on a more personal level.

4. How, in your experiences, has education evolved with the implementation of technology? Do you feel enough educators, from elementary to university levels, are well-versed enough to properly use it in the classroom? Where, if anywhere, is such instruction lacking?

Education is the last great institution untouched by technology. For years, technology has been taught as a separate item from other subjects. You take a "computer class," but you never really use technology as part of an Art, History, or English class. That concept is slowly starting to change, but not very quickly. Online education, ironically pioneered by for-profit institutions, is slowly becoming more and more of a game changer. With more access to broadband and more credibility given to online education, education will start to evolve. Adding pressure is the untenable cost of higher education. The era of online, technology-based education is heading toward our institutions at an increasing velocity, so teachers, college faculty, and administrators must fundamentally re-think educational models. That isn't happening. The educational model we have in schools, from pre-school to PhDs, is the same model we used in the 1800s. We need a model where education is intermingled with life experiences and students are not divided by subject or grade but by interests.

Just as we need fundamental changes, traditional organizations (unions and administrators) are retrenching, and we see a backlash against technology. The same sort of thing happened when factories were automated and when knowledge work became automated. I'm not saying automation is good for the sake of automation; what I am saying is that we need to add more technology to our educational mix and make it more fundamental and core than just a couple of electives. As society embraces technology for everything from bill paying to looking up healthcare information, our educational institutions must change the classroom-centric model to be more learner-centric at all levels. This is a rethinking that few are undertaking. Institutions primarily exist to perpetuate themselves, and no institution is better at it than education (except maybe our government).

The problem is that too many people see technology as the answer and miss the need to intelligently apply instructional strategies with the technology. You can't have effective instructional technology without carefully designing and crafting it so learning occurs. Many early ventures in technology-based education were technology-centric and then needed to change their focus to delivering instruction effectively.

5. What does the future of CMC hold? In your book you mention the importance of workplaces adopting technology to suit their needs for training, organizational development, etc. What values must people take away from instructional technology to fully adapt it to such environments in the future?

The future is just now beginning to be explored. With augmented reality applications and interactive applications like Xbox's Project Natal, the future is exciting and limitless. What we need to take away from instructional technology are three things.

First, never forget the instructional part: what makes instructional technology effective is the proper application of instructional strategies to aid and assist learning. Blindly implementing technology does not equal learning. It must be applied intelligently.

Second, just because something has been done a certain way for centuries doesn't mean it shouldn't be changed. As the world moves and changes, so should education. No model is unassailable.

Third, technology is a tool, and we should not become enamored with it. Use technology to reach an end goal, not as the end goal itself.