Monday, August 30, 2010

Tandem Learning Innovation Community: 1st event recap

Just a quick post to thank everyone who attended our first official Tandem Learning Innovation Community event. We hosted the event on our new Second Life island and I'd encourage all of you who were unable to make it to stop by any time and take a tour. The island is open and public; we'd be happy to share the space with any member of the TLIC who'd like to run a meeting in SL. Feel free to contact me if you'd like a guided tour or assistance with using our browser displays.

I can't thank Earth Primbee (SL name, for those not in-world) enough...he did a fantastic job building the island to suit our needs, and he completed the entire build in just a few days. Amazing. He also served as tech support and transcriptionist during the event. Big kudos on an amazing job and for all of his help!

We had about 15 attendees for our first event, including people from academia and corporate training, consultants, and designers/developers. Although we kept the event informal, we spent some time doing introductions and talking about topics we'd like to see discussed in future TLIC events before letting people explore the island.

One of the topics discussed was what new technologies community members would be interested in learning about through TLIC events. Suggestions included:
  • OpenSim
  • Unity3D
  • Alternate reality games (ARGs)
  • Jibe
  • Augmented reality
  • Geolocation technologies
The consensus was that community members would be more interested in hearing about case studies in future events than in seeing new technology demos. We also discussed what major issues organizations are facing in implementing new technologies. The major issues raised included:
  • access to the technologies (firewalls, hardware, etc.)
  • interface design issues 
  • ease of use
It was great to see our friends in avatar form and meet new community members who provided great feedback and insight. Plans are already underway for our next event and we're looking forward to some additional announcements to the community in the next few weeks. 

In the meantime, we'd love to hear your thoughts on what new technologies you'd like to find out more about and what types of programming for future events would be of interest to you!

Monday, August 23, 2010

Games for assessment

Another Twitter conversation, another blog post...another thought-stream based out of ADL's Implementation Fest #ifest (Srsly, this is the most a conference has inspired me to write in a long time. A good, and bad, thing...).

So, let's talk about the opportunities and problems with using games for assessment. Especially when you throw the "s" word in there..."standardized" assessment.

Deep breath.

First, I'll preface by saying that games are a natural environment for assessment; in essence, they are assessing your performance just by the nature of the game structure itself. Unless, of course, there aren't clear success metrics and you "win" by collecting more and more meaningless stuff (like Farmville)...but that's a whole other topic. So let's assume there are success metrics built into the game and that those metrics align with your learning objectives. It's logical that by having someone play a game, you'll see how well they know something or know how to do something. Right?

Nothing is ever that easy. There are lots of aspects of game play that depend greatly on how the game was designed. For one, games have an intrinsic layer of cognitive overhead that may not exist in real life. For example, as I've been learning how to play Call of Duty 4, I first have to master the use of my PS3 controller. No, this isn't a learning game, but the same principle applies...it's why real guitar players get irritated playing Guitar Hero. There are skills you need to develop to be successful in a game that don't exist in real life, or that don't mirror the skills necessary to be successful at real-life tasks. This becomes especially clear in first person shooter games, where your ability to operate your game controller does not directly translate to being able to accurately fire an automatic weapon in a combat environment. In using games for assessment, you run the risk of assessing how well someone plays the game, not how well they have mastered the real skill or content you are hoping to assess.

Another issue with games for assessment is the gender differences in how people play games. I'm about to talk about broad generalizations, so bear with me and recognize that some women game like "guys" and some guys game like "girls"...but there are different ways that people approach game environments and those differences do tend to follow along gender lines. Men are bigger risk-takers and explorers; women like to be guided, understand the environment, and follow the rules. Depending on how you design your game, you risk alienating a whole group of players if you don't consider the gender differences in game play. Worse, if you are using games for standardized assessment, you could be putting about half of the people you are assessing at a disadvantage just by the nature of the game design. Given the general acknowledgment that standardized tests are racially and class biased, adding a layer of gender bias in the game design risks making the concept of "standardized" even more meaningless.

Do I think games can be used effectively for assessment? Yes. Look at surgical simulations, flight simulators...close approximations of performing tasks in real life. Research has shown that successful performance in these simulated environments correlates with successful performance at the actual tasks. Where you can mirror game performance to real performance in this way, I think games are a brilliant and useful form of assessment. But without careful design, thoughtful reflection on what the game environment adds to assessment, and what the trade-offs are with other forms of assessment, we risk creating another assessment environment that falls short of measuring true capability, potential, or performance.

Sunday, August 22, 2010

America's Army and gender bias in game design

Coming out of the #ifest Twitter stream, I once again heard how America's Army was a shining example of how games had been used to improve recruitment efforts. I posed the question...has it improved recruitment of women at the same rate it has improved the recruitment of men? So far, all I've heard back is *crickets*.

For two weeks, I have been looking for data or research on America's Army that mentions gender as a research parameter, but so far, I've found nothing. If you know of any research, I'd love to see it. My hypothesis? Recruitment of women was not as greatly improved after they played America's Army. If that's the case, what does that say about the relative value of recruiting women vs men into our military?

What makes a game successful? Is it ok for public institutions (government, schools, etc.) to measure the success of a serious game without looking at differences in outcomes along the most basic parameters (gender, class, race)? Is it ok to say a game is successful in achieving its goals if we don't consider those issues as part of the discussion?

I'm tired of hearing the marketing spin and the hype around how games can change the world if we're not even asking the most basic questions about WHO games are changing and HOW they are changing them. You won't find a bigger advocate than me of games for learning, and of games as a vehicle to raise awareness and support behavior change. But not all games are created equal. We have to be vigilant and constantly question our design to ensure we're achieving the outcomes we seek. Ignoring questions of gender, class, and racial bias in serious game design makes me question the motives of the design itself and the motives of those promoting a game's "success."

As always, I welcome anyone's comments who can prove me wrong...

Tuesday, August 17, 2010

Announcing the 1st Tandem Learning Innovation Community event!

We are pleased to announce the first official Tandem Learning Innovation Community event, scheduled for Friday, August 27th, 2010 at 9:00 am SLT/ 12:00 pm EST. We’ll be hosting an open house on our new Second Life island. The official networking event, led by Koreen Olbrish (SL: Nina Sommerfleck), will be an informal discussion of community topics, including:

  • New technologies for the TLIC to explore
  • Major challenges in technology adoption for the community to address
  • TLIC at DevLearn 2010 – call for interested parties to be showcased

Please let us know if you plan to attend by emailing Jedd Gold via LinkedIn. We will be sending out the SLURL the day before the event to everyone who RSVPs.

If you haven't yet joined the Tandem Learning Innovation Community, you can request to join here.

We are looking forward to seeing you there!

Thursday, August 12, 2010

What makes a good learning "tool"?

Even though I'm attending through the Twitter stream, ADL's Implementation Fest #ifest is getting me fired up about some learning technology industry issues that just can't be explored in 140 characters. For example, yesterday there was some lively conversation around the usefulness of learning tools.

And I, in a rash statement, said that most learning tools suck.

But let me clarify, because there can be a broad definition of what a learning tool is.

For me, a learning tool is not what I use to design learning experiences (those things might include pen and paper, whiteboard, PowerPoint, Visio, etc.). A learning tool is NOT a reference tool like Wikipedia. Wikipedia is an information portal where you can go, read, and maybe learn something new...but it was not designed as a learning experience. It does not facilitate learning, even though it can enable it. Can you learn from a reference tool? Sure! But good reference tools have good user experience design, not instructional design, which makes them reference tools and not learning tools. There IS a difference.

A learning tool, to me, is something that you use to develop a learning experience. In other words, a tool that allows you to "design" a learning experience and output it...voila! A learning experience. Input = content, output = training. And here's why I think most learning tools suck.

Most tools intrinsically limit what you can design through their functionality. Let's take PowerPoint. What you're going to get is slides. Pretty didactic. Maybe a little video embedded, some nifty animations...but you're not going to get much in the way of learner interaction.

But now I'm going to ask you a question...have you ever learned in a workshop that was guided by a PowerPoint slide? Have you ever been in a learning environment where PowerPoint was the primary learning tool, but the content, activity, and discussion actually taught you something? I'm going to guess yes. Maybe you've even been lucky enough to be in a session guided by PowerPoint that made you do something differently when you left. You know what that is? GOOD DESIGN. It's not the tool. It's how you design learning experiences that facilitates learning, not the tool that you use.

So what makes a learning tool "good"? Openness. Flexibility. Interoperability with other learning tools and reference tools.

What makes a tool bad? One that dictates design. I could list some specific examples, but I'm betting you know what they are. Online learning development tools would be a great place to start.

One of my favorite quotes from yesterday's Twitter discussion was from John Campbell @jpcampbell :
what's ur expected output from tools? Learning Content? Why ask the architect to output a house?

Which is my point exactly. Instructional design and learning technology development are two different skill sets. Instructional designers are the architects and technology developers are the builders. You shouldn't build a house without an architectural plan, nor should you expect your architect to go ahead and put hammer to nail to bring his plan to life. There's an essential relationship here that too many organizations neglect to recognize, instead hiring IDs to build their training content using some rapid development tool. Most organizations are guilty of this in some way..."Put together a PowerPoint-led workshop!" "Import our workshop content into a virtual classroom!" "Create an Articulate module!" "Make video clips accessible from a smart phone!" This isn't a fault of the tools, it's a lack of awareness of the importance of design. As an industry, we should NOT be designing learning experiences dictated by what tool we have (if you're a hammer, everything looks like a nail) but by the appropriate format to support the goal, supported by the appropriate design for that format. Instructor-led, game-based, online, mobile, print...they are ALL good formats when they are appropriate for the content, designed appropriately, and appropriate tools are used to develop them.

And that's why I think most learning tools suck...because they neglect to recognize the difference between design and development, and the default tends to be development at the expense of design.

Tuesday, August 10, 2010

The opportunity for government in the training industry

I've been following the ADL Implementation Fest #ifest stream on Twitter today and some of the conversation with my PLN has sparked some thoughts, maybe perspective, on how, or where, I see the government being able to lead the way in training. And, what prompted me to write this post, the ways in which it's misdirecting its energies.

First, let me say, there are some great examples of people in government doing things the right way. Just from my immediate experience, Dr. Alicia Sanchez, who works for DAU, is the games czar who is helping integrate gaming into their curriculum. Mark Oehlert, also at DAU, is integrating social media technologies to support learning and knowledge management. Judy Brown at ADL is an industry recognized expert in mobile technologies and how they can be leveraged for learning. (Just realized, ironically, that these three will also be showcasing their knowledge at DevLearn 2010. You should go.) These three people, who happen to be people I know and respect, understand the unique positions they hold, and their opportunity to leverage technology for innovative applications. In short, they recognize that they have the chance to DESIGN really cool applications of existing technologies within the government and talk about how these projects are helping to improve learning, collaboration, and communication.

What I'm hearing out of iFest (so far...it's the first day...;) is that the focus is still really on what technology can do for you and what technology initiatives ADL has been focusing on. To which I say...REALLY?!?! Sigh.

I don't need or want government agencies to fancy themselves technology companies. They aren't a startup, nor are they Microsoft. In short, there are companies who actually do that. And those companies need to make money doing it, which means that they need to build things that the market needs (even if the market doesn't want it...that's a totally different thing...).

What I'd love to see is government agencies starting to look at what REALLY helps support learning...good design. I'd love if they saw themselves as master implementers, not builders. Our government has tons of people who need training, and it's got tons of money and resources...why not leverage those? Try innovative solutions. Experiment with design. Conduct research to establish best practices. That's what the learning technology industry needs. The government could LEAD this. But for the most part, it's not.

If a technology is needed, the market will push it, because that's what the market looks for: meeting unmet needs to make money. I'm tired of hearing about how a technology the government is developing is going to solve some problem. Let's face it, even Google has struggled with implementing innovative technologies (see: Wave, Lively) and that's their business...it's what they do to make money...and they are arguably the best at it.

I'm hoping that as I hear more from iFest, the focus shifts to design. Fingers crossed. If not, it's an opportunity lost...

Admitting limits when you're a "no limits" kinda girl

Whether or not we like to admit it...there actually IS a limit to what we can accomplish in the little time we have. I'm not saying that our potential is limited...I'm saying that we have capacity issues. There are only 24 hours in a day. We actually need to sleep, eat...we can't just keep going full steam ahead, full out running, all of the time.

Oh, and then there's what other people bring into the equation. We can't control that either, and even the best laid plans sometimes get blown to bits because...well, we're social. We're complicated. We have all kinds of quirky relationship dynamics that guide our actions. And unless you lock yourself away to work away at your goals...sometimes life gets in the way.

So what happens when grand plans meet daily necessities and bump into unexpected events? Limitations.

And this is where dreams die.

The truth is, you can't do EVERYTHING. You need to prioritize. You need to figure out what's really important. And goals...dreams...visions of the future...they can easily get lost in this process. Try to do too much and chances are, nothing gets done. If you don't plan for your priorities, they get lost in the shuffle.

I've been working on admitting limits and setting priorities. Wow, it's not easy. It's a process, for sure. I can easily convince myself to give up sleep. To work on vacation. To commit to doing more, to helping, to doing what it takes to get stuff done.

But then, of course, other things don't get done. And some of those things are really important to me. Frustration sets in. Less sleep. Distraction. And everything is affected.

How do you get past the limits?

1. Define your goals
You just aren't going to get there if you don't know where you're going. Write down your goals, make a plan to achieve them, and be realistic about what it's going to take (time, resources, etc.).

2. Enlist help
I'm horrible at this. You need people not only to support you, but sometimes to actually help you achieve your goal. In fact, most things you CAN'T do alone. So ask and get people on your team.

3. Give up control
If I'm horrible at asking for help, I'm downright abysmal at giving up control. But once you get people to help, you have to let them help. So give it up. Seriously.

4. Prioritize (really, really)
This is the part of the process where you look around at what you're doing and figure out if it's getting you where you want to go. Are your actions supporting what you want to achieve? If not, change what you're doing.

5. Pace yourself
Be patient but not complacent. Don't confuse the two. Patience means understanding the realities of how much time things take. Complacency means letting things take too long...and then maybe never happening at all. So be patient, realistic, and don't burn yourself out. Burnout is just as dangerous as complacency.

Sometimes you need to sprint, but sometimes you really are running a marathon. There are different limits to each and the winning runners acknowledge those limits and work within them effectively.

Your dreams shouldn't have limits, but all of us do. The only way we can reach our dreams is to be honest, strategic, and tenacious about who we are and what we want.