If the ILXD Model emphasizes one thing, it's the importance of empowering the learner to be involved in every part of the learning experience. From needs analysis to product development, through intention, immersion, empathy, and learning, the learner must be fully engaged at every stage.

That's great on paper, but we've found that in the design / Intend layer of ILXD, between project scope limitations, tight deadlines, and strained resources, achieving ideal learner involvement is challenging.

In a recent project, our client needed six small interactive learning activities (I hesitate to call them games since there is limited opportunity for failure and they're not scored) that would help emphasize key course concepts. We were repurposing some previously developed gaming content in a limited time frame. Alongside the client, we developed learning goals, outlined the activity structure, created templates, and started developing.

Notice anything missing here? We certainly did. At no point in that initial design phase did we have an opportunity to consult actual learners. We requested a playtest with learners as soon as possible, but because of the short development time frame, the products were essentially done (developed, reviewed by the client, and revised) by the time we were able to observe the learners' experience.

I'm coming to playtesting as a novice. In my many years as a teacher, instructional designer, curriculum developer, etc., I was never part of a team that playtested. Even when I developed educational games years ago for a well-known publisher, our playtesting was extremely limited, solely bug / functionality-related, and not something I, as a designer, was a part of. My experience is not unique. The needs assessment that most instructional designers complete at the beginning of a project is through the lens of the client, not the learner. Almost always, the first time any learner sees a course is when they enroll in it as a student. It's rare to engage learners so thoroughly in development because we still see learning as a product meant to deliver what the instructor wants the learner to know, when courses should instead be educational experiences that empower learners.

Playtesting is common in game development because the product will eventually go to market. For it to make money, it has to be good. It has to satisfy the buyer, and game developers can't just guess at what will work; they have to be sure.

Why shouldn't all learning experiences rise to that standard? Don't we want them to be as good as possible? We need to go beyond satisfying the middleman / client / school and draw on the incredible resource of the learner's experience. Playtesting is an integral part of that process.

Here's what I've learned. A playtest is a process in which learners/players test a new game for design flaws, bugs, and efficacy before publication. The goals of an effective playtest include the following:

  1. Gather empirical evidence about how the game is played.
  2. Obtain data about the usefulness and success of the game.
  3. Learn what learners need and want out of the game.
  4. Discover flow blockers and functionality issues.

In this instance, our 43 testers were split into two sessions in the computer lab on their school campus. They were rewarded with pizza and a gift card raffle, but they were intrinsically motivated to help because of the nature of the learning (Games! Yay!) and because they are engaged students in a demanding program. The vast majority of our testers had little to no gaming experience.

We handed out scoring sheets asking students to rank various aspects of game design and content. We circulated throughout the room, observing play, fielding questions, and drawing out our testers' ideas, wants, and needs.

The results were overwhelmingly positive. It was inspiring to see how game-based learning engages and motivates learners to master key concepts. It wasn't surprising that students picked up on a whole host of issues and had ideas that none of our internal reviewers / clients had. They enthusiastically suggested functionality and flow reworks, identified content improvements, and deftly analyzed how the experiences could and should be used in coursework. We walked away with our heads buzzing with new ideas and long lists of future improvements we'll make to these products.

In an ideal world, and in the process ILXD argues for, the learner is involved from the very beginning. In truth, we should have invited students of all skill levels who had completed the course to the very first brainstorming meeting, where we identified pain points in the course and opportunities to engage learners. We'd look at performance data on objectives and topics to determine where students struggle most. Then, with a team of learners, we'd collaborate on designing personalized learning experiences that are flexible and provide enough choice for learners to master concepts based on their needs and skill levels. We'd develop the learning experience and playtest with learners while our clients review the material. This would allow for more comprehensive revision and rework so the learning experience is truly market/learner ready.

We're setting up our next projects with this model in mind. We'll continue to strive for increased learner involvement throughout the process, and while we accept that not all projects will allow for such thorough involvement, we're confident that the experiences we create when we do follow this model will be better than anything we could have created without our learners.