Should Instructional Designers “Teach to the Test”?


There is a lot of angst these days in the education field about “teaching to the test.” It started in K-12, but it has crept into corporate speak as well. Some say that tests are no longer relevant; they are viewed as holdovers of an out-of-touch education system. A growing bandwagon of people say they want to help learners problem-solve and think critically… and not just memorize facts.

In the corporate world, people really do need to recall facts to do their jobs well. There are plenty of times when being able to “Google it” is not enough: they need to know it if they want to perform their job efficiently and/or safely. In compliance and safety situations, we need some objective verification that they know it before they are allowed to perform the job. This is needed both to satisfy OSHA regulations and to protect the employee and the business.

Case in Point

I sat on a materials review call for a course we are developing for the healthcare industry. This particular scenario asks quite a bit of the learner:

  1. They need to be able to recall the steps for performing a variety of tasks.
  2. They need to select the appropriate tools for specific jobs.
  3. They need to be able to correctly put on personal protective equipment (PPE) when entering spaces where high-risk infections are present.
  4. They need to know what protective equipment is required in specific situations, which means they need to recognize different signage located outside patient rooms.

The entire course concludes with a certification test. The test links directly to what this role needs to know… and know how to do. I was concerned to hear a materials reviewer push to add course content that was not going to be part of the test. This reviewer said, “We need to go beyond teaching to the test.” The implication was that we would fail the learner if we only included content that will be on the test; in essence, that we should give them a smorgasbord of information and heighten their competency by doing so.


What’s Wrong With The Test?

We act as if it is shameful to “only” teach to a test, but why? I suspect many of us believe we are dumbing things down if we focus solely on a test. Perhaps we are afraid that teaching to a test limits our ability to deliver a rich, meaningful experience that elevates the general abilities of the learner. Too often, we want to turn people into the experts we are rather than arming them with the basic proficiency to do their jobs well.

What’s the risk? If we mix nice-to-know and need-to-know content, learners will likely experience cognitive overload. Worse, we risk them remembering some of the irrelevant information at the expense of the most relevant information.*

What does “Good” Look Like?

A good test should be an accurate assessment of the body of knowledge learners need to know to perform their jobs. If appropriate, it should also assess the skills people have or their decision-making ability when judgment is a component of executing the job. It should only assess the knowledge and skill required to do the job. Courses that are designed to teach steps, processes, and the “whys” behind those steps and processes need to keep their focus laser-sharp. People can only remember so much.

If it is essential that workers recall a specific body of knowledge and apply that knowledge to the execution of a set of procedures and processes, then please don’t include anything that is not essential to them.

The problem is not tests. The problem is bad tests. Bad tests contain irrelevant material. Bad tests are poorly worded. Bad tests are too easy or too hard. Bad tests are not comprehensive; they fail to cover all the knowledge and skills critical to a job or situation.

Please do teach to the test… But only if you want to verify that people gained the skill and knowledge you have defined as essential to successful performance of the job.


* Ruth Clark and Richard Mayer, e-Learning and the Science of Instruction (San Francisco: Pfeiffer, 2011).


Once Is Not Enough: How to Playtest Custom Learning Games


I believe playtesting is a crucial, and often overlooked, part of learning game design. It takes multiple iterations to determine the right combination of game mechanics and game elements for your target learners. Whether you are an experienced game designer or an instructional designer trying to design a game for the first time, your game will benefit from multiple rounds of playtesting.

Of course, if we are going to tell everyone else to playtest their games, we have to do so as well! The following is the step-by-step story of how we created and playtested a recent learning game for a client.

Example: Serious Game for Healthcare Workers

We recently created a game called “Five Star Facility,” targeted toward environmental technicians who work in healthcare settings. These technicians clean the patients’ rooms and other areas of a hospital or long-term care facility. It took four iterations to arrive at the game we wanted and the one players could learn the most from.

We reached a great game in the end, but it took good playtesting and iteration to get us there. Our stellar design team of Amanda Gentry, Matt Kroeger, Kristen Hewett, and Erika Bartlett did a terrific job!

Version One – Let’s create something “sort of like Clue”

This version bore a strong resemblance to the commercial game Clue. The design team felt the target learners would be pretty familiar with Clue’s rules and core dynamics (exploration, collection), and they wanted a game that learners could quickly learn to play. Gameplay was competitive. The game goal was to be the first person to collect all the room tokens, which represented all the categories of information players needed to learn and remember. Players rolled a die to determine how many spaces they moved on the board. Each space corresponded to a different category of environmental protection/cleaning. Players had to answer questions related to whatever category they landed on. Similar to Clue, they had to go into each room on the board. Answering a question in the room earned them a token for that room. The first player to earn all the tokens won the game.

First version of the Five Star Facility game.

It wasn’t a terrible game design… but it was just okay.

Problems With the First Design:

  1. The game could drag on if players rolled a string of low numbers.
  2. The designers made the game competitive. In the real world, environmental technicians should behave cooperatively with each other and with the healthcare team as a whole.
  3. This first rendition ignored the “why” of the environmental tech’s job and didn’t help them see the connection between what they do and how the healthcare facility gets dollars to stay in business. Survey ratings determine the reimbursements healthcare facilities receive from Medicare. If your facility’s aggregate survey ratings are only three stars, you do not receive the same dollars as a facility that received five-star ratings.

Version Two – Scrap the Clue idea. Let’s race to the finish.

The icons are gone, the die is gone, and we have a path to travel and monetary targets to reach. This version stunk. It was boring and tedious to play. Players simply took turns drawing cards to try to reach the target dollar amount. They worked together to answer the questions, but when the designers switched from competition to cooperation, they failed to include game mechanics that created any conflict or tension among the players. There was no lose state and nothing really bad could happen. This version was quickly ditched.

Version Two: Lots and lots of dots.

Version Three – Bring back the game icons. Add in a progress tracker.

Version three was much better! The team latched on to the realization that five-star ratings led to better reimbursements. Now players had to secure at least $70K in reimbursements to win… and mistakes would push their survey ratings downward. This was better, but there was still a serious flaw: players’ dollars didn’t go down when they made mistakes; only their survey ratings did. In the real world, these are tied together. We also discovered as we played that we needed to word our questions more carefully to eliminate ambiguous responses. On the plus side, the discussion team members had before deciding on a correct response was phenomenal. Lots of learning happened in these discussions.

Version 3: We’re getting closer to the final product.

Version Four – We have a winner!

The final version of the game was the winner. Look at how we tied together survey ratings and reimbursement dollars. Players start with a 1.5-star rating and $30K in reimbursement dollars. To move to the right and earn more dollars, they have to enter a room and respond correctly to that category’s question. They still roll a die to move a team token around the board. If they land on a space outside of a room, they have to answer a question that corresponds to the icon they land on. A correct response allows play to progress to the next player with no adverse event. An incorrect response forces players to move to a lower survey rating. If they hit the zero-stars spot on the game board, the game is over and the team loses. If they earn $10K from every “room” on the game board and achieve at least $70K in reimbursement, they win.
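For readers who like to see the rules as logic, here is a minimal sketch of the final version’s win/lose loop in Python. The dollar amounts match the description above, but the half-star penalty, the 70% answer-accuracy assumption, and all of the names are illustrative assumptions rather than the actual game materials.

```python
import random

# Figures from the game description; the penalty per mistake is an assumption.
START_DOLLARS = 30_000
START_RATING = 1.5
ROOM_VALUE = 10_000      # each room's question is worth $10K in reimbursements
WIN_DOLLARS = 70_000     # reach at least $70K to win
RATING_PENALTY = 0.5     # assumed drop in survey rating per incorrect response

def play_turn(state, answered_correctly):
    """Apply one team turn: a correct room answer earns dollars,
    an incorrect answer pushes the survey rating down."""
    if answered_correctly:
        state["dollars"] += ROOM_VALUE
    else:
        state["rating"] -= RATING_PENALTY
    if state["rating"] <= 0:
        return "lose"    # zero stars: the facility fails and the team loses
    if state["dollars"] >= WIN_DOLLARS:
        return "win"     # enough reimbursement dollars: the team wins
    return "continue"

state = {"dollars": START_DOLLARS, "rating": START_RATING}
outcome = "continue"
while outcome == "continue":
    outcome = play_turn(state, answered_correctly=random.random() < 0.7)
print(outcome, state)
```

Written this way, it is easy to see why the design finally worked: every turn either moves the team closer to the $70K win condition or closer to the zero-star loss, so tension is built into the mechanics themselves.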

Version four: By George, we’ve got it!

Lessons for learning game designers:

  1. Make sure your choice of a competitive game or a cooperative game mirrors the real-world environment. Do not have people competing in a learning game if their real-world context requires cooperation or collaboration to be successful.
  2. Be aware that competitive games do not tend to be as powerful a learning experience as cooperative ones. In competitive games, only one person or team wins. The “losers” can disengage from the experience entirely if it is not managed well.
  3. Make sure the game mechanics (rules) and game goal complement – or at least do not detract from – your real-world situation.
  4. Make sure your game includes enough “tension” to keep things interesting. Interesting translates into “fun.” If there is no realistic chance of losing the game, it becomes boring to play.
  5. Don’t be content with the first version of your game; it will not be the best version.
  6. Don’t playtest once, identify changes to make, and then fail to playtest again to verify that those changes improve the gameplay and learning experience. You have to test every time you make a change.

Can Micro-Learning Help Stressed, Unmotivated Learners?


I’ve published two posts on micro-learning in recent months. One was on this site; one was done for ATD. Both generated discussion with some folks debating my assertion that we need to be very cautious about leaping to it. I’m going to stand by my assertion. I think “micro-lessons” can be great for some things; I do not think they are the answer to most things. And for learners who are over-extended and not motivated to learn in the first place, they are not the answer at all.

Will Thalheimer, someone I respect tremendously in the arena of learning science research and applying research to practice, wrote an extensive comment to my ATD post. He also linked to a post by Alex Khurgin, CEO of Grovo, a SaaS company that produces lots of micro-learning. Khurgin positions micro-learning as good for 21st century businesses. Khurgin’s blog is high-level and, in general, promotes micro-learning as the solution to the crazy pace that characterizes many of today’s organizations.

Here’s the thing. I feel like I am an example of the “C-suite” person so many say is the reason we need to shift to micro-learning. I do not own a Fortune 500 company, but I am a business owner who has concerns about maximizing what my team can do. My company has been named one of the top 25 fastest-growing companies in Indiana… and making sure our team members continually learn and grow is a key reason why we’re on that list. Their skill and knowledge fuel company growth.

Why Companies *Think* Learners Need Micro-Learning


Within my company (and probably many others), these truths all affect my team’s ability to learn:

  1. We’re stressed. Life is stressful, not just work. We all have a bazillion things to do each day and many people who need things from us.
  2. We face multiple interruptions each day. If we don’t discipline ourselves to ignore email, disconnect from instant messaging, or mute our phones, we can be distracted every few minutes, all day long.
  3. Time is limited. We never feel like we have enough time to get things done.
  4. We want to enjoy life. Most folks don’t want to work 60-hour weeks; we need our work – and our learning – to happen within the sanity of a 40- to 45-hour work week. Sadly, we don’t all hit the goal of 45-hour maximums, which makes carving out time for learning a constant challenge if it is not prioritized.
  5. Maintaining focus is HARD. New technologies and ideas are like squirrels, tempting us to run off in new directions all the time. We see these squirrels when we consume content on social media – checking out links sent via tweets,  perusing Zite, monitoring our accounts. We can get highly distracted just trying to “keep up.”

Micro-learning is identified as the answer to items 2, 3, and 5 from that list, but I do not believe it is truly “the” answer to any of them. It sounds great on the surface, but the root of the problem goes deeper.

Motivate, Focus, Repeat.

So what is the answer? I think it comes down to these things:

  1. Make sure motivation exists. Putting people who have zero desire to be learning into a learning situation is a recipe for flushing money down the drain. Motivation trumps almost everything else. Really great instructional designers can help with motivation, but only to a point. People need to perceive that learning the new skill matters to them in a significant way.
  2. Make sure focused time is available. Get clear on company priorities… and realize you can only execute on one priority at a time. Without time to focus, people cannot learn. Don’t think you can squeeze a learning experience into five minutes per day.
  3. Repeat to remember. Assume that people will need to have multiple repetitions to truly learn something. Repetition can be effective in short, continuous bursts… but I’ll save that discussion for another blog post.
  4. Make sure there are immediate opportunities to use what’s being learned. Without the immediate opportunity to apply, learning gets lost. This would be a second way you can flush money down a drain.
  5. Make sure someone else – besides the learner – cares about what someone is learning. Someone else, either a manager or co-worker, needs to inquire about what’s being learned. If people never get to talk about or reflect on what they are learning, the learning will be extremely limited and difficult to apply.


Right now, I am taking a 5-session MOOC (massive open online course) called Smart Growth for Private Businesses. It includes about 5.5 hours of lectures, several quizzes, and four case studies that are each 12-15 pages in length. I started the class about three weeks ago; I’m now three-fourths of the way through it and should have it completed in the next week or so. The lectures are organized into relatively small chunks, which are interspersed with 2-minute quizzes. The first session’s lectures are organized into “bites” of varying lengths that range from 2 minutes to 26 minutes. To date, I’ve invested several hours, and I think every hour has been hugely valuable.

My completion of this course hits the five ingredients I feel are necessary for learning:

  1. I’m motivated. My company is growing extremely fast; we want to control our growth to maximize the health of our company and its team members. We’re also getting ready to start another strategic planning cycle and I will use the info from the course as we execute this process.
  2. I’m finding time to focus because focus matters. I’ve clarified my priorities… and taking this course is one of those priorities.
  3. I am reviewing the content on my own and with others. I have taken notes, I’ve gone back and reviewed sections, etc. I’m repeating to remember… even using the white boards on my walls to write down key concepts I want to retain.
  4. I already mentioned the strategic planning we’re preparing to do. I have an immediate need for the content.
  5. Two others in the company are also taking the course. Having someone to talk to is huge in retaining the information and learning from it.

So… that’s not micro-learning as I’ve seen it defined. But I can tell you I am getting excellent results, and I absolutely do not believe I would be getting excellent results if this content was of low or even medium value to me and delivered in small, five-minute chunks each day.

What do you think?

I’d love to hear your thoughts: can micro-learning truly help a stressed, unmotivated employee learn?

5 Ways to Stop Training Content Overload


Product launch cycles are speeding up. Compliance and regulatory requirements are not getting any easier to fulfill. Your organization just implemented a new software tool and employees need to know how to use it. The XYZ process just changed… again. As an L&D leader, you are probably tasked with helping employees respond and adapt in an environment that is constantly shifting. How much training is “enough”? Are you overloading your learners with training?

In our Learning & Remembering Survey, first conducted in December 2014, 24% of respondents said that “content overload” was a huge problem in their organizations. Another 38% said that knowledge transfer and retention of content are their biggest challenges. Too many courses are being delivered… and learners cannot keep up with the amount of information being thrown at them. When learners are asked to take too much training, or remember too much content, many negative consequences can occur:

  • The organization is wasting resources on training that is not effective.
  • Learners are not remembering what was trained because too much content is crammed into each session. This leads to poor performance.
  • Stakeholders expect results from their investment in training, and they are disappointed when reality does not meet expectations.
  • Learners are fatigued by the high volume of compliance training they must complete, and unfortunately there is no way to reduce the number of required topics.

…The list could go on. In many organizations, large amounts of complex content and technical information may be unavoidable. Fortunately, there are many strategies you can use to reduce the burden on your learners. When used properly, these strategies will also improve the impact training has on your organization.

Strategies to Stop Training Content Overload

1. Less “tell.” More “do.” Consider ways to turn your content into a series of interactions. Look for ways to involve learners in the content instead of simply presenting it to them.


This Hazard Communication course removed the up-front screens with endless information and turned the content into a series of interactions. The course won a 2014 Horizon Interactive award.

2. Use a variety of media. When you have lots of information to deliver, doing so in a variety of different formats can lighten the load. When a client needs us to create a course that must include lots of information, we will frequently include animated videos and clickable interactions to break up the content.


Short videos can be used to present some of the necessary “tell” content.

3. Break the learning into small chunks. Research clearly shows that we learn best in small chunks, repeated over time. Leverage learning technologies that allow smaller bites of content to be delivered to learners over a longer period of time instead of creating a single eLearning course.


Knowledge Guru’s “Quest” game type allows levels to be played in small chunks. Learners can receive email reminders when a new level is available for play.

4. Make ILT interactive. Even in our technology-driven world, instructor-led training is still often the right learning solution. You just have to maximize it. In the customer training we developed for Roche, we created a robust facilitator guide and participant guide that broke up the day and incorporated lots of coaching and practice opportunities.

5. Personalize the learning. Don’t underestimate the power of technology to tailor the content learners receive. The customer training curriculum referenced above included an online pre-work module where learners choose their job role and take a course that only includes information relevant to them. Once on-site, Roche trainers use an iPad app to deliver a customized lesson based on customer role and product configuration. These on-site sessions are organized so customers only participate in the portion of training that is relevant to their role.


This customer training program created for Roche Diagnostics includes a personalized eLearning course and an app that allows instructors to deliver tailored lessons.

How to Create Award-Winning Training Solutions


The afterglow of our three 2014 Brandon Hall awards is starting to fade away. Yes, excitement can only last for so long… and we are hard at work making plans to submit for 2015 awards. Many of the clients we talk to care about awards and would like to have the prestige that comes from having them attached to their training efforts.

But if you’ve never submitted for an award before, how do you know your learning solution is award-worthy? And what do you need to include in the submission to draft a winner?

The “Secret Sauce”

If you want to receive awards for the training you create, you have to be able to articulate two things: value and innovation. Too frequently training functions cannot articulate the value their solution(s) delivered to the organization. They cannot describe a problem or how their training solution helped solve or at least reduce the severity of the problem. That’s the secret sauce.


The folks in L&D who do gather up all the fancy trophies and industry accolades have submissions that describe innovative solutions that solved a quantifiable problem. Think about it in terms of submitting a “before and after” story. Here are some examples:

  • Before we implemented training, we had numerous safety accidents on the XYZ machine. It was costing us $X and X days in lost productivity as a result of accidents.  After training, our accidents decreased by X%, our costs were $X, and our days of lost productivity decreased by X.
  • Before training, we were spending up to 24 months ramping up a new hire. Our team leads were self-reporting high levels of stress and our employee surveys indicated low job satisfaction for those in team leadership roles. After we implemented our new employee onboarding training program, we winnowed the ramp-up down to 12 months—a 50% decrease in the time required to achieve full productivity. While we do not yet have the results of the most recent employee survey, a poll of team leaders indicates that they perceive stress levels to be “significantly lower” than before we implemented the new onboarding program.
  • We have 90 people in the director of recruitment role in our organization. Before training, the annual employee turnover for the director of recruitment role was 30%. It cost our organization $18,000 for every new hire we had to make, which meant our annual spend on the recruitment, hiring, and training of this role was $486,000. After training, we were able to reduce turnover to 20% AND decrease the time to full productivity by 3 months. Our annual costs for recruiting and hiring decreased to $324,000, which is a 33% cost savings. (The math behind these figures is worked out in the sketch after this list.)
  • Before training we were spending up to 3 months at a customer site following installation of our product. After we launched the revamped customer education program, satisfaction ratings improved from an average of 3.75 out of 5 to an average of 4.5 out of 5.  In addition, we reduced time on site by 30 days. This resulted in a cost savings of $8,000 per customer.
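To make the arithmetic in the recruiter-turnover bullet transparent, here is a minimal worked calculation. It uses only the figures given in that example; the helper function name is a hypothetical chosen for illustration.

```python
headcount = 90            # people in the director of recruitment role
cost_per_hire = 18_000    # recruiting, hiring, and training cost per replacement

def annual_turnover_cost(turnover_rate):
    """Annual spend = number of people replaced per year x cost per hire."""
    return headcount * turnover_rate * cost_per_hire

before = annual_turnover_cost(0.30)            # $486,000
after = annual_turnover_cost(0.20)             # $324,000
savings_pct = (before - after) / before * 100  # ~33% cost savings
print(f"Before: ${before:,.0f}, After: ${after:,.0f}, Savings: {savings_pct:.0f}%")
```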

The common denominator in these stories is some form of data that identifies a problem, quantifies its impact to the business, and then quantifies the results obtained from implementing a training solution. Too often, training is not quantified.

So what can you do when you don’t have data? The simple answer is to get some, and here are some tangible techniques you can use to help you do that.

Probe more than once. Don’t accept the first answer as the final answer.

If you ask a subject matter expert or stakeholder, “What is the problem and how can you quantify it?”, avoid accepting an initial response that might go like this:

“We don’t have any actual numbers, but I’ve been hearing from the field that this is an issue. I’ve talked to our lab chemists and they tell me that they are answering the same questions over and over. They are sharing basic information that field reps should really know themselves. If they were able to answer questions, I know it would be beneficial to us.”

Your stakeholder or SME may well be right, but you should probe. If this is really a big issue that costs the company money, chances are that data is not as difficult to assemble as the SME thinks. Here are things you can ask to help quantify the problem and its impact on your company:

  • How many chemists are affected by a sales rep’s need to call for technical support?
  • Ask the chemists: In a given week, how many calls or emails from sales reps do you respond to, and what are the most common issues? How much time do you spend on this per week – 30 minutes, an hour, 2 hours?
  • Ask a handful of key distributors: How frequently do you ask a sales rep a product question that he or she cannot answer? If he or she cannot answer you immediately, can you quantify any dollar impact to your business? What about your perception of ACME as a supplier?

Ask the stakeholder to give you a dollar value that they would associate with whatever problem they describe to you. Ask them: Is it worth $10K, $20K, $30K? Why? What benefit will ACME get by implementing this solution?

Sometimes, the act of asking them to assign a value will help the stakeholder or SME realize they need more data before jumping to solution design. After all, the data might help create a better solution!

In truth, if it is worth $10K or less, then you are not looking at a very robust learning solution… and if you are not looking at a robust solution, will you truly effect performance change? Even if a solution is 100% designed, developed, and delivered internally (no vendors), the cost is likely to quickly approach at least $10,000 when we factor in the time for a training person to design and build the solution, the time a SME will spend providing content expertise, and the time all the employees will spend completing the training.
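As a rough illustration of how an “internal only” solution still approaches $10,000, here is a back-of-the-envelope sketch. Every rate and hour count below is an assumption chosen for illustration, not a figure from the post.

```python
# Assumed figures for illustration; substitute your own rates and headcounts.
designer_hours, designer_rate = 60, 50           # instructional designer: design and build time
sme_hours, sme_rate = 12, 75                     # subject matter expert: content expertise
learners, seat_hours, learner_rate = 150, 1, 35  # employees completing the training

total = (designer_hours * designer_rate
         + sme_hours * sme_rate
         + learners * seat_hours * learner_rate)
print(f"Estimated internal cost: ${total:,}")    # roughly $9,150 before vendors or tooling
```

With these assumed numbers, employee seat time is the largest line item – a useful reminder that the learners’ own time is part of the cost even when no vendor is involved.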

What about innovation?

This is the other element of the “secret sauce” of winning awards. You have to go beyond defining a problem and quantifying results. You need to think about how you did it DIFFERENTLY than others have done. How is the solution an advancement in the field? What new approaches does it use that might be a model for others?


Of course, the “innovation” must be relevant to the topic you are submitting the award for. If you have submitted your project for a “Best Use of Blended Learning” award, but the results of the project are not at all related to the blended learning approach, then your chances of winning are lower, even if the results are good.

In the award we submitted with Cisco, the “before” problem they identified was a challenge with getting new sales associates to retain large amounts of product and technical information. Through learner surveys and learning objective completion rates, they were able to determine that the spaced repetition built into Knowledge Guru games had a meaningful impact on solving their problem. In this case, the way gaming was connected to learning science was considered “innovative,” and the innovation mattered because it drove results for Cisco.

So, You’re Saying There’s a Chance?

In the end, there’s no guarantee that a particular learning solution will win an award, no matter what organization you submit it to. A small percentage of projects will win any given award, and even fewer will win a “Gold” distinction. Whether you plan to submit your work for awards or not, adopt an award-winning mentality by showing measurable results and using innovative designs and approaches to drive those results.


Building Skill and Knowledge, iEV, and Storytelling Tips: This Week on #BLPLearn

#BLPLearn is our way of saving all of the great content our team curates… and sharing it with the wider community. We’ll take the best articles shared by our Learning Services, Multimedia, and Product Development teams in their weekly meetings and include them in the weekly #BLPLearn blog. We’ll usually include some commentary from the original team member who found the article, too.

Our goal is to make the weekly #BLPLearn blog a dependable source for quality, curated L&D content. Check back every Thursday.


Rather than restricting the social media conversation to a 30-minute window, we’re inviting everyone inside and outside BLP to share interesting links, thoughts, and articles with the #BLPLearn hashtag on Twitter. We’ll check the feed once a week and include the best articles submitted via Twitter in the post, too.


Now that introductions are out of the way, let’s dive in to this week’s articles:

Games That Build Skill and Knowledge
Submitted by Sharon Boller, President and Chief Product Officer 

One of the reasons we created the Quest game type was to allow for both knowledge acquisition AND skill building. We enabled people to create “performance challenges” that could be skill-based as opposed to knowledge-based while still retaining the knowledge component. Because Guru is a game engine that includes an authoring tool, we needed to make it very, very easy for people to develop a game quickly and with minimal to no game design skill.

We know that one of the primary needs our client base has is to help people build process knowledge and skill. Imagine you have the freedom and $$ to develop a 100% custom game. It’s important for people to perform the process quickly and without errors. Asking questions about it won’t be enough. They have to practice DOING it – over and over again. A game is an ideal tool for this frequent practice. An eLearning company in the UK developed an intriguing game for the UK division of McDonald’s Corporation. The game teaches people how to operate the “till” (you have to love British English) and also incorporates a knowledge component in the form of Q&As. The business results were impressive, as was the implementation technique – merely embedding the game within an employee website/portal and letting employees discover it for themselves.

They designed achievements that linked effectively to what they wanted employees producing on the job (perfection, happy camper, etc.) and, in general, modeled lots of the traits of effective learning game design.

You can read about it here:

Example of Game that builds knowledge AND skill

Submitted by Brandon Penticuff, Technology Director 

Pop quiz: What’s the #1 concern that someone usually has when considering an electric vehicle?

It turns out that it’s something called “range anxiety”: essentially, the concern about the range of electric vehicles and whether your needs would be sufficiently covered by an electric-powered car.

The link/app that I want to share with everyone today is designed to address this anxiety as well as provide additional information on what your driving patterns would look like if you had an electric vehicle. So this is really a very targeted learning app.


  • What do you think – would this information address anxieties you might share?
  • This is a pretty unique way of problem solving: it essentially uses a data-collection app to compare results with a different model in the same environment. Any thoughts on how that could be applied to other learning?
  • One neat feature of the app is that if you agree to share anonymous data about your drives, it unlocks additional features. I thought this was a novel way to approach “pro features” instead of making the user pay for them.


4 Storytelling Tips From the Co-Creator of Blockbuster Mystery Podcast “Serial”
Submitted by Jennifer Bertram, Director of Instructional Design

I am obsessed with this podcast. I listen to it religiously every week – it’s my favorite “show”. There are some great storytelling tips in this article that I think we could translate to the stories we tell in our courses. How can we weave stories throughout the learning experience? How can we do a better job of parceling out details over time?

  • The idea of spreading it out over multiple weeks due to the amount of content. This could really apply to a “bite-sized” learning solution. It helps with the goal of leaving the learner wanting more.
  • The narrator has a perspective and you can see her feelings/questions as you go. This helps you engage in the story. By taking this approach, you can really see the different perspectives of all of the people in the story.
  • Holding back some of the details of the stories is a great way to engage the audience and keep them in the story/learning.
  • Think about what questions you want the story to answer; worry about that more than the ending.

4 Storytelling Tips

Our “Recipe” for Learning and Remembering in Corporate Learning


Some employees get too little training. They sit through a few classroom sessions, see some slides, and get very little help at actually doing their job.

Others get too much training. The list of required eLearning courses is too long, and actually takes them away from their responsibilities. The learning and remembering ends up happening outside of, or in spite of, the training requirement.

Most organizations invest heavily in training their employees, yet employees still do not retain the critical knowledge they need to be successful. This is why we focus our research on why employees forget. How do our brains respond when we learn new information? Is there a pattern to forgetting?

Sharon Boller’s work has explored the divide between remembering and forgetting extensively in 2014. Her white paper, When Remembering Really Matters, identifies eight strategies, four for learning and four for remembering, that help fight forgetting.

Sharon’s presentation at DevLearn 2014 presented these strategies from a new perspective. Have a look:

learning and remembering as a “recipe”

In her presentation, Sharon illustrates how the elements required for learning and remembering fit together into a repeatable process. When used correctly, this process, or “recipe,” can yield our desired outcomes.

Here’s a look at the entire recipe from start to finish:


What elements are required to learn?

Before we can remember anything, we have to first learn it! Research (and experience) tells us that motivation, relevant practice, and specific, timely feedback are all required for learning… but that’s not the whole story. These are all essential parts of the learning process, but we have to take remembering into consideration to complete our recipe.

What elements are required to remember?

In her white paper and presentation, Sharon presents four strategies to use when you really want learners to remember:

  1. Spaced intervals – not a single “glop”
  2. Repetition – several instances of it
  3. Feedback – with a requirement to do it right after making a mistake
  4. Stories

One particular reason spacing works is that it eliminates the “glop.” With learning, too much = nothing. If you overload the learner with information then none of it will stick. Space the learning out and use repetitions to cement the content.

Finally, story helps create context and an emotional response in the learner, both of which are proven to increase retention. This is one of the reasons that games can be such a powerful learning tool.

So what’s the real recipe for learning and remembering?



Motivation: Employees/players/learners need to be motivated to learn. The most obvious way to do this is to incentivize them somehow, and that can work, but that provides only extrinsic motivation. The best learning happens when the learner is intrinsically motivated. Think about what your learners might need to want to participate in the training. Could you make it more fun? Do they want to compete?


Relevant Practice: It is crucial that your learners practice. The saying “practice makes perfect” might be cliche, but it’s true. Think about ways you can encourage practice over time… and make sure it’s relevant to the goals you set.


Specific, Timely Feedback: Feedback is one of the most essential ingredients because it allows your learners to correct mistakes and stops them from building bad habits or repeating incorrect information. Behavioral psychology shows time and time again, however, that the feedback must be specific and it must be quick, so that the learner can make the connection between the feedback and their mistake.


Spacing and Repetition: Now we’re getting into the ingredients that are crucial to long-term retention. Without repetition at strategically spaced intervals, learners will forget 30-90% of what they learned within 2-6 days. Spaced repetition is the secret to fighting this forgetting curve.
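As a rough illustration of the idea, here is a minimal sketch of an expanding review schedule paired with a simple exponential forgetting model. The specific intervals, the decay formula, and the “stability roughly doubles per review” rule are common textbook simplifications chosen for illustration, not figures from the white paper.

```python
import math

def retention(days_since_review, stability):
    """Simple exponential forgetting model: recall decays over time,
    but higher stability (earned through review) slows the decay."""
    return math.exp(-days_since_review / stability)

# Assumed expanding intervals: review after 2, 4, 8, and 16 days instead of one "glop".
stability = 2.0
for days in [2, 4, 8, 16]:
    print(f"Day {days:2d}: ~{retention(days, stability):.0%} retained just before the review")
    stability *= 2   # assume each successful review roughly doubles stability
```

The point of the sketch is the shape of the schedule, not the exact numbers: each review arrives just as memory is fading, and each successful review lets the next interval stretch further.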


Story: As we stated above, story both gives the learner context and creates an emotional element that will help them retrieve the information later. It’s easier to remember an alien explaining the safety guidelines that can help you keep your lab safe from invaders than it is to remember those same guidelines from a boring PDF.

Test the recipe in your own “Kitchen”

What do you think of these strategies? Have you applied any of them in your own training? What obstacles make it hard to do so?

The Corporate Learning Starter Pack (Free Download)


The status quo is no longer working in Learning & Development.

Rapid authoring, agile learning design, game-based learning platforms and mobile have made their way into our field, along with other innovations. Stakeholders who understand the impact training can have on the business are asking L&D to measure this impact based on real business outcomes.

And then we have our learners, who increasingly demand engaging, interactive learning solutions that truly help them improve their performance.

It’s time for a new plan.

For nearly 20 years, we have partnered with our clients to design and deliver the right learning solutions for their training and performance needs. While every project is different, we have evolved a set of internal tools that help us analyze the current situation, design a single course or blended curriculum to meet the need and select the right technology.

We’ve simplified these tools to create the Corporate Learning Starter Pack. Use these tools to jumpstart your planning process and create learning that meets and exceeds your goals!

There is so much evaluation and planning that goes into training, from a simple course to a full curriculum. Your budget and needs will ultimately shape your decisions, and the tools we’ve created will help you organize all of that information in one place. With tips, questionnaires, checklists, and more, you’ll feel better equipped to tackle your training needs.

What’s inside

In the Starter Pack, you’ll find some of our most popular eLearning resources:

  • Needs Analysis worksheet: Ten simple questions that get you to the root of the training need.
  • Self-assessment for your training program: A chance to be honest about the current state of training in your organization.
  • Template for planning your training program: A structured approach to sketching out the various learning solutions you must include.
  • Technology evaluation checklist: A shortcut for evaluating what technologies should be included in your approach.

What It Means to Be Collaborative in Agile Learning Design

This is an excerpt from Jennifer Bertram’s new white paper, Agile Learning Design for Beginners: Designing Learning at the Speed of Change. Here is a section on collaboration:


Agile instructional design focuses just as much on communication as it does on process, if not more. Without strong communication within a development team and with a client or stakeholder, Agile just doesn’t work. Agile comes to us from the software development world, and one of its principles is “individuals and interactions over processes and tools.”


Our agile team has worked hard to become more collaborative and ensure that everyone has a voice. In the past, the individuals on the team would throw things over the fence to each other and often it would result in miscommunication and frustration.
We’ve created some guidelines for the team to help us ensure that we’re communicating well:

  • We will all plan on working in the office at least one day a week. We believe that having face-time aids in communication and makes it easier for us to collaborate as a team.
  • We’ll use group IM chats if there’s something that we feel like the whole group needs to know or weigh in on.
  • We’ll avoid using email. It stinks as a communication tool. It slows things down and doesn’t allow for conversations to happen. Face-to-face, IM, or phone are our preferred tools. (This is one of our team’s favorite “rules”!)
  • We want to have no surprises at internal deliverables. What that means is that the team will work together to agree on functionality, graphics, and flow. We believe that we come up with better ideas when more than one person is involved in making the decision.

It’s been a change in behavior and taken time, but now individuals catch themselves when they communicate in ways that aren’t efficient or don’t involve the right people.

What does collaboration look like with stakeholders or clients?

Collaborating with stakeholders means involving them in the idea creation and decision making processes. While most people like that idea in theory, it requires a different way of working. To help your stakeholders, be sure to:

  • Set clear expectations as to what you want them to review. Try giving them a checklist of questions they should answer.
  • Be sure to tell them what’s not included in the review. If certain features or activities haven’t been created, be clear that it’s intentional, not an oversight.
  • Let them know how much time it will take to look at the feature. In general, our clients have found that they are looking at smaller pieces, more often.
  • Look for ways to have them look at things with you. Tack on 10 minutes at the end of a meeting and show them what you’re working on. Or, if you have regularly scheduled status meetings, come prepared with some show and tell to get their reactions and feedback.


The Science of Remembering: Strategies for Long-Term Retention (Free Webinar)


Employees are required to complete a lot of training during the year, so much so that it is simply impossible for them to remember everything that is asked of them. Sharon Boller spoke to these challenges in her recent white paper: “When Remembering Really Matters: Learning Strategies for Long-Term Retention.” The white paper includes strategies for both learning and remembering, emphasizing the need to improve the effectiveness of the initial training event as well as the post-training reinforcement strategy.


VP of Client Relations Leanne Batchelder took these ideas a step further last week at CLN Week West. Her session included expanded case studies of clients where we have applied the science of learning and remembering to achieve tangible business results.

If increasing what employees remember is a priority for you in your role, I encourage you to join Leanne and me for our upcoming webinar, “The Science of Remembering: Proven Techniques for Helping Learners Obtain New Knowledge and Skills.” The webinar is based on Leanne’s CLN Week West presentation, but also includes some expanded “show and tell” from our custom eLearning projects.

The webinar will be held on Tuesday, June 3rd at 10 am PDT, 1 pm EDT. Click here to register.

Leanne will spotlight two corporate learning case studies that show how incorporating research-based design techniques into your learning solutions will improve knowledge and skill retention, and ultimately drive business outcomes. We’ll take a look at what we did, how we did it, and the results we achieved with a single online learning game for ExactTarget sales reps and a larger, blended curriculum for Roche Diagnostics customers.

What Will You Learn?

  • 4 proven strategies that inhibit forgetting and enhance remembering.
  • Explanation, examples, and research behind specific learning design techniques: spaced learning, distributed practice, repetition, feedback, and more.
  • The real cost of not remembering. ASTD estimates that in 2012, organizations invested $164.2 billion in employee training. How much of your training investment goes to waste?
  • Ways to incorporate these strategies into a single learning solution or a large curriculum.
  • Case studies from our work with Roche Diagnostics and ExactTarget that demonstrate how these learning strategies impact business results.
  • How to put it all together. Perhaps most importantly of all, the expanded webinar includes a summary of five business challenges we solved for our clients using a combination of these strategies.

Click here to Register!