Benchmark Your Training With Our Learning Solution Scorecard


All training is designed to help target learners improve their performance, but that’s not where the story ends. Stakeholders and training vendors are also judged by the success or failure of a learning solution.

We all have a lot to gain when training meets its intended goal. And we have just as much to lose when it doesn’t. When learning solutions are successful, job performance improves, satisfaction increases, the business meets its goals, and L&D professionals receive more budget (!) to make an even greater impact.

And when training fails to improve performance, well… we won’t get into that.


The definition of a successful learning solution can be subjective. There isn’t a consistent rubric to measure learning solutions against each other to see which ones are the best. Award programs like Brandon Hall and CLO help with this to some extent, but unless you plan to submit every learning solution you create for an award, benchmarking is tough.

That’s why we created our Learning Solution Scorecard. We introduce it to clients at the beginning of every project to coach them on what factors will make their training succeed. We use it at the end of projects to measure their success. And we use the scorecard during a project to make sure we’re headed in the right direction.

By scoring learning solutions across four categories, we can assess whether they are Performance Accelerating, Performance Promoting, or Performance Demoting.

Is the learning solution meaningful?

Meaningful learning solutions are linked to a clearly defined business problem or need. They are designed with the needs and motivations of target learners in mind and have a clear learning solution goal that defines what learners need to do differently based on the training.

Is it memorable?

When we say memorable, we mean that learners should be able to remember what they learned and apply it on the job. The scorecard checks for a variety of instructional design approaches based on learning science to make sure the learning solution maximizes retention and recall.

Is it engaging?

By understanding the target learner, we can create solutions they will find engaging. The scorecard checks to verify that the learning materials have high production value and use innovative approaches where appropriate to enhance the learning experience.

Is it supported post-training?

Learners need more than one touch-point to remember new information and skills. The scorecard verifies that mechanisms are in place to ensure performance on the job. It also makes sure supervisors encourage and monitor the learner’s efforts in applying the learning.
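To make the tiering concrete, here is a minimal sketch of how a scorecard like this could be tallied in code. The four category names and three performance tiers come from this post; the 1-5 scale, equal weighting, and cutoff values are purely illustrative assumptions, not our actual rubric.

```python
# Hypothetical sketch of the scorecard's tiering logic.
# The category names and tier labels come from the post;
# the rating scale, weights, and cutoffs are invented for illustration.

CATEGORIES = ["meaningful", "memorable", "engaging", "supported"]

def rate_solution(scores):
    """scores: dict mapping each category name to a 1-5 rating."""
    avg = sum(scores[c] for c in CATEGORIES) / len(CATEGORIES)
    if avg >= 4.0:
        return "Performance Accelerating"
    if avg >= 2.5:
        return "Performance Promoting"
    return "Performance Demoting"
```

Under these hypothetical cutoffs, a solution averaging 4.25 across the four categories would land in the Performance Accelerating tier, while one averaging below 2.5 would be flagged as Performance Demoting.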

Get the scorecard

How Gestalt Can Help You Create Better Training: This Month on #BLPLearn


Welcome back to our #BLPLearn blog series, where we offer a monthly look at design and technology as it pertains to learning and development. I’m your host, Jake Huhn… Senior Marketing Technologist at Bottom-Line Performance.

Let’s Talk About Gestalt Principles

Learning design and graphic design sometimes feel like two distant worlds. When you’re building a course—or working with a vendor—and you’re responsible for results, graphic design can seem like a trivial afterthought. You’re busy making sure every word is perfect, every step is explained thoroughly, and every definition is accurate. Where’s the time to worry about how “pretty” that screen looks?

But I want to encourage you to make graphic design a higher priority—and there’s science to back me up.


It all has to do with Gestalt Principles of Organization. “The Gestalt principles of organization involve observations about the ways in which we group together various stimuli to arrive at perceptions of patterns and shapes.” [Gestalt Principles of Organization] These principles are essentially graphic design 101, and every designer should at least be familiar with them. And researchers at Monash University in Melbourne, Australia have shown how Gestalt theory can help improve learning:

“The new screen designs were then evaluated by asking students and others to compare the designs. The viewers were also asked to rate directly the value of using the eleven Gestalt design principles in the redesign, both for improving the product’s appearance and improving its value for learning. The evaluation results were overwhelmingly positive. Both the new design and the value of applying the eleven Gestalt laws to improve learning were strongly supported by the students’ opinions.”

These researchers aren’t alone, either. Other research has shown how these principles facilitate Visual Working Memory, an essential part of learning and other cognitive processes.

Implications for Learning Design

As a graphic designer, I gravitate toward how beautiful, clean design can improve learners’ comprehension of a course… but there’s more that Gestalt theory can offer learning designers. Gestalt is more than graphic design; it’s an entire psychology of perception—and it can improve more than just looks.

Consider what Gestalt theory teaches us about Similarity. Learning is facilitated when similar ideas are grouped and linked together and then contrasted with opposing or complementary sets of ideas.

It can also shape the way you challenge your learners (think quizzing). “The Gestalt theory of learning purports the importance of presenting information or images that contain gaps and elements that don’t exactly fit into the picture. This type of learning requires the learner to use critical thinking and problem solving skills. Rather than putting out answers by rote memory, the learner must examine and deliberate in order to find the answers they are seeking.” [Gestalt Theory (von Ehrenfels)]

And bringing it back to where we started, the graphic design of your learning solution (the proximity of text to images, the negative space, the clean lines) is yet another piece of the puzzle when it comes to facilitating proper learning. If you organize your information and images according to these principles, your learning solution will look beautiful and be more effective.

So Take the Time to Learn About Gestalt Theory

I hope I’ve made the case that taking graphic design 101 can actually benefit your learning design. There is a lot of information on the web—from universities and graphic design authorities—that can help give you an overview of Gestalt principles in design. A great starting point is this Designer’s Guide to Gestalt Theory on Creative Bloq. From there you can dive into the actual psychology and even explore eLearning Industry’s website for more industry-specific coverage.


Chang, Dempsey, Laurence Dooley, and Juhani E. Tuovinen. “Gestalt Theory in Visual Screen Design: A New Look at an Old Subject.” Proceedings of the Seventh World Conference on Computers in Education: Australian Topics 8 (2002). Accessed March 27, 2016.

“Gestalt Theory (von Ehrenfels).” 2014. Accessed March 27, 2016.

Peterson, Dwight J., and Marian E. Berryhill. “The Gestalt Principle of Similarity Benefits Visual Working Memory.” Psychon Bull Rev. 20, no. 6 (December 20, 2013): 1282-289. Accessed March 27, 2016.

“Gestalt Principles of Organization.” Psychology Encyclopedia. 2013. Accessed March 27, 2016.

Why “70:20:10” Is Not Enough


In a recent article, Bill Brandon speculates that we’ve hit a tipping point in the workplace: learning demands placed on workers have exceeded workers’ capacity to meet them. In other words, we are inundating today’s workers with training and asking them to complete all of it while still maintaining high levels of productivity. This really struck a chord with me… because I see it happening again and again in our clients’ organizations.

I liked Brandon’s article and tweeted it out, saying it was a nice piece… and it was. However, I found myself going back to it and feeling like Brandon neglected a very important point. He advocated for us to think about three elements that all contribute to our accomplishments and performance in the workplace:

  • Our skills and knowledge
  • Our shared experience (things gleaned from others, informal learning we do via social networks, interactions with peers, etc.)
  • Our individual experience (things we learn by doing)

Brandon felt that if L&D professionals thought less in terms of “courses” and used the 70-20-10 “rule of thumb” to consider how to help someone build competence, then we’d be better off. (Caution flag here: 70-20-10 is NOT a proven model; it is described by the person who originally coined it as “folklore.” See page 5 of this journal article written by the originator of 70-20-10, Morgan McCall.) Brandon advocated that we embrace social learning and learning pathways as the means of reducing the stress and burnout so many are experiencing.

The Elephant in the Room

While I do not disagree that avenues other than courses can be hugely valuable in helping build people’s proficiencies, I realized that the article failed to mention the elephant in the stress/burnout room. The elephant is time, or rather lack thereof. Learning takes time, whether we do it informally or formally. In today’s workplaces, we’re pushing people to do more and more. We are failing to acknowledge what this “more and more” often means: we are asking people to go way beyond 40 hours in their work week to do the learning required to build and maintain proficiency and to do the work that contributes to company profits.

Harold Jarche had it right when he said that in today’s economy, work is learning and learning is the work. THAT is the model employers and employees have to get into their heads… that learning on the job is simply part of doing the job.

To manage stress and minimize burnout, we have to incorporate the “learning curve” into the work people do. We have to factor this learning curve into the time things will take to complete and the amount someone will accomplish in a day or a week. And because people are constantly figuring out how to do things while working on their projects, we have to build this constant learning curve into our expectations of what people will accomplish and how fast they will accomplish it.

In My Experience

I run a business and our formula for billable time is not 40 hours a week. Depending on the team member’s role, we estimate that 80% of their time can be devoted to billable tasks. The remainder is allocated to learning and administrative tasks. Giving people time to learn on the job is essential in an industry where we need to stay on the leading edge of what’s possible re: learning solutions. We have communities of practice that people are part of, we have weekly link-sharing and discussions, we have periodic all-company “demo-fests” where we share out projects with each other. On top of all that, we have periodic formal courses that people will attend to build skills in niche areas. All these things take time…in addition to the constant learning someone does in the course of executing projects.

So Bill Brandon, I most definitely agree that we can and should think beyond formal courses in helping people build proficiency. But we cannot do so – even via informal means – if we fail to acknowledge that we have to build the time in for people to learn. Even looking something up requires time.

Should Instructional Designers “Teach to the Test”?


There is a lot of angst these days in the education field about “teaching to the test.” It started in K-12, but it has crept into corporate speak as well. Some say that tests are no longer relevant; they are viewed as holdovers of an out-of-touch education system. A growing number of people say they want to help learners problem-solve and think critically… not just memorize facts.

In the corporate world, people really do need to recall facts to do their jobs well. There are plenty of times where being able to “Google it” is not enough: they need to know it if they want to perform their job efficiently and/or safely. In compliance and safety situations, we need some objective verification that they do know it before they are allowed to perform the job. This is needed both to satisfy OSHA regulations and as a way to protect the employee and business.

Case in Point

I sat on a materials review call for a course we are developing within the healthcare industry. This particular scenario asks quite a bit from the learner:

  1. They need to be able to recall the steps to performing a variety of tasks.
  2. They need to select the appropriate tools to do specific jobs.
  3. They need to be able to correctly put on personal protective equipment (PPE) when entering spaces where high-risk infections are present.
  4. They need to know what protective equipment is required in specific situations, which means they need to recognize different signage located outside patient rooms.

The entire course concludes with a certification test. The test directly links to what this role needs to know… and know how to do. I was concerned to hear a materials reviewer push to add course content that was not going to be part of the test. This reviewer said, “We need to go beyond teaching to the test.” The implication was that we would fail the learner if we only include content that will be on the test; in essence, that we should give them a smorgasbord of information and heighten their competency by doing so.


What’s Wrong With The Test?

We act as if it is shameful if we “only” teach to a test, but why? I suspect many of us believe that we are dumbing things down if we do just focus on a test. Perhaps we are afraid that teaching to a test limits our ability to deliver a rich, meaningful experience that elevates the general abilities of the learner. Too often, we want to turn people into the experts that we are rather than arming them with basic proficiency to do their jobs well.

What’s the risk? If we mix nice-to-know and need-to-know content, learners will likely experience cognitive overload. Worse, we risk them remembering some of the irrelevant information at the expense of the most relevant information.*

What does “Good” Look Like?

A good test should be an accurate assessment of the body of knowledge learners need to perform their jobs. If appropriate, it should also assess the skills people have or their decision-making ability when judgment is a component of executing the job. It should only assess the knowledge and skill required to do the job. Courses that are designed to teach steps, processes, and the “why’s” behind those steps and processes need to keep their focus laser-sharp. People can only remember so much.

If it is essential that workers recall a specific body of knowledge and apply that knowledge to the execution of a set of procedures and processes then, please, don’t include anything that is not essential to them.

The problem is not tests. The problem is bad tests. Bad tests contain irrelevant material. Bad tests are poorly worded. Bad tests are too easy or too hard. Bad tests are not comprehensive; they fail to cover all the knowledge and skills critical to a job or situation.

Please do teach to the test… But only if you want to verify that people gained the skill and knowledge you have defined as essential to successful performance of the job.


* Ruth Clark and Richard Mayer, The Science of Instruction (New Jersey: Pfeiffer, 2011).

Maybe you have a training need that requires some additional analysis? You can use our free Training Needs Analysis Worksheet to get to the heart of the matter.

Once Is Not Enough: How to Playtest Custom Learning Games


I believe playtesting is a crucial, and often overlooked, part of learning game design. It takes multiple iterations to determine the right combination of game mechanics and game elements for your target learners. Whether you are an experienced game designer or an instructional designer trying to design a game for the first time, your game will benefit from multiple rounds of playtesting.

Of course, if we are going to tell everyone else to playtest their games, we have to do so as well! The following is the step-by-step story of how we created and play-tested a recent learning game for a client.

Example: Serious Game for Healthcare Workers

We recently created a game called “Five Star Facility,” targeted at environmental technicians who work in healthcare settings. These technicians clean patients’ rooms and other areas of a hospital or long-term care facility. It took four iterations to arrive at the game we wanted and that players could learn the most from.

We reached a great game in the end, but it took good playtesting and iteration to get us there. Our stellar design team of Amanda Gentry, Matt Kroeger, Kristen Hewett, and Erika Bartlett did a terrific job!

Version One – Let’s create something “sort of like Clue”

This version resembled the commercial board game Clue. The design team felt the target learners would be pretty familiar with Clue’s rules and core dynamics (exploration, collection), and they wanted a game that learners could quickly learn to play. Gameplay was competitive. The game goal was to be the first person to collect all the room tokens, which represented all the categories of information players needed to learn and remember. Players rolled a die to determine how many spaces they moved on the board. Each space corresponded to a different category of environmental protection/cleaning. Players had to answer questions related to whatever category they landed on. Similar to Clue, they had to go into each room on the board. Answering a question in the room earned them a token for that room. The first player to earn all the tokens won the game.

First version of five star facility game.

It wasn’t a terrible game design… but it was just okay.

Problems With the First Design:

  1. The game could get sort of long if people were rolling lots of low numbers.
  2. The designers made the game competitive. In the real world, environmental technicians should behave cooperatively with each other and with the healthcare team as a whole.
  3. This first rendition ignored the “why” of the environmental tech’s job and didn’t help them see the connection between what they do and how the healthcare facility earns the dollars to stay in business. Survey ratings determine the reimbursements healthcare facilities receive from Medicare. If your facility’s aggregate survey ratings are only three stars, you do not receive the same dollars as a facility that received five-star ratings.

Version Two – Scrap the Clue idea. Let’s race to the finish.

The icons are gone, the die is gone, and we have a path to travel and monetary targets to reach. This version stunk. It was boring and tedious to play. Players simply took turns drawing cards to try to reach the target dollar amount. They worked together to answer the questions, but when the designers switched from competition to cooperation, they failed to include game mechanics that created any conflict or tension among the players. There was no “lose” state and nothing really bad that could happen. This version was quickly ditched.

Version Two: Lots and lots of dots.

Version Three – Bring back the game icons. Add in a progress tracker.

Version three was much better! The team latched on to the realization that five-star ratings lead to better reimbursements. Now players had to secure at least $70K in reimbursements to win… and mistakes would push their survey ratings downward. This was better, but there was still a serious flaw: players’ dollars didn’t go down when they made mistakes; only their survey ratings did. In the real world, these are tied together. We also discovered as we played that we needed to write our questions better to eliminate ambiguity in the responses. On the plus side, the discussion team members had before deciding on a correct response was phenomenal. Lots of learning happened in these discussions.

Version 3: We’re getting closer to the final product.

Version Four – We have a winner!

The final version of the game was the winner. Look at how we tied together survey ratings and reimbursement dollars. Players start with a 1.5-star rating and $30K in reimbursement dollars. To move to the right and earn more dollars, they have to enter a room and respond correctly to that category’s question. They still roll a die to move a team token around the board. If they land on a space outside of a room, they have to answer a question that corresponds to the icon they land on. A correct response allows play to progress to the next player with no adverse event. An incorrect response forces players to move to a lower survey rating. If they hit the zero-stars spot on the game board, the game is over and the team loses. If they earn $10K from every “room” on the game board and achieve at least $70K in reimbursement, they win.
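For readers who think in code, the win/lose logic described above can be sketched as a tiny state model. The numbers ($30K to start, $10K per room, a $70K win threshold, losing at zero stars) come from the game as described; dice rolls, board movement, and the questions themselves are deliberately omitted, and the half-star penalty per mistake is an assumed value, not the game’s actual rule.

```python
# Simplified model of the Version Four scoring loop.
# Dollar and star thresholds come from the post; movement, dice,
# and question content are intentionally left out. The 0.5-star
# penalty per wrong answer is an illustrative assumption.

class FiveStarFacility:
    def __init__(self):
        self.stars = 1.5        # starting survey rating
        self.dollars = 30       # starting reimbursement, in $K
        self.rooms_cleared = set()

    def answer_room(self, room, correct):
        if correct:
            self.rooms_cleared.add(room)
            self.dollars += 10  # $10K earned per room answered correctly
        else:
            self.stars -= 0.5   # mistakes push the survey rating down

    def status(self, total_rooms):
        if self.stars <= 0:
            return "lose"       # team hit the zero-stars spot
        if len(self.rooms_cleared) == total_rooms and self.dollars >= 70:
            return "win"        # every room cleared and $70K+ secured
        return "in progress"
```

With four rooms, a team that answers every room correctly reaches $70K and wins, while three consecutive mistakes under the assumed penalty would drop the 1.5-star rating to zero and end the game.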

Version four: By George, we’ve got it!

Lessons for learning game designers:

  1. Make sure your choice of a competitive game or a cooperative game mirrors the real-world environment. Do not have people competing in a learning game if their real-world context requires cooperation or collaboration to be successful.
  2. Be aware that competitive games tend not to be as effective learning experiences as cooperative ones. In competitive games, only one person or team wins. The “losers” can disengage from the experience entirely if it is not managed well.
  3. Make sure the game mechanics (rules) and game goal complement – or at least do not detract from – your real-world situation.
  4. Make sure your game includes enough “tension” in it to keep things interesting. Interesting translates into “fun.” If there are not realistically significant odds of losing the game, it becomes boring to play.
  5. Don’t be content with the first version of your game; it will not be the best version.
  6. Don’t playtest once, identify changes to make, and then fail to playtest again to verify those changes improve the game play and learning experience. You have to test every time you make a change.

Can Micro-Learning Help Stressed, Unmotivated Learners?


I’ve published two posts on micro-learning in recent months. One was on this site; one was done for ATD. Both generated discussion with some folks debating my assertion that we need to be very cautious about leaping to it. I’m going to stand by my assertion. I think “micro-lessons” can be great for some things; I do not think they are the answer to most things. And for learners who are over-extended and not motivated to learn in the first place, they are not the answer at all.

Will Thalheimer, someone I respect tremendously in the arena of learning science research and applying research to practices, wrote an extensive comment to my ATD post. He also linked to a post by Alex Khurgin, CEO of Grovo, a SaaS company that produces lots of micro-learning. Khurgin positions micro-learning as good for 21st century businesses. Khurgin’s blog is high-level and, in general, promotes micro-learning as the solution to the crazy pace that exemplifies many of today’s organizations.

Here’s the thing. I feel like I am an example of the “C-suite” person so many say are the reason we need to shift to micro-learning. I do not own a Fortune 500 company, but I am a business owner who has concerns about maximizing what my team can do. My company has been named as one of the top 25 fastest growing companies in Indiana… and making sure our team members continually learn and grow is a key reason why we’re on that list. Their skill and knowledge fuels company growth.

Why Companies *Think* Learners Need Micro-Learning


Within my company (and probably many others), these truths all affect my team’s ability to learn:

  1. We’re stressed. Life is stressful, not just work. We all have a bazillion things to do each day and many people who need things from us.
  2. We face multiple interruptions each day. If we don’t discipline ourselves to ignore email, disconnect from instant messaging, or mute our phones, we can be distracted every few minutes all day long every day.
  3. Time is limited. We never feel like we have enough time to get things done.
  4. We want to enjoy life. Most folks don’t want to work 60-hour weeks; we need our work – and our learning – to happen within the sanity of a 40- to 45-hour work week. Sadly, we don’t all hit the goal of 45-hour maximums, which makes carving out time for learning a constant challenge if it is not prioritized.
  5. Maintaining focus is HARD. New technologies and ideas are like squirrels, tempting us to run off in new directions all the time. We see these squirrels when we consume content on social media – checking out links sent via tweets, perusing Zite, monitoring our accounts. We can get highly distracted just trying to “keep up.”

Micro-learning is identified as the answer to items 2, 3, and 5 from that list, but I do not believe it is truly “the” answer to any of them. It sounds great on the surface, but the root of the problem goes deeper.

Motivate, Focus, Repeat.

So what is the answer? I think these things are…

  1. Make sure motivation exists. Putting people who have zero desire to be learning into a learning situation is a recipe for flushing money down the drain. Motivation trumps almost everything else. Really great instructional designers can help with motivation, but only to a point. People need to perceive that learning the new skill matters to them in a significant way.
  2. Make sure focused time is available. Get clear on company priorities… and realize you can only execute on one priority at a time. Without time to focus, people cannot learn. Don’t think you can squeeze a learning experience into five minutes per day.
  3. Repeat to remember. Assume that people will need to have multiple repetitions to truly learn something. Repetition can be effective in short, continuous bursts… but I’ll save that discussion for another blog post.
  4. Make sure there are immediate opportunities to use what’s being learned. Without the immediate opportunity to apply, learning gets lost. This would be a second way you can flush money down a drain.
  5. Make sure someone else – besides the learner – cares about what someone is learning. Someone else, either a manager or co-worker, needs to inquire about what’s being learned. If people never get to talk about or reflect on what they are learning, the learning will be extremely limited and difficult to apply.


Right now, I am taking a 5-session MOOC (massive open online course) called Smart Growth for Private Businesses. It includes about 5.5 hours of lectures, several quizzes, and four case studies that are each 12-15 pages in length. I started the class about three weeks ago; I’m now three-fourths of the way through it and should have it completed in the next week or so. The lectures are organized into relatively small chunks, which are interspersed with 2-minute quizzes. The first session’s lectures are organized into “bites” ranging from 2 minutes to 26 minutes. To date, I’ve invested several hours and I think every hour has been hugely valuable.

My completion of this course hits the five ingredients I feel are necessary for learning:

  1. I’m motivated. My company is growing extremely fast; we want to control our growth to maximize the health of our company and its team members. We’re also getting ready to start another strategic planning cycle and I will use the info from the course as we execute this process.
  2. I’m finding time to focus because focus matters. I’ve clarified my priorities… and taking this course is one of those priorities.
  3. I am reviewing the content on my own and with others. I have taken notes, I’ve gone back and reviewed sections, etc. I’m repeating to remember… even using the whiteboards on my walls to write down key concepts I want to retain.
  4. I already mentioned strategic planning we’re preparing to do. I have an immediate need for the content.
  5. Two others in the company are also taking the course. Having someone to talk to is huge in retaining the information and learning from it.

So… that’s not micro-learning as I’ve seen it defined. But I can tell you I am getting excellent results, and I absolutely do not believe I would be getting excellent results if this content was of low or even medium value to me and delivered in small, five-minute chunks each day.

What do you think?

I’d love to hear your thoughts: can micro-learning truly help an employee who is stressed and lacks motivation learn?

Want to revamp your training? Plan out more engaging solutions and curriculums with our Simple Template for Planning Your Training Program.

5 Ways to Stop Training Content Overload


Product launch cycles are speeding up. Compliance and regulatory requirements are not getting any easier to fulfill. Your organization just implemented a new software tool and employees need to know how to use it. The XYZ process just changed… again. As an L&D leader, you are probably tasked with helping employees respond and adapt in an environment that is constantly shifting. How much training is “enough”? Are you overloading your learners with training?

In our Learning & Remembering Survey, first conducted in December 2014, 24% of respondents said that “content overload” was a huge problem in their organizations. Another 38% said that knowledge transfer and retention of content are their biggest challenges. Too many courses are being delivered… and learners cannot keep up with the amount of information being thrown at them. When learners are asked to take too much training, or remember too much content, many negative consequences can occur:

  • The organization is wasting resources on training that is not effective.
  • Learners are not remembering what was trained because too much content is crammed into each session. This leads to poor performance.
  • Stakeholders expect results from their investment in training, and they are disappointed when reality does not meet expectations.
  • Learners are fatigued by the high volume of compliance training they must complete, and unfortunately there is no way to reduce the number of required topics.

…The list could go on. In many organizations, large amounts of complex content and technical information may be unavoidable. Fortunately, there are many strategies you can use to reduce the burden on your learners. When used properly, these strategies will also improve the impact training has on your organization.

Strategies to Stop Training Content Overload

1. Less “tell.” More “do.” Consider ways to turn your content into a series of interactions. Look for ways to involve learners in the content instead of simply presenting it to them.


This Hazard Communication course removed the up-front screens with endless information and turned the content into a series of interactions. The course won a 2014 Horizon Interactive award.

2. Use a variety of media. When you have lots of information to deliver, doing so in a variety of different formats can lighten the load. When a client needs us to create a course that must include lots of information, we will frequently include animated videos and clickable interactions to break up the content.


Short videos can be used to present some of the necessary “tell” content.

3. Break the learning into small chunks. Research clearly shows that we learn best in small chunks, repeated over time. Leverage learning technologies that allow smaller bites of content to be delivered to learners over a longer period of time instead of creating a single eLearning course.


Knowledge Guru’s “Quest” game type allows levels to be played in small chunks. Learners can receive email reminders when a new level is available for play.

4. Make ILT interactive. Even in our technology-driven world, instructor-led training is still often the right learning solution. You just have to maximize it. In the customer training we created for Roche, we created a robust facilitator guide and participant guide that broke up the day and incorporated lots of coaching and practice opportunities.

5. Personalize the learning. Don’t underestimate the power of technology to tailor the content learners receive. The customer training curriculum referenced above included an online pre-work module where learners choose their job role and take a course that only includes information relevant to them. Once on-site, Roche trainers use an iPad app to deliver a customized lesson based on customer role and product configuration. These on-site sessions are organized so customers only participate in the portion of training that is relevant to their role.


This customer training program created for Roche Diagnostics includes a personalized eLearning course and an app that allow instructors to deliver tailored lessons.

One of the best ways to avoid training content overload is to do a thorough needs analysis. Use our Training Needs Analysis Worksheet to get started!

How to Create Award-Winning Training Solutions


The afterglow of our three 2014 Brandon Hall awards is starting to fade. Yes, excitement can only last for so long… and we are hard at work making plans to submit for 2015 awards. Many of the clients we talk to care about awards and would like the prestige that comes from having them attached to their training efforts.

But if you’ve never submitted for an award before, how do you know your learning solution is award-worthy? And what do you need to include in the submission to draft a winner?

The “Secret Sauce”

If you want to receive awards for the training you create, you have to be able to articulate two things: value and innovation. Too frequently, training functions cannot articulate the value their solutions delivered to the organization. They cannot describe a problem, or how their training solution helped solve it or at least reduced its severity. Articulating both value and innovation is the secret sauce.


The folks in L&D who do gather up all the fancy trophies and industry accolades have submissions that describe innovative solutions that solved a quantifiable problem. Think about it in terms of submitting a “before and after” story. Here are some examples:

  • Before we implemented training, we had numerous safety accidents on the XYZ machine. It was costing us $X and X days in lost productivity as a result of accidents.  After training, our accidents decreased by X%, our costs were $X, and our days of lost productivity decreased by X.
  • Before training, we were spending up to 24 months ramping up a new hire. Our team leads were self-reporting high levels of stress, and our employee surveys indicated low job satisfaction for those in team leadership roles. After we implemented our new employee onboarding training program, we cut the ramp-up to 12 months, halving the time required to achieve full productivity. While we do not yet have the results of the most recent employee survey, a poll of team leaders indicates that they perceive stress levels to be "significantly lower" than before we implemented the new onboarding program.
  • We have 90 people in the director of recruitment role in our organization. Before training, the annual employee turnover for the director of recruitment role was 30%. It cost our organization $18,000 for every new hire we had to make, which meant our annual spend on the recruitment, hiring, and training of this role was $486,000. After training, we were able to reduce it to 20% AND decrease the time to full productivity by 3 months. Our annual costs for recruiting and hiring decreased to $324,000, which is a 33% cost savings.
  • Before training we were spending up to 3 months at a customer site following installation of our product. After we launched the revamped customer education program, satisfaction ratings improved from an average of 3.75 out of 5 to an average of 4.5 out of 5.  In addition, we reduced time on site by 30 days. This resulted in a cost savings of $8,000 per customer.
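The arithmetic behind stories like the turnover example above is simple enough to sketch in a few lines of code. All of the figures below come from that hypothetical bullet, not from real client data:

```python
# Toy calculation using the hypothetical turnover figures from the example above.
ROLE_HEADCOUNT = 90      # directors of recruitment in the organization
COST_PER_HIRE = 18_000   # dollars to recruit, hire, and train one replacement

def annual_turnover_cost(turnover_rate: float) -> float:
    """Annual spend on replacing employees who leave the role."""
    hires_needed = ROLE_HEADCOUNT * turnover_rate
    return hires_needed * COST_PER_HIRE

before = annual_turnover_cost(0.30)            # $486,000 per year
after = annual_turnover_cost(0.20)             # $324,000 per year
savings_pct = (before - after) / before * 100  # roughly 33% cost savings
```

Framing the "before and after" this way makes it easy to re-run the numbers on the spot when a stakeholder challenges one of the assumptions.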

The common denominator in these stories is some form of data that identifies a problem, quantifies its impact to the business, and then quantifies the results obtained from implementing a training solution. Too often, training is not quantified.

So what can you do when you don’t have data? The simple answer is to get some, and here are some tangible techniques you can use to help you do that.

Probe more than once. Don’t accept the first answer as the final answer.

If you ask a subject matter expert or stakeholder, "What is the problem, and how can you quantify it?" avoid accepting an initial response like this:

"We don't have any actual numbers, but I've been hearing from the field that this is an issue. I've talked to our lab chemists, and they tell me that they are answering the same questions over and over. They are sharing basic information that field reps should really know themselves. If the reps were able to answer these questions, I know it would be beneficial to us."

Your stakeholder or SME may well be right, but you should probe. If this is really a big issue that costs the company money, chances are that data is not as difficult to assemble as the SME thinks. Here are things you can ask to help quantify the problem and its impact on your company:

  • How many chemists are affected by a sales rep’s need to call for technical support?
  • Ask the chemists: In a given week, how many calls or emails from sales reps do you respond to, and what are the most common issues? How much time do you spend on this per week – 30 minutes, an hour, 2 hours?
  • Ask a handful of key distributors: How frequently do you ask a sales rep a product question that he or she cannot answer? If he or she cannot answer you immediately, can you quantify any dollar impact to your business? What about your perception of ACME as a supplier?
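Answers to questions like these convert quickly into an annual dollar figure. As a rough illustration (every number below is hypothetical, invented for the sketch):

```python
# Hypothetical roll-up of chemist time spent fielding basic sales-rep questions.
CHEMISTS_AFFECTED = 12            # chemists who field support calls
HOURS_PER_CHEMIST_PER_WEEK = 1.5  # time each spends answering rep questions
LOADED_HOURLY_RATE = 80           # dollars per chemist hour
WORK_WEEKS_PER_YEAR = 48

annual_cost = (CHEMISTS_AFFECTED * HOURS_PER_CHEMIST_PER_WEEK
               * LOADED_HOURLY_RATE * WORK_WEEKS_PER_YEAR)
# 12 * 1.5 * 80 * 48 = $69,120 of chemist time per year
```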

Ask the stakeholder to give you a dollar value they would associate with whatever problem they describe to you. Ask them: Is it worth $10K, $20K, $30K? Why? What benefit will ACME get by implementing this solution?

Sometimes, the act of asking them to assign a value will help the stakeholder or SME realize they need more data before jumping to solution design. After all, the data might help create a better solution!

In truth, if it is worth $10K or less, then you are not looking at a very robust learning solution…and if you are not looking at a robust solution, will you truly affect performance change? Even if a solution is 100% designed, developed, and delivered internally (no vendors), the cost is likely to quickly approach at least $10,000 when we factor in the time for a training person to design and build the solution, the time a SME will spend providing content expertise, and the time all the employees will spend completing the training.
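That $10K floor is easy to sanity-check with a back-of-the-envelope estimate. The rates and hours below are assumptions chosen only to show how quickly internal costs accumulate:

```python
# Hypothetical cost estimate for a fully internal eLearning build (no vendors).
DESIGNER_RATE = 50   # dollars/hour, loaded cost of the training designer
SME_RATE = 75        # dollars/hour, loaded cost of the subject matter expert
LEARNER_RATE = 40    # dollars/hour, average loaded cost per learner

design_hours = 120   # design + development for roughly one hour of eLearning
sme_hours = 20       # content interviews and review cycles
learners = 150       # employees who will complete the training
seat_time = 1.0      # hours each learner spends in the course

total_cost = (design_hours * DESIGNER_RATE
              + sme_hours * SME_RATE
              + learners * seat_time * LEARNER_RATE)
# 6,000 + 1,500 + 6,000 = $13,500 before a single vendor invoice arrives
```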

What about innovation?

This is the other element of the "secret sauce" of winning awards. You have to go beyond defining a problem and quantifying results. You need to think about how you did it DIFFERENTLY than others have done. How is the solution an advancement in the field? What new approaches does it use that might be a model for others?


Of course, the “innovation” must be relevant to the topic you are submitting the award for. If you have submitted your project for a “Best Use of Blended Learning” award, but the results of the project are not at all related to the blended learning approach, then your chances of winning are lower, even if the results are good.

In the award we submitted with Cisco, the “before” problem they identified was a challenge with getting new sales associates to retain large amounts of product and technical information. Through learner surveys and learning objective completion rates, they were able to determine that the spaced repetition built into Knowledge Guru games had a meaningful impact on solving their problem. In this case, the way gaming was connected to learning science was considered “innovative,” and the innovation mattered because it drove results for Cisco.

So, You’re Saying There’s a Chance?

In the end, there’s no guarantee that a particular learning solution will win an award, no matter what organization you submit it to. A small percentage of projects will win any given award, and even fewer will win a “Gold” distinction. Whether you plan to submit your work for awards or not, adopt an award-winning mentality by showing measurable results and using innovative designs and approaches to drive those results.


Building Skill and Knowledge, iEV, and Storytelling Tips: This Week on #BLPLearn

#BLPLearn is our way of saving all of the great content our team curates… and sharing it with the wider community. We'll take the best articles shared by our Learning Services, Multimedia, and Product Development teams in their weekly meetings and include them in the weekly #BLPLearn blog. We'll usually include some commentary from the original team member who found the article, too.

Our goal is to make the weekly #BLPLearn blog a dependable source for quality, curated L&D content. Check back every Thursday.


Rather than restricting the social media conversation to a 30-minute window, we're inviting everyone inside and outside BLP to share interesting links, thoughts, and articles with the #BLPLearn hashtag on Twitter. We'll check the feed once a week and include the best articles submitted via Twitter in the post, too.


Now that introductions are out of the way, let’s dive in to this week’s articles:

Games That Build Skill and Knowledge
Submitted by Sharon Boller, President and Chief Product Officer 

One of the reasons we created the Quest game type was to develop a game type that allowed for both knowledge acquisition AND skill building. We enabled people to create “performance challenges” that could be skill-based as opposed to knowledge-based while still retaining the knowledge component. Because Guru is a game engine that includes an authoring tool, we needed to make it very, very easy for people to develop a game quickly and with minimal to no game design skill.

We know that one of the primary needs our client base has is to help people build process knowledge and skill. Imagine you have the freedom and $$ to develop a 100% custom game. It's important for people to perform the process quickly and without errors. Asking questions about it won't be enough. They have to practice DOING it – over and over again. A game is an ideal tool for this frequent practice. An eLearning company in the UK developed an intriguing game for the UK division of McDonald's Corporation. The game teaches people how to operate the "till" (you have to love British English) and also incorporates a knowledge component in the form of Q&As. The business results were impressive, as was the implementation technique – merely embedding the game within an employee website/portal and letting employees discover it for themselves.

They designed achievements that linked effectively to what they wanted employees producing on the job (perfection, happy camper, etc.) and, in general, modeled lots of the traits of effective learning game design.

You can read about it here:

Example of Game that builds knowledge AND skill

Submitted by Brandon Penticuff, Technology Director 

Pop quiz: What’s the #1 concern that someone usually has when considering an electric vehicle?

It turns out that it's something called "Range Anxiety" – the concern that an electric vehicle's range would not sufficiently cover your driving needs.

The link/app I want to share with everyone today is designed to address this anxiety and to show you what your driving patterns would look like if you had an electric vehicle. In effect, it's a very targeted learning app.


  • What do you think, would this information address anxieties you might share?
  • This is a pretty unique way of problem solving: using what is essentially a data-collection app to compare your real-world results against a different model in the same environment. Any thoughts on how that could be applied to other learning?
  • One neat feature of the app is that if you agree to share anonymous data about your drives, it unlocks additional features. I thought this was a novel way to approach “pro features” instead of making the user pay for them.


4 Storytelling Tips From the Co-Creator of Blockbuster Mystery Podcast "Serial"
Submitted by Jennifer Bertram, Director of Instructional Design

I am obsessed with this podcast. I listen to it religiously every week – it’s my favorite “show”. There are some great storytelling tips in this article that I think we could translate to the stories we tell in our courses. How can we weave stories throughout the learning experience? How can we do a better job of parceling out details over time?

  • The idea of spreading the story out over multiple weeks because of the amount of content could really apply to a "bite-sized" learning solution. It helps with the goal of leaving the learner wanting more.
  • The narrator has a perspective and you can see her feelings/questions as you go. This helps you engage in the story. By taking this approach, you can really see the different perspectives of all of the people in the story.
  • Holding back some of the details of the stories is a great way to engage the audience and keep them in the story/learning.
  • Think about what questions you want the story to answer, worry about that more than the ending.

4 Storytelling Tips

Our “Recipe” for Learning and Remembering in Corporate Learning


Some employees get too little training. They sit through a few classroom sessions, see some slides, and get very little help at actually doing their job.

Others get too much training. The list of required eLearning courses is too long, and actually takes them away from their responsibilities. The learning and remembering ends up happening outside of, or in spite of, the training requirement.

Most organizations invest heavily in training their employees, yet employees still do not retain the critical knowledge they need to be successful. This is why we focus our research on why employees forget. How do our brains respond when we learn new information? Is there a pattern to forgetting?

Sharon Boller’s work has explored the divide between remembering and forgetting extensively. Her white paper, When Remembering Really Matters, identifies eight strategies, four for learning and four for remembering, that help fight forgetting.

learning and remembering as a “recipe”

In her presentation, Sharon illustrates how the elements required for learning and remembering fit together into a repeatable process. When used correctly, this process, or "recipe," can yield our desired outcomes.

Here’s a look at the entire recipe from start to finish:


What elements are required to learn?

Before we can remember anything, we have to first learn it! Research (and experience) tells us that motivation, relevant practice, and specific, timely feedback are all required for learning… but that’s not the whole story. These are all essential parts of the learning process, but we have to take remembering into consideration to complete our recipe.

What elements are required to remember?

In her white paper and presentation, Sharon presents four strategies to use when you really want learners to remember:

  1. Spaced intervals – not a single “glop”
  2. Repetition – several instances of it
  3. Feedback – with a requirement to do it right after making a mistake
  4. Stories

One particular reason spacing works is that it eliminates the “glop.” With learning, too much = nothing. If you overload the learner with information then none of it will stick. Space the learning out and use repetitions to cement the content.

Finally, story helps create context and an emotional response in the learner, both of which are proven to increase retention. This is one of the reasons that games can be such a powerful learning tool.

So what’s the real recipe for learning and remembering?



Motivation: Employees/players/learners need to be motivated to learn. The most obvious way to do this is to incentivize them somehow, and that can work, but that provides only extrinsic motivation. The best learning happens when the learner is intrinsically motivated. Think about what your learners might need to want to participate in the training. Could you make it more fun? Do they want to compete?


Relevant Practice: It is crucial that your learners practice. The saying "practice makes perfect" might be a cliché, but it's true. Think about ways you can encourage practice over time… and make sure it's relevant to the goals you set.


Specific, Timely Feedback: Feedback is one of the most essential ingredients because it allows your learners to correct mistakes and stops them from building any bad habits or repeating incorrect information. Behavioral psychology shows time and time again, however, that the feedback must be specific and it must be quick, so that the learner can make the connections between the correct feedback and their mistake.


Spacing and Repetition: Now we're getting into the ingredients that are crucial to long-term retention. Without repetition at strategically spaced intervals, learners will forget 30–90% of what they learned within 2–6 days. Spaced repetition is the secret to fighting this forgetting curve.
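One common way to operationalize spaced repetition is an expanding review schedule. The sketch below is a minimal illustration; the specific intervals are an assumption for demonstration, not taken from the white paper:

```python
from datetime import date, timedelta

# Illustrative expanding gaps between reviews, in days (an assumed schedule).
REVIEW_INTERVALS = [1, 3, 7, 14, 30]

def review_schedule(first_exposure: date) -> list:
    """Dates on which a piece of content should be reviewed again."""
    schedule = []
    day = first_exposure
    for gap in REVIEW_INTERVALS:
        day += timedelta(days=gap)
        schedule.append(day)
    return schedule

# Content first seen June 1 is reviewed June 2, June 5, June 12, June 26, and July 26.
```

Each review lands just as forgetting would otherwise set in, which is what makes the spacing fight the forgetting curve rather than a single "glop" of training.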


Story: As we stated above, story both gives the learner context and creates an emotional element that will help them retrieve the information later. It’s easier to remember an alien telling the safety guidelines that can help you keep your lab safe from invaders than it is to remember those same guidelines from a boring PDF.

Test the recipe in your own “Kitchen”

What do you think of these strategies? Have you applied any of them in your own training? What obstacles make it hard to do so?

Want to learn more about remembering? Get four strategies for improving long-term retention in Sharon Boller’s When Remembering Really Matters white paper.