
The afterglow of our three 2014 Brandon Hall awards is starting to fade away. Yes, excitement can only last for so long… and we are hard at work making plans to submit for 2015 awards. Many of the clients we talk to care about awards and would like the prestige that comes from having them attached to their training efforts.

But if you’ve never submitted for an award before, how do you know your learning solution is award-worthy? And what do you need to include in the submission to draft a winner?

The “Secret Sauce”

If you want to receive awards for the training you create, you have to be able to articulate two things: value and innovation. Too frequently, training functions cannot articulate the value their solutions delivered to the organization. They cannot describe a problem, or how their training solution helped solve the problem or at least reduce its severity. That’s the secret sauce.


The folks in L&D who do gather up all the fancy trophies and industry accolades have submissions that describe innovative solutions that solved a quantifiable problem. Think about it in terms of submitting a “before and after” story. Here are some examples:

  • Before we implemented training, we had numerous safety accidents on the XYZ machine. It was costing us $X and X days in lost productivity as a result of accidents. After training, our accidents decreased by X%, our costs dropped to $X, and our days of lost productivity decreased by X.
  • Before training, we were spending up to 24 months ramping up a new hire. Our team leads were self-reporting high levels of stress, and our employee surveys indicated low job satisfaction for those in team leadership roles. After we implemented our new employee onboarding training program, we cut the ramp-up to 12 months, a 50% decrease in the time required to achieve full productivity. While we do not yet have the results of the most recent employee survey, a poll of team leaders indicates that they perceive stress levels to be “significantly lower” than before we implemented the new onboarding program.
  • We have 90 people in the director of recruitment role in our organization. Before training, the annual employee turnover for the director of recruitment role was 30%. It cost our organization $18,000 for every new hire we had to make, which meant our annual spend on the recruitment, hiring, and training of this role was $486,000. After training, we were able to reduce turnover to 20% AND decrease the time to full productivity by 3 months. Our annual costs for recruiting and hiring decreased to $324,000, a 33% cost savings. (The arithmetic behind these figures is sketched out just after this list.)
  • Before training, we were spending up to 3 months at a customer site following installation of our product. After we launched the revamped customer education program, we reduced time on site by 30 days, which resulted in a cost savings of $8,000 per customer. In addition, satisfaction ratings improved from an average of 3.75 out of 5 to an average of 4.5 out of 5.
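
If it helps to see how the numbers in a story like the third bullet hang together, here is a minimal sketch of the before/after calculation. The figures are the illustrative ones from the example; swap in your own headcount, turnover rates, and cost per hire.

```python
# Before/after cost of turnover for the director of recruitment example above.
# The figures are the illustrative ones from the bullet; replace with your own data.

headcount = 90              # people in the director of recruitment role
cost_per_hire = 18_000      # recruiting, hiring, and training cost per replacement

turnover_before = 0.30      # annual turnover before training
turnover_after = 0.20       # annual turnover after training

annual_cost_before = headcount * turnover_before * cost_per_hire   # 27 hires -> $486,000
annual_cost_after = headcount * turnover_after * cost_per_hire     # 18 hires -> $324,000

savings = annual_cost_before - annual_cost_after
savings_pct = savings / annual_cost_before * 100

print(f"Before: ${annual_cost_before:,.0f}   After: ${annual_cost_after:,.0f}")
print(f"Savings: ${savings:,.0f} ({savings_pct:.0f}%)")
```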

The common denominator in these stories is some form of data that identifies a problem, quantifies its impact to the business, and then quantifies the results obtained from implementing a training solution. Too often, training is not quantified.

So what can you do when you don’t have data? The simple answer is to get some, and here are some tangible techniques you can use to help you do that.

Probe more than once. Don’t accept the first answer as the final answer.

If you ask a subject matter expert or stakeholder, “What is the problem, and how can you quantify it?” avoid accepting an initial response that might go like this:

“We don’t have any actual numbers, but I’ve been hearing from the field that this is an issue. I’ve talked to our lab chemists, and they tell me that they are answering the same questions over and over. They are sharing basic information that field reps should really know themselves. If the reps were able to answer these questions themselves, I know it would be beneficial to us.”

Your stakeholder or SME may well be right, but you should probe. If this is really a big issue that costs the company money, chances are that data is not as difficult to assemble as the SME thinks. Here are things you can ask to help quantify the problem and its impact on your company:

  • How many chemists are affected by a sales rep’s need to call for technical support?
  • Ask the chemists: In a given week, how many calls or emails from sales reps do you respond to, and what are the most common issues? How much time do you spend on this per week: 30 minutes, an hour, two hours?
  • Ask a handful of key distributors: How frequently do you ask a sales rep a product question that he or she cannot answer? If he or she cannot answer you immediately, can you quantify any dollar impact to your business? What about your perception of ACME as a supplier?

Ask the stakeholder to give you a dollar value they would associate with whatever problem they describe to you. Ask them: Is it worth $10K? $20K? $30K? Why? What benefit will ACME get by implementing this solution?
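
One rough way to turn those answers into a dollar figure is to multiply the reported time by the number of people affected and a loaded hourly rate. The sketch below is only an illustration: the chemist count, hours per week, and hourly rate are placeholder assumptions, not data from any real client.

```python
# Rough annual cost of chemists fielding questions that sales reps should know.
# Every input is a placeholder assumption; replace with the answers you gather.

chemists_affected = 12         # chemists who field calls/emails from sales reps
hours_per_week_each = 2.0      # average time each chemist reports spending per week
loaded_hourly_rate = 75        # salary + benefits + overhead, per hour
working_weeks_per_year = 48

annual_hours = chemists_affected * hours_per_week_each * working_weeks_per_year
annual_cost = annual_hours * loaded_hourly_rate

print(f"{annual_hours:,.0f} chemist-hours per year, roughly ${annual_cost:,.0f}")
```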

Sometimes, the act of asking them to assign a value will help the stakeholder or SME realize they need more data before jumping to solution design. After all, the data might help create a better solution!

In truth, if the problem is worth $10K or less, then you are not looking at a very robust learning solution… and if you are not looking at a robust solution, will you truly effect performance change? Even if a solution is 100% designed, developed, and delivered internally (no vendors), the cost quickly approaches $10,000 once you factor in the time a training person spends designing and building the solution, the time a SME spends providing content expertise, and the time all the employees spend completing the training.
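
To see why the floor sits around $10,000 even with no vendor involved, here is a back-of-the-envelope sketch. Every hour count and rate in it is a placeholder assumption; adjust them to your own organization.

```python
# Back-of-the-envelope cost of a fully internal training solution (no vendors).
# All hour counts and rates are placeholder assumptions; adjust to your organization.

designer_hours = 100        # instructional designer time to design and build
designer_rate = 50          # loaded hourly rate

sme_hours = 20              # subject matter expert time providing content expertise
sme_rate = 75

learners = 100              # employees who will complete the training
seat_time_hours = 1.0       # time each learner spends in the course
learner_rate = 40

internal_cost = (designer_hours * designer_rate
                 + sme_hours * sme_rate
                 + learners * seat_time_hours * learner_rate)

print(f"Estimated internal cost: ${internal_cost:,.0f}")   # roughly $10,500 here
```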

What about innovation?

This is the other element of the “secret sauce” of winning awards. You have to go beyond defining a problem and quantifying results. You need to think about how you did it DIFFERENTLY than others have done. How is the solution an advancement in the field? What new approaches does it use that might be a model for others?


Of course, the “innovation” must be relevant to the award you are submitting for. If you have submitted your project for a “Best Use of Blended Learning” award, but the results of the project are not at all related to the blended learning approach, then your chances of winning are lower, even if the results are good.

In the award we submitted with Cisco, the “before” problem they identified was a challenge with getting new sales associates to retain large amounts of product and technical information. Through learner surveys and learning objective completion rates, they were able to determine that the spaced repetition built into Knowledge Guru games had a meaningful impact on solving their problem. In this case, the way gaming was connected to learning science was considered “innovative,” and the innovation mattered because it drove results for Cisco.

So, You’re Saying There’s a Chance?

In the end, there’s no guarantee that a particular learning solution will win an award, no matter what organization you submit it to. A small percentage of projects will win any given award, and even fewer will win a “Gold” distinction. Whether you plan to submit your work for awards or not, adopt an award-winning mentality by showing measurable results and using innovative designs and approaches to drive those results.

See our award-winning learning solutions: http://www.bottomlineperformance.com/client-success/#awards