Monday, February 09, 2015

Session Notes: "Driving Impact with Learning Analytics" Jeff Grisenthwaite #TrainingMag

These are my live blogged notes from the Training 2015 Conference and Expo happening this week in Atlanta. Forgive any typos or incoherencies.

Metrics that Matter with Jeff Grisenthwaite, VP of Client Success at CEB (www.executiveboard.com). 

Scrap learning -- on average across industries, 45% of learning is "scrap": delivered but never applied on the job. ATD research puts annual learning spend per employee at $1,195 -- so at 45% scrap, we're wasting roughly $537 per learner!
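A quick back-of-the-envelope sketch of that math in Python (the per-learner figures are from the talk; the 10,000-person headcount is a made-up example):

```python
# Scrap-learning cost, using the figures above.
ANNUAL_SPEND_PER_EMPLOYEE = 1195  # ATD research figure, USD
SCRAP_RATE = 0.45                 # share of learning never applied on the job

wasted_per_learner = ANNUAL_SPEND_PER_EMPLOYEE * SCRAP_RATE
print(f"Wasted per learner: ${wasted_per_learner:,.2f}")   # ~$537.75

# Scaled to a hypothetical 10,000-person organization:
print(f"Wasted per year:    ${wasted_per_learner * 10_000:,.0f}")  # ~$5.4M
```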

So how do we reduce scrap learning and create more impactful learning experiences? Our goal: reduce scrap, improve performance.

To increase the impact of learning programs, L&D needs to ensure decision making and continuous improvement are supported by analytics.

Do you have these analytics in place?

Comprehensive Metrics

  • Efficiency (we've been good at reporting this, because it's what the LMS reports): cost, volume, utilization, speed (time to market -- how long it takes to get the learning solution out there in response to the need).
  • Effectiveness. How well is the learning working? Learning, impact, support, alignment (map the learning & development portfolio to the business priorities of the org). Take a broader view of what's happening on the job. How are your vendors doing?
  • Outcomes. You need to be able to talk about the things that the business cares about. How your program is actually influencing business metrics.

We need to move from smile sheets to smart sheets, broadening our view from L&D to what matters in the business. L&D typically focuses on the instructor, content, learning environment, knowledge gain, and support materials. The business cares about manager support, on-the-job support, performance improvement, business results, and ROI.

Multiple Sources

We do want to keep getting feedback from the learner, but we also want more: feedback from multiple sources.
  • Learning intervention: post-event evaluation from the learner AND the instructor (instructors can tell you whether the right people are attending the course).
  • Measure impact: learner follow-up evaluation (are they better at doing their job?), manager follow-up evaluation (get the manager's perspective -- has the learner's performance changed?), business metrics follow-up evaluation.

A good benchmark is 800:1 (staff supported per L&D team member). So what do you do if you're a one-person shop supporting a staff of 10,000 -- 12.5 times the benchmark load? Go after one thing: he says learner post-event evals are actually pretty reliable predictors -- if you ask learners whether they're going to apply the training, their answers track what actually happens on the job.

If you have to pick three, go with learner post-event, learner follow-up, and manager follow-up.
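A minimal sketch of turning that post-event predictor into a scrap-learning estimate (the question wording, 1-5 scale, and cutoff are my assumptions, not from the talk):

```python
# Hypothetical post-event responses to "Will you apply this training on the
# job?" on a 1-5 scale (1 = definitely not, 5 = definitely will).
responses = [5, 4, 2, 5, 3, 1, 4, 5, 2, 4]

# Treat 4-5 as "will apply"; everything below counts toward scrap.
will_apply = sum(1 for r in responses if r >= 4)
scrap_rate = 1 - will_apply / len(responses)
print(f"Estimated scrap learning: {scrap_rate:.0%}")  # 40% for this sample
```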

Remember, our goal is to improve performance within the organization.

Performance Benchmarks
  • External benchmark: how does your company compare to competitors on scrap learning rates, manager support for learning, delivering training at the time of need, or increases in sales as a result of training?
  • Internal benchmark: imagine a ranking of your own courses and vendors.

Get the fundamentals right before you start embracing the next big thing.

Process Automation

Where does your time go in terms of analytics? For most orgs, 80% is spent on manual gathering, data entry, and reports, which leaves very little time for action. Aim to spend the same amount of time overall, but shift it from non-value-added admin work to action.

Automate as much of the process as you can -- both data collection and insights.
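A minimal sketch of what that automation might look like -- aggregating an evaluation export into a course summary instead of compiling it by hand (the file name and columns, "course" and a 1-5 "score", are hypothetical):

```python
# Aggregate evaluation scores per course and flag low performers.
import csv
from collections import defaultdict

scores = defaultdict(list)
with open("eval_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        scores[row["course"]].append(float(row["score"]))

with open("course_summary.txt", "w") as out:
    for course, vals in sorted(scores.items()):
        avg = sum(vals) / len(vals)
        flag = "  <-- review" if avg < 3.5 else ""
        out.write(f"{course}: {avg:.2f} (n={len(vals)}){flag}\n")
```

Scheduled to run after each export, a script like this moves the gathering-and-entry time toward reviewing results and taking action.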

Each role will have different insights.

Instructor -- daily (what should I change before my next class?), monthly (which courses could I facilitate better?), quarterly (how can I improve my overall delivery?).

Same patterns across different roles, although different questions.

L&D exec -- monthly (which vendors will we continue working with?), quarterly (where should we allocate resources?).
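These role-and-cadence patterns lend themselves to configuration; here's a sketch of how an automated reporting pipeline might encode them (the structure is my assumption):

```python
# Role -> cadence -> question each automated report should answer.
REPORT_QUESTIONS = {
    "instructor": {
        "daily": "What should I change before my next class?",
        "monthly": "Which courses could I facilitate better?",
        "quarterly": "How can I improve my overall delivery?",
    },
    "l&d_exec": {
        "monthly": "Which vendors will we continue working with?",
        "quarterly": "Where should we allocate resources?",
    },
}
```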

Teeing up for the Big Meeting
This is when you share your data back.

Ask this -- "what are your business goals?" and not "what metrics do you want me to report on?"

Find the story in your data:
1. Scan the data: review high-level summary reports across all programs.
2. Analyze: use detailed reports to pinpoint wins, areas of focus, and root causes.
3. Synthesize: combine summary data and detailed analysis to craft an insightful story.
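A toy illustration of that scan -> analyze -> synthesize flow, with made-up numbers:

```python
# 1. Scan: spot the outlier in high-level, course-level summary scores.
summary = {"Sales 101": 4.6, "Safety Basics": 4.2, "CRM Rollout": 2.9}
worst = min(summary, key=summary.get)

# 2. Analyze: drill into that course's detail to find the root cause.
detail = {"instructor": 4.5, "content": 4.3, "manager support": 1.8}
root_cause = min(detail, key=detail.get)

# 3. Synthesize: combine the two views into one line of story.
print(f"{worst} is underperforming ({summary[worst]}); "
      f"the weak spot is {root_cause} ({detail[root_cause]}).")
```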

"We were expecting this and so far this is occurring and here's the data to back it up."
"Surprises -- we didn't expect this, but here's the data to back it up."

Focus on the future and not the past -- how do we build on this.

Present:
  • your theme (elevator pitch)
  • insights (surprises, validated hunches, forecasts)
  • recommendations (address issues, expand impact, improve forecast)
  • requests (ask for more resources, business data, support for the program)
  • broaden (next phase, expanded audience; don't just focus on L&D -- look at the business)

Don't go into the meeting just trying to prove value -- that puts you on the defensive. Instead, focus on how to improve results. You might recommend reducing scrap learning...
