More on Re-evaluating Evaluation – Jack Phillips and ROI

Dashe & Thomson

I have been blogging a lot about Training Evaluation this year—mostly Kirkpatrick, but also Brinkerhoff and Scriven. I just realized that I haven’t included a single word about Jack Phillips, who introduced Return on Investment (ROI) as Level 5 to Kirkpatrick’s Four Levels of Evaluation. My first exposure to Phillips’ ROI—although I didn’t realize it at the time—was through a colleague who introduced me to Kirkpatrick’s Four Levels.

Training Evaluation Beyond Kirkpatrick

Training Industry

Showing the return on investment from training is difficult because it involves many components and metrics that are not readily measurable, such as the behavioral changes in the learner’s day-to-day work as a result of the training they attended. Today, when we discuss training evaluation, we likely reference Kirkpatrick’s model with the familiar four levels: reaction, learning, behavior, and results.


An Overview: Phillips' Model for Learning Evaluation

Watershed

When we originally published our Learning Evaluation eBook and blog series, we covered four learning evaluation models representing a range of approaches and perspectives: Kirkpatrick, Kaufman, Brinkerhoff, and Anderson. While there are many other learning models we’ve yet to cover, the one we’re asked about most often that’s not mentioned in our original series is Phillips' Learning Evaluation Model.

Why We Should Stop Talking About ROI in Training

Mindflash

Many people look at Don Kirkpatrick’s work from as early as 1959 as the beginning of ROI in learning and development. It was in his early work that Kirkpatrick developed his four-level model: Level 1: Reaction. Another model and methodology, from Jack Phillips, includes a fifth level, ROI, which adds a financial metric to the mix. In concept, Kirkpatrick’s levels seem valuable. We Don’t Need New Metrics.


Is this thing on? Tips for measuring course effectiveness and return on investment

Obsidian Learning

The most commonly used method of accomplishing this is Kirkpatrick’s Four Levels of Evaluation. Kirkpatrick and Kirkpatrick (2006) compare Level 1 to measuring customer satisfaction and note that when learners are satisfied with training, they are more motivated to learn. Historical support metrics will be analyzed to determine support call trends before and after training.

Workplace Learning 2025 – What is the ROI of learning and development?

Learning Pool

Time, costs, and measurement metrics are among the drivers of this mismatch. “While ROI can be elusive, organisations that do it well are starting with the business metric and examining performance outcomes,” says CIPD head of L&D Andy Lancaster. Six-Part Series: Workplace Learning 2025.


Training Evaluation – 5 Best Ways to Evaluate Training Effectiveness and Impact

Kitaboo

The Kirkpatrick Taxonomy Model. Kirkpatrick Taxonomy is one of the most widely used methods for evaluating the effectiveness of corporate training programs. The Phillips ROI Model.
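The Phillips model's fifth level is usually expressed as a percentage: net program benefits divided by program costs, times 100. A minimal sketch of that calculation, using hypothetical figures (the function name and dollar amounts are illustrative, not from any of the articles above):

```python
def phillips_roi(benefits: float, costs: float) -> float:
    """Phillips Level 5 ROI as a percentage of program costs:
    ROI (%) = (net program benefits / program costs) * 100
    """
    return (benefits - costs) / costs * 100

# Example: a program costing $40,000 whose benefits are valued at $60,000
roi = phillips_roi(benefits=60_000, costs=40_000)
print(f"ROI: {roi:.0f}%")  # ROI: 50%
```

The hard part, as the excerpts above note, is not the arithmetic but isolating and monetizing the benefits that go into the numerator.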

The Essential Guide to Learning Analytics in the Age of Big Data

Lambda Solutions

What metrics and sources to use in implementing learning analytics. This data can be sorted, filtered, and correlated to specific metrics, such as activity and course completions. The Kirkpatrick Evaluation Model is the gold standard for assessing online training programs.

Watershed's 7 Steps of Learning Evaluation

Watershed

We've combined our favorite elements from Kirkpatrick, Kaufman, Anderson's Value of Learning, Brinkerhoff's Success Case Method, and Phillips' Model to create our version of a "super method" of how to evaluate learning. Step 2: Define metrics for effective evaluation. Identify success metrics most appropriate to the organization. Ensure that these metrics cover the complete story from learning through to achievement of the program goals.

Unlocked Learning—Training analytics made easy

Coassemble

Instead of hunting down metrics, imagine receiving every training metric in one dashboard. For your admins, it allows them to create training content and view various metrics, including when and where a user has accessed and self-enrolled in a training session. The earliest and most widely accepted formalized training assessment process was the Kirkpatrick Model by Dr. Donald Kirkpatrick.

What Is Learning Evaluation?

Watershed

An Overview: Kirkpatrick Learning Evaluation Model. An Overview: Phillips Model for Learning Evaluation. Step 2: Define metrics for effective learning evaluation. Knowing how to aggregate and evaluate all types of learning is a priority for many learning and development practitioners. In fact, many say there's a growing demand from their organizations to show L&D’s impact across the organization.

Weighing the Options: Different Schools of Thought

CLO Magazine

The framework for learning evaluation and measurement embraced by most in the industry starts with Kirkpatrick. Kirkpatrick is now retired and the honorary chairman of Kirkpatrick Partners LLC, a learning and evaluation consultancy led by his son, James, and his daughter-in-law, Wendy. Jack Phillips is chairman of ROI Institute Inc.

Evaluation and ROI in Compliance Training

Interactive Services

While ROI is an important metric, it shouldn’t be the sole focus of a comprehensive compliance training initiative. Learning management systems (LMSs) can provide metrics and generate reports that give you data pointing to the effectiveness of training, like time spent on each module, assessment scores, number of times an assessment was attempted, and course completion rates.
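The LMS metrics listed above are straightforward to compute once the data is exported. A minimal sketch, using a hypothetical export format (the record fields and figures are invented for illustration, not from any particular LMS):

```python
# Hypothetical LMS export: one record per learner for a single course.
records = [
    {"learner": "a", "completed": True,  "score": 85,   "attempts": 1},
    {"learner": "b", "completed": True,  "score": 70,   "attempts": 2},
    {"learner": "c", "completed": False, "score": None, "attempts": 1},
]

# Course completion rate across all enrolled learners.
completion_rate = sum(r["completed"] for r in records) / len(records) * 100

# Average assessment score, ignoring learners with no recorded score.
scores = [r["score"] for r in records if r["score"] is not None]
avg_score = sum(scores) / len(scores)

print(f"Completion rate: {completion_rate:.0f}%")    # Completion rate: 67%
print(f"Average assessment score: {avg_score:.1f}")  # Average assessment score: 77.5
```

Metrics like these sit at Kirkpatrick Levels 1 and 2; connecting them to behavior and results still requires the harder evaluation work the models above describe.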

The Elevator Pitch for Trainers

Training Industry

The conversation that began with Kirkpatrick and continues with Phillips takes place at virtually every training conference, is represented as an article in almost all training publications and is a topic of debate whenever training professionals meet. The dialog focuses on metrics, or the lack thereof; ROI and the challenge of measuring it; and the meaning of “business impact.”

ROI: It’s All About Proximity

CLO Magazine

I’m not saying training does not contribute to a learner’s on-the-job performance, but in the sense of measuring a Kirkpatrick or Phillips Level 4/5 impact, we all know “contributing” is not a strong word. In the classic Kirkpatrick Levels 1-4 model, training is best assessed at Levels 1 and 2. The next fair metric related to training is knowledge gain. Embrace a more robust approach to measuring learning outcomes.


Putting Thought to Work: Evaluation in Practice

CLO Magazine

The company organizes its learning evaluation data using a tool, Metrics That Matter, from KnowledgeAdvisors that connects directly to its learning management system. Managers input progress measures into the LMS, which streamlines the data to Metrics That Matter. The tool automates levels 1 through 5, including the Phillips ROI model, to help learning leaders report and analyze the data.

Instructional Design Basics: What Is ADDIE?

Convergence Training

Use the information you discovered while learning about the task performance and organizational goals to begin creating evaluation strategies for your training and KPIs/metrics to assist with that evaluation. The Phillips “ROI” Model.


What is stopping companies from measuring learning: Skillsets, datasets, toolsets or mindsets?

Xyleme

By adapting our language and perspective to integrate terms like traffic, bounce rate, conversion, time on site, and social sharing into an L&D context, we create a dynamic system that uses data to drive smarter decision-making and draw on accurate metrics for more actionable insights. Yet L&D professionals continue to herald formal evaluation (Kirkpatrick and Phillips) as industry standards, essentially disregarding new tools at their disposal.

CLO Symposium

Jay Cross

It replaces the inward-looking mumbo-jumbo of Kirkpatrick and Phillips with business measures. “Nobody cares about internal metrics.” Along with my Internet Time Alliance colleagues Jane Hart and Clark Quinn and several hundred chief learning officers, I attended the Fall CLO Symposium this week. Our theme was “Game-Changing Learning: Development for the New Normal.” Sports metaphors filled the air.


Free L&D webinars for May 2018

Limestone Learning

Discuss how to correlate learner engagement with other critical organization metrics to tell an important story about the value of L&D investments. Show you how to leverage Kirkpatrick’s model of assessment and create a targeted strategy around Level 3 and 4 assessments. What metrics for training ROI do you use? What metrics to set up, how to install them, and how to follow them up to get results. Phillips, along with her husband and business partner Dr. Jack J.