Categories
Continuous Learning

ROI vs. EOSC – Evidence of Sustained Capability

This return on investment (ROI) thing is getting out of hand. There is no question that valid ROI is invaluable in justifying the decision to make [or not make] key investments. Still, I have to ask, “So what is actionable about ROI after justifying/verifying an investment decision?” To answer that, we seek event-specific evidence to confirm good or bad investment decisions; I cannot see anything else actionable in it. Seeking ongoing evidence that we created a sustained capability serves a better purpose. I do not know who said this, but it is so true: “The pursuit of true ROI does not yield a good ROI.” That said, when the concept of ROI falls into the wrong hands, it becomes a knee-jerk, ritualized, abused, and misapplied expectation, ultimately morphing into a ridiculous exercise I refer to as Return on Every Damn Thing (ROEDT).

Do not misunderstand what I am trying to articulate here. I can already feel vibrations in the Force where a CFO somewhere just screamed, “Off with his head!” This posting is not about “the WHAT” [the act of evidence-based decision-making] – it is about “the HOW” [the choice of pursuing ROI vs. ROSTM – Return on Something That Matters]. For me, that “something” is our ability to create and sustain a capability in the organization through training, through technology, through [insert your choice of action here]. ROI is a joke when the result is CYA insurance on a potentially career-limiting investment decision; or when it is used to support the decision to fire the person whose bright idea it was to make the investment in the first place; or when it is used to validate that a gala event is necessary to celebrate a wise investment decision. What other action(s) does ROI support, really?

Nearly ten years have passed since my first opportunity to project ROI on a new piece of LMS technology in an environment where an LMS did not exist – we are talking brand spanking new. The CFO required validation of the return we could expect before he would release the funds to make the purchase. Okay…so…you want me to tell you what our return will be if we do something we have never done before, with no experience doing it and no historical data anywhere in the organization to use as a precedent? And he did.

And we did, as I am sure many of you have done, by benchmarking similar implementations in similar industry settings and similar-size organizations, with similar technology, etc. We also researched all manner of external evidence sources, consulted with vendors, and investigated referrals to support a best guess on the return we could expect. The process was rigorous and hugely time-consuming, but it got us to a green light to make the purchase. Turns out THAT was the real outcome of the ROI drill. After all that effort, we never received a request to prove we actually achieved the ROI we projected. We did it anyway…and we shared it…and it was good.

I still scratch my head over why Finance never asked us to produce it formally. Then again, that scenario speaks to my point – was ROI necessary, or was it a ritual…or was it a check-box gate we had to pass through to get a budget approved? This may describe a scenario some of you have experienced as well. For me, this was a blatant example of an ROI exercise that did not render a solid ROI on the effort. We realized an awesome ROI on the technology, but nothing actionable came out of it…aside from planning the celebration party over none of us getting whacked for suggesting we drop $1.4 million on an LMS.

Large technology investments will likely drive the decision to pursue an ROI projection. The more tangible the impact, the easier it is to extract valid ROI. LMSs, virtual classroom platforms, and EPSS are technologies whose significant impact is measurable and can easily render an objective ROI result. I wish there were other paths to take…but there are side benefits – the actionable data one uncovers along the way. If ROI is unavoidable, make sure whatever evidence you assess is going to yield tangible metrics that remain accessible on an ongoing basis. What I am suggesting is to determine which data yield actionable information, which in turn pre-determines viable learning analytics that enable confirmation of sustained capability.

Okay…you might be wondering why I am all raked up in a pile over this ROEDT mentality. Simple: it makes me nuts when somebody asks for ROI on something so multi-faceted that there are no valid benchmarks to benchmark against. What is the ROI on using Twitter or Yammer? Who cares? We cannot default to ROI when we need actionable evidence of impact on some aspect of business operations that drives outcomes. We cannot gauge whether or not we are sustaining a capability by deriving ROI. In fact, if we have defined what a sustained capability looks like…or what a sustained capability produces…or prevents…or saves…or expands…or enhances, then who needs ROI?

It is nearly impossible to extract valid ROI when impact is highly subjective, fraught with soft-dollar implications, and/or diluted by multiple unrelated factors that also contribute to the results. When it is virtually impossible to isolate the impact of your intervention, I find the ROI on extracting ROI is a bust. Jack Phillips, in his book Return on Investment, clearly explains his methodology for getting there, but a couple of layers of subjectivity creep in: percentages representing your level of confidence in your hypotheses, and the proportional contribution of your intervention relative to other sources affecting the results. I understand the formula and the approach, but I question the value of the effort when the subjectivity is so…subjective. More than that, I question the extra effort when actionable evidence is MIA [missing in action].
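To make those two subjective layers concrete, here is a rough sketch of a Phillips-style calculation. The function name and every figure in it are invented for illustration; the two percentage adjustments are exactly the judgment calls I am questioning.

```python
def phillips_roi(gross_benefit, program_cost, contribution_pct, confidence_pct):
    """Adjust the measured benefit by two subjective estimates, then
    compute ROI as (net benefit / program cost) * 100."""
    # Discount the gross benefit by the estimated share attributable
    # to the intervention, and again by the estimator's confidence.
    adjusted_benefit = gross_benefit * contribution_pct * confidence_pct
    net_benefit = adjusted_benefit - program_cost
    return (net_benefit / program_cost) * 100

# Hypothetical figures: $500,000 in measured improvement from a
# $150,000 program, with training credited for 60% of the result
# at 75% confidence.
roi = phillips_roi(500_000, 150_000, 0.60, 0.75)
print(f"Adjusted ROI: {roi:.0f}%")  # prints "Adjusted ROI: 50%"
```

Notice that nudging either estimate by a few points swings the final percentage substantially, which is the heart of my complaint: the arithmetic is trivial, but the inputs are opinions.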

So…where do we go with this? We need a strategy to defuse the knee-jerk ritual of deriving ROI, especially when validating impact through the capture of tangible evidence is enough to support decision-making. Our objective should be to stay on track toward that ultimate goal of creating and sustaining capability. ROI serves the bean counters in Finance, more so when they actually follow up to see if it actually happened. Capabilities we sustain through our decisions, investments, and workforce actions – and that we can track through dashboard-visible analytics – are both actionable and relevant. Those measures tell a much more meaningful story and offer a much better return on actionable decision-making than ROI does. When nothing actionable comes out of ROI specific to operational decision-making, I question the resources spent to get there. I vote we start a new ritual and go after something more useful to the business, like EOSC – Evidence of Sustained Capability.

What do you think? How can we deflect or redirect the thinking behind ROI?

Gary G. Wise
Workforce Performance Advocate, Coach, Speaker
gdogwise@gmail.com 
(317) 437-2555
Web: Living In Learning
LinkedIn

8 replies on “ROI vs. EOSC – Evidence of Sustained Capability”

Gary, I agree with you. This might seem surprising since the name of my company – ROI Learning Services – would suggest that I am an advocate of focusing on ROI. The answer is a little more nuanced. I believe that ROI is very important and should be measured much more than it is; however, the key thing is to define what is meant by “return.” Most people assume that “return” must be measured in dollars or some other relevant currency. That is not necessary and, in fact, it leads to the ROEDT problem to which you referred.

On my company website (www.roi-learning.com) I introduce the ROI Learning Scorecard™ which is based on the Balanced Scorecard concept created in 1992 by Dr. Robert S. Kaplan and Dr. David P. Norton. Having a balanced scorecard requires you to plan, manage, and measure your organization from four perspectives:

1. The Learning and Growth Perspective
2. The Business Process Perspective
3. The Customer Perspective
4. The Financial Perspective

This philosophy is consistent with our holistic approach to training and organization development. By using our ROI Learning Scorecard™ companies can focus on and invest in the right programs to build and nurture a continuous learning organization with multiple benefits to their business, including but not limited to higher revenues and profits.

Thanks for starting this discussion.

Verne Morland
Managing Partner
ROI Learning Services

Hey Verne!

Nice to hear from you! You and I have had conversations about the “return” on learning, directly and indirectly, in the context of numerous discussions over the last couple of years. You nailed my whole point with your comment that ROI is “a little more nuanced” – I could not agree more. I think for the purpose of your business “sweet spot” you need to use “ROI” because that is in the forefront of the minds of potential clients. Our job in selling innovation in learning, be it technology, or social media, or some other new paradigm yet to surface, is constantly delivering an education to prospective buyers. You are indeed an advocate, my friend – an advocate of driving a “return” on any investment made to serve learning continuously across an ecosystem. We’re of the same mindset. I am not so much picking on ROI as an institution as I am trying to break the paradigm that there has to be a line-of-sight financial return on the training efforts we take. Too many other variables “muddy the water” when trying to isolate the impact of training. What we find in terms of impact is “nuanced” and it points to a contribution…convoluted as it may be…and I see that as “evidence.” While I do advocate a “return” on the investments we make in learning, I cannot in good conscience call any intervention a success unless the capability impacted is sustainable. When we are forced down the ROI road, the good news is that the acquisition of evidence is going to happen as well, and it is often in hand before the magic ROI percentage or ratio the bean counters seek.

Take good care, and may your ROI always remain positive…clear evidence of sustained capability!

G

Perfect. Not too far off from the latest Kirkpatrick ROE vs. ROI line of thought. I like the “evidence,” since “proof” is so elusive with so many variables affecting performance, and I’m pretty excited by the “sustained capability” as evidence of something that really matters. Exactly the right concept, to my way of thinking.

I will add the disclaimer that I think you can prove, or get pretty close to proving, ROI with sales training/sales effectiveness/sales performance work, but it’s also tougher than most people think, and even tougher for those outside the sales field. Essentially, you all end up having to align on what lies to agree to. 😉 But even when you do prove it, it doesn’t seem as valuable as a standalone study (which it usually is) as having evidence of sustained capability.

Hats off to you, on this one.

Mike Kunkle

[…] As stated several times throughout this library topic (and in materials linked from it), too many strategic plans end up collecting dust on a shelf. Monitoring and evaluating the planning activities and the status of implementation of the plan is — for many organizations — as important as identifying strategic issues and goals. One advantage of monitoring and evaluation is to ensure that the organization is following the direction established during strategic planning. That advantage is obvious. However, another major advantage is that management can learn a great deal about the organization and how to manage it by continuing to monitor and evaluate the planning activities and the status of the implementation of the plan. Note that plans are guidelines. They aren’t rules.
Basics of Monitoring and Evaluating and Deviating from the Strategic Plan
7 Ways to Test Your Strategic Planning Approach
The 2010 Twelve-step Checklist to Help You Evaluate Your Strategic Business Planning Process
ROI vs. EOSC — Evidence of Sustained Capability […]
