Thursday, August 6, 2009

Return on expectations

Very often, learning professionals are challenged to justify training investments by providing ROI figures. Let's be honest: it's almost impossible to accurately calculate the ROI for learning. So many factors impact a learner's performance that it is difficult to credit or blame a training initiative for a change in performance, particularly as it relates to monetary measures.

A viable alternative to ROI is measuring ROE, or return on expectations. What do you expect people to DO differently after training? Can you measure that change in behavior? In most instances, this is a much easier way to measure the success of a training initiative. Have you reduced the number of help desk calls? Have you increased the amount of time a sales representative spends in the field? Have you shortened the average call length for call center reps, or increased call center satisfaction ratings? All of these measurements eventually turn into money in some way. Why not advocate measuring direct training outcomes as the true measure of a training initiative's "worth," instead of a trickle-down ROI that can't be as easily controlled or claimed as a direct correlation?


  1. I've been using VOI (Value on Investment) for a few years; most people respond to this with 'huh?'

    I've worked for the US Coast Guard for nearly two decades. In the organization's context, monetary ROI rarely makes sense.

    I find it's a lot better to frame the projected benefit in more dimensions than a simple monetary return. Value can manifest in many ways.

    Efficiency is one that can impact the bottom line.

    We shoot for enhanced capability in many cases. The investments in tools, training/performance support, and process aren't going to have a direct monetary return. The return is responsiveness, coverage, and service. The expected value returned is lives saved, increased crew safety, etc.

    As a side note, part of my heartache with the whole ROI calculation argument is that the process tends to focus exclusively on monetary factors. In many cases this considers only the short-term returns and not the long-term impact.

  2. To make matters worse, it would appear that not all monetary returns are created equal.

    A former employer was a stickler for business cases that showed positive ROI. But the financial team would only count 'hard' dollars. So what if our solution reduced the amount of time it took a call center agent to find an answer? Who cared if you could simplify life for a front-line manager, giving them more time for developing their staff?

    Unless your business case included an increase in revenue, reduced expenses, or eliminated head count, it was unlikely that you would be able to compete against other projects for limited capital resources.

    So instead of being able to make incremental investments to improve the learning environment, you're left using band-aids and duct tape. Eventually, the 'system' implodes and needs to be completely overhauled, usually with much gnashing of teeth.

  3. I tend to quantify ROE as "return on engagement" rather than "return on expectations," especially when it comes to reporting on the success or failure of a given content model. For example, by measuring abandonment rates at given points, you can determine whether a piece of content is provided in the correct context. ROI is necessary, but it rarely tells the entire story.