In the latest issue of CLO magazine, the story that caught my attention from the front cover was the one by Timothy Hill of Blackboard, Inc., entitled "Measure Smart: Trade ROI for IOB" (it's in NXTbook format). After reading it, I must confess I'm not sure what the hubbub is about. Have I missed the point?
Before I go further, let me reveal that I am currently pursuing my ROI certification through my Capella University master's program. The CLO article is actually very timely, as I am at the point in the course where I am putting together my project plan and my data collection and analysis strategy. So naturally, being in the thick of the issue at the moment, I was more than interested in seeing what Hill had to say.
Hill's article makes the case that the return on investment evaluation model - better known by its acronym, ROI - is insufficient for measuring the true impact of training interventions on a business. Hill proposes a new system, called impact on business, or IOB. As Hill says, IOB "...directly aligns employee development to business goals." He goes on to detail the ways in which businesses must tie training results to bottom-line goals. Bottom-line goals. That's a phrase I've been hearing a lot of late. Why? Because that's exactly what Jack Phillips' ROI methodology targets.
Hill does little to explain what IOB actually is. He says it is a means for understanding how well learners in a training program are engaged, and that understanding this can lead to understanding how the business was substantively affected by the training. However, he doesn't offer a model or particular perspective to back up his case, and for all intents and purposes, I can't see how the ROI methodology fails to do what Hill claims it can't. After all, a properly run Level 2 evaluation reports the knowledge level of students after they have completed the program, and those results can be compared to pre-test measures or other baseline metrics. The results of Levels 1-3 would be known through proper application of appropriate measures, and Level 4 would then answer the very question Hill appears to be asking of ROI and claiming for IOB.
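To make the Level 2 point concrete: the pre/post comparison I'm describing is nothing exotic. Here's a minimal sketch, with entirely hypothetical scores, of the baseline comparison a Level 2 evaluation reports on:

```python
# Hypothetical pre-test (baseline) and post-test scores for one cohort.
# A Level 2 evaluation compares post-program knowledge to the baseline.
pre_scores = [55, 62, 48, 70, 66]
post_scores = [78, 85, 74, 88, 90]

def mean(xs):
    return sum(xs) / len(xs)

baseline = mean(pre_scores)          # knowledge level before the program
outcome = mean(post_scores)          # knowledge level after the program
learning_gain = outcome - baseline   # the figure a Level 2 report surfaces

print(f"Baseline (pre-test) mean: {baseline:.1f}")
print(f"Post-program mean:        {outcome:.1f}")
print(f"Average learning gain:    {learning_gain:.1f} points")
```

That gain figure is exactly the kind of concrete number an evaluation should put on the table before anyone claims business impact.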
In parts of the article, Hill suggests that ROI only measures quantitative factors such as materials costs, travel and the like. But this is not correct. The ROI methodology certainly takes quantitative items into account in order to arrive at a final answer, but it seeks to understand intangible items as well. Mind you, Hill does not bring intangibles up, but I point this out in order to take issue with the way Hill appears to perceive the process. Hill also seems to treat ROI as simply another level in the evaluation process for a program. This is not untrue - ROI is typically known as Level 5, sometimes Level 6 if Schneiderman's Balanced Scorecard is factored in - but ROI is without question the biggest step, and the one that requires the most planning. ROI as Phillips defines it is a total process, not a taxonomic action item that WLPs must remember to check off. A well-thought-out and well-planned ROI evaluation can span an entire project, from conceptualization well past implementation. It can take this long because, to complete the process, all relevant business metrics must be properly accounted for and sufficient data collected and analyzed. That includes the data points Hill mentions for IOB: learner satisfaction with the training program, learner engagement (realized through the Results, or Level 4, portion of an evaluation), and application to the job once back at the desk or on the floor. If Hill is attempting to point his readers to another possible way, he appears to have missed the mark.
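It's worth noting that the arithmetic at the heart of Phillips' Level 5 is simple; the planning and data collection are what's hard. A quick sketch with hypothetical benefit and cost figures shows the two standard calculations:

```python
# Phillips' two standard Level 5 calculations, using hypothetical figures.
# Costs must be fully loaded (design, delivery, materials, travel,
# participant time); intangibles are reported alongside the ratio,
# not forced into it.
program_benefits = 240_000.0  # monetized business impact (hypothetical)
program_costs = 80_000.0      # fully loaded program costs (hypothetical)

bcr = program_benefits / program_costs                           # benefit-cost ratio
roi_pct = (program_benefits - program_costs) / program_costs * 100  # ROI as a percent

print(f"BCR: {bcr:.2f}")      # benefits returned per dollar spent
print(f"ROI: {roi_pct:.0f}%") # net benefits as a percentage of costs
```

The hard part, of course, is everything upstream of those two lines: isolating the program's effect and converting the business impact to money credibly.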
The inset for the article doesn't help me understand Hill's position either. The case study in the inset describes an extensive sales training program at a pharmaceuticals company putting three major new products on the market. But all the time is spent on the specifics of the program and how well it was implemented and received. IOB isn't mentioned once, let alone any sort of evaluation that confirmed how well the learners actually applied what they were taught. There is mention that they did, but not a single figure is offered as evidence. All I really found was: "The representatives were engaged, and empowered to sell with confidence." That may be so, but this thin account of the result says nothing about IOB, and it is no different from the thinly pursued evaluations that get conducted but never report a single concrete figure to back up the perception of results.
Now, I'm not expecting a dissertation; I can find plenty of those over at ProQuest, and a monthly periodical is surely no place for such detailed work. But as a student of the ROI methodology who is learning just how complex the process can be once organizational complexities enter the picture, I was ready for a lively challenge to ROI and some depth of counter-perspective. I didn't get that from this article, and I need more before I can take Hill's proposal into consideration.