Stratety™

Stratety™ is our patent-pending Training Audit and Execution Platform

 

Is your internal training program effective? Have your trainees learned what they need to succeed? Have you selected the right trainer? Have you developed the best training program and selected the optimal way to deliver it? If you’ve outsourced the training, have you selected the right training company? What is the final return on your training investment? Most importantly, have your learning initiatives contributed meaningfully to the execution of your organization’s strategy?

 

There are perhaps no more important or controversial questions in the multi-billion-dollar corporate training industry than those outlined above. As HR professionals around the world can attest, these questions are also among the most difficult and time-consuming to answer with anything like precision.

 

Stratety™ is our innovative and cost-effective solution to this need in the market. Our proprietary, patent-pending platform will help you respond to critical return on investment questions with world-class rigor and objectivity, at an affordable price. With Stratety™, we guarantee your satisfaction or we don’t get paid.* It’s that simple and that effective.

 

Stratety™ is a complete, end-to-end suite of services to help you execute your training strategy and measure return on investment (ROI).

 

How can we offer complete, highly individualized evaluations at an affordable price? Answer: economies of scale and internal systems efficiency. We are a lean management organization whose entire business model is built on Lean and Six Sigma principles. We deliver our high-quality services using the leanest models and procedures on the planet. We pride ourselves on our world-class internal collaboration software, which allows us to bring experts from around the world onto our framework and into your project at a fraction of what it costs our competitors to deliver comparable quality.

 

The conceptual framework and research underpinning the Stratety™ platform:

 

The first person to elaborate a methodology for evaluating the impact of training on performance was Donald Kirkpatrick, Professor Emeritus at the University of Wisconsin. He introduced his model in a series of academic articles beginning in 1959, and later in his book Evaluating Training Programs, published in 1994. The 1994 book greatly increased awareness of his model throughout academia and the business world. Today, the Kirkpatrick framework is the most widely used and discussed model for evaluating adult learning in the workplace. Though a great deal of new research has appeared in recent years, Kirkpatrick's model remains seminal and is now standard vocabulary in HR and learning and development circles around the globe. The four levels of the Kirkpatrick model are as follows:

 

1. Reaction – how participants reacted to and felt about the training received.

2. Learning – the knowledge gained, skills developed and/or attitudes changed as a result of the training.

3. Behavior – the transfer of Level Two learning from the classroom to the workplace. Evaluation at this level typically takes place 3–6 months after the training has been completed, usually through observation.

4. Results – the final results obtained because of participation in the training program (financial and other performance-based metrics).
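As a rough illustration, an evaluation against the four levels can be captured as a simple per-program record. This is a hypothetical sketch only; the field names, scales and example scores below are our assumptions, not part of the Kirkpatrick model or the Stratety™ platform:

```python
from dataclasses import dataclass

# Hypothetical record holding evidence gathered at each of Kirkpatrick's
# four levels for one training program. Scales are illustrative assumptions.

@dataclass
class KirkpatrickEvaluation:
    program: str
    reaction: float   # Level 1: average participant satisfaction (1-5 survey)
    learning: float   # Level 2: average pre/post assessment gain (points)
    behavior: float   # Level 3: share of trained behaviors observed on the job
    results: float    # Level 4: change in a chosen business metric (%)

    def summary(self) -> str:
        return (f"{self.program}: reaction={self.reaction}, "
                f"learning={self.learning}, behavior={self.behavior}, "
                f"results={self.results}")

evaluation = KirkpatrickEvaluation(
    program="New-hire sales onboarding",
    reaction=4.2,    # post-session survey average
    learning=18.0,   # assessment gain in percentage points
    behavior=0.7,    # observed 3-6 months after training
    results=5.5,     # movement in the business metric tied to the program
)
print(evaluation.summary())
```

Keeping the four levels together in one record is what makes later comparisons between programs straightforward.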

 

By evaluating training sessions or programs against Kirkpatrick's four levels, it is possible to gain a significant and measurable indication of how effective the training was, and whether or not it can be improved in the future. The problem, as busy professionals around the world are all too aware, is that the model is neither practical nor cheap to implement. Using it to measure learning initiatives is time-consuming and can demand resources that are hard to justify. This has all changed with the arrival of the Stratety™ training evaluation platform. We take care of all the planning and execution, so you can take care of more important things. When we have the results, you simply review them and make strategic decisions that are aligned with the overall objectives of your organization.

 

Since Kirkpatrick's original model, other writers have suggested that a fifth level – return on investment – and even a sixth level of evaluation be added to conceptual frameworks of training evaluation. Kirkpatrick touched on ROI in his Level Four, which attempts to link training results to business results. Over time, however, the need to measure the concrete, financial return on training investment became so important to many organizations that Dr. Jack Phillips added a fifth level, ROI, designed to quantify performance improvements, measure financial benefits and compute precise investment returns. Phillips's fifth level helps organizations make informed decisions based on quantitative data on returns and on return comparisons between learning programs.

Some investigators consider this fifth level problematic because it relies on hypothetical factors and focuses on measurement after the event. If careful precautionary measures are not taken, it is argued, this after-the-fact approach risks failing to quantify correctly the added value targeted at the start of the training. This is one area where the pre-evaluation phase of the Stratety™ platform can add great value for our customers: we measure from the beginning.

 

Phillips's contribution to the Kirkpatrick framework – in combination with Stratety™ – allows stakeholders to compare the final return on training investment at previously unattainable low cost.
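For readers who want the underlying arithmetic, Phillips's Level Five rests on two standard figures: the benefit-cost ratio (program benefits divided by program costs) and the ROI percentage (net benefits divided by costs, times 100). The monetary amounts below are hypothetical examples, not client data:

```python
# Phillips-style Level Five arithmetic: benefit-cost ratio (BCR) and
# ROI percentage. The dollar figures are hypothetical.

def benefit_cost_ratio(program_benefits: float, program_costs: float) -> float:
    """BCR = program benefits / program costs."""
    return program_benefits / program_costs

def roi_percent(program_benefits: float, program_costs: float) -> float:
    """ROI (%) = (program benefits - program costs) / program costs * 100."""
    return (program_benefits - program_costs) / program_costs * 100

# Example: a program costing $50,000 that yields $120,000 in measured benefits.
benefits, costs = 120_000.0, 50_000.0
print(benefit_cost_ratio(benefits, costs))  # 2.4
print(roi_percent(benefits, costs))         # 140.0
```

Note that the hard part in practice is not this division but isolating and monetizing the benefits attributable to the training – which is exactly where measurement from the start pays off.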

More recently, several investigators have built on the foundations of Kirkpatrick's four levels. Robert O. Brinkerhoff's 2006 book, Telling Training's Story, focuses on the "Success Case Method" of evaluating the effectiveness of training. Brinkerhoff's method surveys and analyzes the experience of learning participants against a number of important factors, including the effectiveness of the training function, the role and contribution of line managers, and the working of systems and processes. Brinkerhoff's model is complex but, in combination with the Stratety™ platform, it can add great value as an audit of any existing training program – whether the program is delivered by Educapro or any other vendor. In addition, with Stratety™ reports it is much easier to develop effective action plans to rectify training and/or targets, both of which are key elements of the Brinkerhoff approach.

[Figure: Educapro Levels of Evaluation]

Another conceptual framework worth mentioning before closing is the Kearns model of evidence-based learning evaluation, named after Paul Kearns, who describes himself as an evidence-based HR professional and lectures widely on the subject. We are confident that you will not find a more cost-effective platform than Stratety™ for implementing the ideas of Paul Kearns in your organization.

 

Conclusion

 

Although a great deal of research has taken place since Kirkpatrick's original four-level framework appeared, it remains the core framework for evaluating the effectiveness of training and is still the model most widely adopted by learning and development professionals today. At Educapro, we focus on value creation rather than measuring for the sake of measuring. With the Stratety™ platform, we've made rigorous, highly personalized, qualitative training evaluations accessible and affordable for today's demanding and price-sensitive business climate. With Educapro and Stratety™, your organization will be able to improve processes and events that in the past were deemed too difficult or too expensive to measure. We think it's time to move evaluation from "something I know I should do" and "something I wish I had the time and resources for" to something that is accessible, rigorous and effective.

 

Does any of the above strike a chord with you? If you are frustrated with less-than-satisfactory training results, contact us now to arrange a videoconference with one of our representatives. One call will bring you one step closer to the training evaluation solution you are looking for.

* Conditions apply

 

Further reading:

Alvarez, K., Salas, E. and Garofano, C. M. (2004), "An integrated model of training evaluation and effectiveness", Human Resource Development Review, vol. 3, no. 4, pp. 385-416.
Baldwin, T. T. and Ford, J. K. (1988), "Transfer of training: A review and directions for future research", Personnel Psychology, vol. 41, no. 1, pp. 63-103.
Berk, J. (2008), "The manager's responsibility for employee learning", Chief Learning Officer, vol. 7, no. 7, pp. 46-48.
Brinkerhoff, R. O. and Gill, S. J. (1994), The Learning Alliance: Systems Thinking in Human Resource Development, Jossey-Bass, San Francisco, CA.
Broad, M. L. (2005), Beyond Transfer of Training: Engaging Systems to Improve Performance, Jossey-Bass, San Francisco, CA.
Broad, M. L. and Newstrom, J. W. (1992), Transfer of Training: Action-Packed Strategies to Ensure High Payoff from Training Investments, Addison-Wesley, MA.
Burke, L. A. and Hutchins, H. M. (2007), "Training transfer: An integrative literature review", Human Resource Development Review, vol. 6, no. 3, pp. 263-296.
Burke, L. A. and Hutchins, H. M. (2008), "A study of best practices in training transfer and proposed model of transfer", Human Resource Development Quarterly, vol. 19, no. 2, pp. 107-128.
Cheng, E. W. L. and Ho, D. C. K. (2001), "A review of transfer of training studies in the past decade", Personnel Review, vol. 30, pp. 102-118.
Fitzpatrick, R. (2001), "The strange case of the transfer of training estimate", The Industrial-Organizational Psychologist, vol. 39, no. 2, pp. 18-19.
Ford, J. K. and Weissbein, D. A. (1997), "Transfer of training: An updated review and analysis", Performance Improvement Quarterly, vol. 10, no. 2, pp. 22-41.
Foxon, M. (1993), "A process approach to the transfer of training", Australian Journal of Educational Technology, vol. 9, no. 2, pp. 130-143.
Georgenson, D. L. (1982), "The problem of transfer calls for partnership", Training and Development Journal, vol. 36, pp. 75-78.
Georges, J. C. (1988), "Why soft skills training doesn't take", Training and Development, vol. 25, no. 4, pp. 42-47.
Grabowski, S. M. (1983), "How educators and trainers can ensure on-the-job performance: strengthening connections between educators and performance", New Directions for Continuing Education, vol. 18, pp. 5-10.
Newstrom, J. W. (1986), "Leveraging management development through the management of transfer", Journal of Management Development, vol. 5, no. 5, pp. 33-45.
Saks, A. M. (2002), "So what is a good transfer of training estimate? A reply to Kirkpatrick", The Industrial-Organizational Psychologist, vol. 39, pp. 29-30.
Saks, A. M. and Belcourt, M. (2006), "An investigation of training activities and transfer of training in organizations", Human Resource Management, vol. 45, no. 4, pp. 629-648.
Salas, E., Cannon-Bowers, J. A., Rhodenizer, L. and Bowers, C. A. (1999), "Training in organizations: Myths, misconceptions and mistaken assumptions", in Ferris, G. (Ed.), Research in Personnel and Human Resources Management, vol. 17, CT, pp. 123-161.
Tannenbaum, S. (2002), "A strategic view of organizational training and learning", in Kraiger, K. (Ed.), Creating, Implementing and Managing Effective Training and Development, Jossey-Bass, San Francisco, CA, pp. 10-52.
Velada, R. and Caetano, A. (2007), "Training transfer: the mediating role of perception of learning", Journal of European Industrial Training, vol. 31, no. 4, pp. 283-296.
Wexley, K. N. and Latham, G. P. (2002), Developing and Training Human Resources in Organizations, Prentice Hall, Upper Saddle River, NJ.