Image of a globe flanked by the text 'Resources for Recruitment and Retention, Support in the Workplace' and wrapped in a banner that says 'Plan It.'

Training Intervention Strategies

Levels of Evaluation

Evaluation is often looked at in terms of four different levels, known as the "Kirkpatrick levels," which are listed below. In general, the complexity of evaluation increases at each successive level.

1. Level 1. Reaction

What does the learner feel about the training?
Gathering information to answer this question is relatively easy, since it can be done immediately following training and usually does not require the careful thought needed for higher levels of evaluation.

Strategies. Common strategies to gauge reaction include having a discussion with participants (e.g., as a whole or in small groups) or asking them to fill out a brief survey.

Sample questions. Sample questions in an open format survey that collects qualitative information include:        
  • Would you recommend this course to others? Why or why not?
  • What were your expectations coming into the course? Were they met?
  • What was the most useful aspect of the training? Least useful?
  • Do you feel your time was well spent in this training? Why or why not?
You can also ask questions in a format that can be quantified. For example, make a series of positive statements (see below) and ask respondents to fill in one of five bullets to the right of each statement, labeled "strongly agree," "agree," "neither agree nor disagree," "disagree," or "strongly disagree." These responses form a Likert scale, which is commonly used to measure attitudes. The University of Connecticut's Neag Center for Gifted Education and Talent Development has a tool that helps you create a Likert scale, along with examples of ways to capture responses using this type of scale. A brief sketch of how such responses can be tallied appears after the sample statements below.

Sample statements participants can be asked using a Likert scale include:
  • I would recommend this course to others.
  • The training met my expectations.
  • My time was well spent in this training. 
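
To quantify responses such as these, each label can be mapped to a number (for example, 1 through 5) and then summarized. The short Python sketch below illustrates one way to tally responses for a single statement; the labels, the 1-to-5 scoring, and the sample responses are illustrative assumptions rather than part of any particular survey tool.

# Illustrative sketch: tallying Likert-scale responses for one statement.
# The labels, 1-5 scoring, and sample responses are assumptions made for
# demonstration, not part of any specific survey instrument.
from collections import Counter

LIKERT_SCORES = {
    "strongly agree": 5,
    "agree": 4,
    "neither agree nor disagree": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

def summarize_responses(responses):
    """Return response counts and the mean score for one statement."""
    counts = Counter(r.lower() for r in responses)
    scores = [LIKERT_SCORES[r.lower()] for r in responses]
    mean_score = sum(scores) / len(scores) if scores else None
    return counts, mean_score

# Hypothetical responses to "The training met my expectations."
counts, mean_score = summarize_responses(
    ["agree", "strongly agree", "agree", "neither agree nor disagree"]
)
print(counts)      # Counter({'agree': 2, 'strongly agree': 1, 'neither agree nor disagree': 1})
print(mean_score)  # 4.0

Reporting the distribution of responses alongside the mean helps show whether opinions were broadly shared or split.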

2. Level 2. Learning

How did learners’ knowledge, skills, or attitudes (KSAs) change as a result of this training? 
The second level of evaluation measures whether the learning objectives of the training were achieved, as assessed after the training.

Strategies. Pre- and post-training questions provide detailed feedback on the training. Ideally, you should ask the same questions before and after training so that you can measure the impact of training more accurately. In practice, to save time, many training sessions simply ask for a single self-report at the end of training. For instance, a common approach is to list each objective and then ask each participant to rate how well he or she could perform that objective before and after training. This one-time assessment is less accurate than using a pretest and post-test.
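
As a simple illustration of the pretest/post-test comparison just described, the Python sketch below computes the change in average rating for each learning objective. The objective names and scores are hypothetical.

# Illustrative sketch: comparing average ratings per learning objective
# before and after training. Objectives and scores are hypothetical.
pretest = {
    "demonstrate the counseling technique": 2.2,
    "describe when the technique is appropriate": 2.8,
}
posttest = {
    "demonstrate the counseling technique": 4.1,
    "describe when the technique is appropriate": 4.4,
}

for objective, before in pretest.items():
    after = posttest[objective]
    print(f"{objective}: {before:.1f} -> {after:.1f} (change {after - before:+.1f})")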
 
In-class activities offer instructors a chance to see whether learners can apply new material within the classroom environment. Activities should resemble real-life applications as closely as possible. For example, after demonstrating a counseling technique for the class, the instructor can ask class members to pair off and role play a situation in which the technique would be used. Role playing allows the instructor to get a general idea of how well the class understands the material simply by walking around the classroom and listening.

3. Level 3. Transfer of Skills to the Workplace

What skills did the learner develop? What new information is the learner using on the job to perform his or her job functions?
While a Level 2 Evaluation can measure how well the learner "got" the content by the end of training, we know that our memories are designed for efficiency. If participants cannot use what they have learned within a short period, the information will be lost. Examples of training that is likely to waste dollars include:
  • Training employees to perform new job functions months before they are required to perform them, or after they have been on the job long enough to develop their own ways of doing things.
  • Training evaluators to use data management software 6 months before it is installed on their computers.
  • Training providers to comply with new regulations months before they go into effect, or when they are likely to change.
We also know training is ineffective when there are barriers to using the concepts presented in the trainee's "real world." For example:
  • Training staff to use a best practice that cannot be implemented as taught because of funding restrictions, population characteristics, State law, or other reasons.
  • Training staff to provide integrated care for co-occurring disorders when, in fact, licensing, funding restrictions, or agency standard operating procedures do not allow the level of integration between mental health and substance abuse counseling envisioned in the training.
  • Training staff to implement a practice their supervisors do not support.
A Level 3 Evaluation is designed to determine whether there are barriers to implementation that were not anticipated at the design stage. If a component is not being implemented, an evaluation should try to collect data on the reason for this. Possible explanations may include:
  • Learner did not understand skill well enough to use it;
  • Purpose of the new procedure was unclear;
  • New skill was not encouraged by a supervisor or conflicted with agency procedures;
  • Trainee’s own attitudes and beliefs prevented him or her from using the procedure and perceiving its relevance.
Strategies. To determine whether or not learners have been able to apply skills from training on the job, you have five basic choices. You can use any one of these approaches, or all of them in combination:
  • Training participant report. Ask the learner how many times he or she has used a particular skill in a given period (e.g., related questions may include what difficulties he or she encountered in trying to use the skill, whether desired outcomes were achieved, and what additional training or assistance would help him or her improve);
  • Teammate or supervisor report. Ask someone who works with the learner to report on his or her apparent mastery of the skill (e.g., the supervisor);
  • Client report. When skills are used in counseling sessions or other work with the trainees’ clients, it is often possible to design surveys or focus groups that allow clients to give feedback on their experience of these skills;
  • Observer report. Have an outside evaluator or expert observe learners using skills; or
  • Outcome or product review. Assess appropriate outcomes or products that enable you to determine whether or not the skill has been used successfully.
Sample Questions. A provider has trained addiction counselors to contact and involve Concerned Significant Others (CSOs) in addiction treatment, in the hope of enhancing support for clients as they seek to change patterns of addiction. Examples of questions to pursue with each of the five approaches described above include:
  • Training participant report. For each CSO engaged for at least one treatment session, please describe any barriers to successful engagement of the CSO. What support did the CSO agree to give your client? Did this support occur?
  • Teammate or supervisor report. What level of proficiency do you believe (name) has achieved in CSO engagement?
  • Client report. Did your counselor explore the possibility of involving a CSO? Did he or she explain the potential benefits of this approach? Was your CSO approached? Did the CSO meet with you and the counselor, either by phone or in person? Did you receive any support from your CSO that you would not have received otherwise?
  • Observer report. Did the counselor explain the benefits of involving a CSO? Did the counselor suggest alternative ways to approach the CSO to maximize the likelihood of receiving assistance? Did the counselor offer the CSO the option of a telephone conference if he or she was unable to attend in person?
  • Outcome or product review. How many CSOs agreed to attend at least one session? Which of the following kinds of support did each CSO supply, in the opinion of the client? (List supports.)

4. Level 4. Results or Effectiveness

What results occurred? Did the learner apply new skills to necessary tasks in the organization? If so, what results were achieved?
Training should ultimately result in benefits to an organization that can be assessed, though some benefits are easier to describe and measure than others. As training is designed, the anticipated benefits should be stated clearly so that the training can be prepared with the intended results in focus. Ideally, a clear baseline should be established beforehand so that the impact of training can be determined afterwards.

Strategies. Indicators of changes in performance may be gauged using "hard data" (i.e., changes that can be quantified) or "soft data" (i.e., changes that are qualitative in nature and depend on the perceptions of those involved). Both may be used together. Examples of these types of data are listed below, followed by a brief sketch of comparing a hard-data indicator against its baseline:
  • Hard data. Examples include changes in a regularly used customer satisfaction index, the number of clients screened for co-occurring disorders, the number of accidents at work, the number of counselors remaining with the agency for a stated period of time, the number of clients reached for follow-up surveys, the number of staff who receive a certain credential, or the time required to respond to telephone requests for appointments.
  • Soft data. Examples include reported changes in job satisfaction, greater fidelity in use of an evidence-based practice, improved skills using a counseling technique, or greater use of interactive training strategies instead of lecture.
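
The Python sketch below shows one way to compare a hard-data indicator against its baseline, using the number of clients screened for co-occurring disorders each month as an example. The indicator and the monthly counts are hypothetical.

# Illustrative sketch: comparing a hard-data indicator before and after
# training. The indicator and the monthly counts are hypothetical.
baseline_screened = [14, 16, 15]        # clients screened per month before training
post_training_screened = [22, 25, 24]   # clients screened per month after training

baseline_avg = sum(baseline_screened) / len(baseline_screened)
post_avg = sum(post_training_screened) / len(post_training_screened)
pct_change = (post_avg - baseline_avg) / baseline_avg * 100

print(f"Baseline average: {baseline_avg:.1f} clients/month")
print(f"Post-training average: {post_avg:.1f} clients/month")
print(f"Change: {pct_change:+.1f}%")
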
Sample Questions. An addiction treatment and mental health provider gives staff training designed to improve referrals and information sharing for clients who have co-occurring disorders. Examples of the types of data agencies might use include:
  • Hard data. How many referrals did you make? What types of information were shared? How many times have you called a counselor in another agency to follow up on a referral? What was the result of the call?
  • Soft data. What coordination mechanisms have been developed to improve referrals and information sharing? Are they being used? Are they resulting in more clients with co-occurring disorders receiving treatment for both disorders? 
Tracking Long-Term Impact. It takes time to see if training results in actual changes in behavior that benefit the organization. Trainees may be enthusiastic immediately following training, especially if the instructor is dynamic, but find on returning to work that they lack the incentive or understanding of the new skill needed to apply it. Consider any of the following strategies:
  • Send post-training surveys to training participants. Ask them to indicate how frequently they have used various skills addressed in the training and whether they felt they were successful.
  • Offer ongoing, sequenced training and coaching over a period of time, while coordinating with managers to ensure they are encouraging and supporting use of new skills. Ask for managers’ perspective periodically and help remove any organizational barriers to progress.
  • Conduct a follow-up needs assessment to determine whether training needs have changed.
  • Check metrics (e.g., errors, numbers of specific actions) to determine if participants are using new behaviors. 
  • Interview trainees and their managers, or their clients, on how new skills are being used and how the impact of these skills is perceived.  

Additional Reading on Training Evaluation

Kirkpatrick, D.L. (1994). Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-Koehler.
 
San Diego State University's Department of Educational Technology offers a helpful exposition of the four levels in its "Encyclopedia of Educational Technology."
 
Lockee, B., Moore, M., and Burton, J. (2002). Measuring Success: Evaluation Strategies for Distance Education. Educause Quarterly, (1).
