The surveys passed out after training sessions are largely ineffective for evaluating training: they are often too long, poorly written, and ignored by participants, and therefore they don't yield actionable data. Even so, we keep conducting them because it's easy and it's what we know.
If you've ever read the ASTD State of the Industry Report, you know that most training professionals use surveys but hardly ever use other methods to evaluate training. For some reason, we find it reassuring to receive a collection of survey results with solid 5's, even though we suspect the participants just circled the 5's to get out of there faster. It is understandable that many learning and development professionals (and students, too) have become disillusioned with the effectiveness of training surveys.
Yet even if training surveys are the most used tool in the trainer's tool chest for evaluating training, a deeper look shows their results are hardly used at all. For one, the survey questions are generally meaningless, so the scores either confirm what we already believe or cause us to blame the respondent for the low scores they gave. Worst of all, trainers are so busy that they often don't take the time to review survey results or change anything.
In other words, training surveys aren't actionable.
I have begun to consider replacing all of my training surveys with the Net Promoter System (NPS). I won't get into the specific details here (you should study up on it), but basically, NPS asks one simple question on an 11-point scale from 0 to 10:
- Based on your recent experience, how likely are you to recommend this company/product/training course to a friend or colleague?
- There is usually a follow-up question that looks something like this: What is the primary reason for your score?
- I would add a final statement that says: If you would like to be contacted by a training manager about your score, please complete the fields below: name, email, phone number.
That’s it. No questions about the pace of class. No questions about whether the materials were sufficient. No questions about whether a student believed the course would be useful in their job. Just one (well, almost one) simple question: would someone recommend the class?
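Although the scoring details are beyond the scope of this post, it may help to see how that single question becomes a single number. The standard NPS calculation treats 9-10 responses as promoters, 7-8 as passives, and 0-6 as detractors, then subtracts the percentage of detractors from the percentage of promoters. Here is a minimal Python sketch; the function name and sample responses are illustrative, not drawn from any real class:

```python
def net_promoter_score(scores):
    """Compute NPS from a list of 0-10 responses.

    Standard NPS buckets: 9-10 are promoters, 7-8 are passives,
    and 0-6 are detractors. NPS is the percentage of promoters
    minus the percentage of detractors, so it ranges -100 to +100.
    """
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical example: ten post-class responses
responses = [10, 9, 9, 8, 7, 10, 6, 9, 5, 10]
print(f"NPS: {net_promoter_score(responses):+.0f}")  # prints "NPS: +40"
```

Because the result is one number on a -100 to +100 scale, you can track a course's score from session to session instead of eyeballing pages of averages.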
I like NPS for several reasons:
- It's simple and short: A one-question survey will get a higher response rate than a long training survey with 10 to 20 questions spread over two pages.
- There's an opportunity to give a primary reason for the score: The follow-up question lets people provide a specific reason for their rating, which narrows down the points of action.
- Respondents can volunteer to be contacted: People can ask to speak with someone about their score, and usually those who give low scores will. A training manager can then call, discuss, and get to the bottom of an issue that could improve the course.
Companies that are highly regarded in their industries as extremely customer-focused use NPS, and I'm not sure that is much different from what a learning and development department should be working toward. In fact, great learning professionals are customer-focused. Their customers just happen to be the companies where they work and the students in their classrooms.
I believe all learning professionals should consider replacing their training surveys with NPS because it will provide better data and increase the likelihood that action is taken and improvements are made.
- Why should we treat participants in our training classes any differently than companies treat customers?
- What is the value in asking questions about how people liked the training materials, the pace of a course, or whether they believe a trainer was knowledgeable?
- Doesn't someone's willingness to recommend something say more about how good it is than anything else?