Assessing the effectiveness of your training
Last week, we looked at some of the theory of evaluating your training programmes and learning courses.
This week, we'll examine the practical ways to dig into the first level of evaluation – getting feedback from online course participants.
Surveys and questionnaires
Participants often know better than anybody how effective a programme or course is, so the obvious way to find out is to ask them. Generally speaking, they’ll have an idea of what they hoped to learn, where their weaknesses lie, and whether they’ve improved thanks to your content. The classic method is a questionnaire, completed as soon as possible after a programme ends. It can ask about anything from overall satisfaction to engagement, presentation and design. Choose a small number of key aspects of your online content and focus on those rather than trying to cover every possibility. Later, you can vary the questions to take account of improvements and focus on something else.
Questionnaires are great, but once a programme is online, the scope for asking questions expands dramatically. Earlier this year, we started rolling out comments and Amazon-style grading to every page of some of our courses. Users don’t have to comment on or grade the content, but when they do, the feedback is extremely useful because you catch them at the moment a relevant insight occurs. Think of all the times you’ve dismissed or ignored surveys, however well-intentioned, because you simply didn’t have time or were already thinking about the next thing. The research body JISC (once known as the Joint Information Systems Committee) suggests another way to canvass opinion, which it calls “interview plus”, the “plus” being an activity designed to jog the memory: “The ‘plus’ represents some artefact or activity chosen to guide recall or aid thinking aloud during the interview (e.g. diary, observation plus thinking aloud, learner progress files, student work, tracking data from a virtual learning environment)”.
Pre- and post-course assessment
One practical method of assessing learning impact is to test participants before a course and again afterwards. This works best for compact courses with clear-cut answers. The difference between the pre- and post-course results is a measure of the impact your programme has had. A pre-assessment can also help get participants into the right frame of mind. However, keep in mind that learners won’t appreciate being tested in depth on a topic they have never studied – that will rightly seem a waste of their time.
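The arithmetic is simple enough to sketch in a few lines of Python. The learner names and scores below are invented purely for illustration – the point is that the per-learner gain, not either raw score, is the measure of impact:

```python
# Illustrative pre- and post-course scores (out of 100) for three learners.
pre_scores = {"learner_a": 45, "learner_b": 60, "learner_c": 30}
post_scores = {"learner_a": 70, "learner_b": 72, "learner_c": 65}

# The gain for each learner is post minus pre.
gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}

# The average gain gives a single headline figure for the cohort.
average_gain = sum(gains.values()) / len(gains)

print(gains)         # per-learner improvement
print(average_gain)  # cohort-level impact
```

Even this toy example shows why the pairing matters: learner_b finished with the highest post-course score but improved the least, which a post-course test alone would have hidden.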
Bear in mind that learners are increasingly getting their information from different sources, and you may not be tracking what you think you’re tracking. This applies particularly to blended learning, which may take place online, in the classroom, in the field or through reading. The Tin Can API (now officially called the Experience API, or xAPI) makes it possible to capture this sort of fragmented learning and turn it into useful data. It is this API that forms part of the ingredient mix of Coracle’s Learning Line.
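For a flavour of how the Experience API records fragmented learning, here is a minimal xAPI statement built as a Python dictionary – an “actor, verb, object” triple. The learner email, course URL and course name are made-up examples; the verb identifier is a standard one from the ADL vocabulary:

```python
import json

# A minimal xAPI statement: who (actor) did what (verb) to what (object).
# The email address and course details are hypothetical examples.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/navigation-101",
        "definition": {"name": {"en-US": "Navigation 101"}},
    },
}

# Statements are serialised as JSON and sent to a Learning Record Store.
print(json.dumps(statement, indent=2))
```

Because every experience – a classroom session, a page read, a field exercise – can be reduced to the same statement shape, a Learning Record Store can aggregate activity from many sources into one coherent record.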
A few additional thoughts
You can get far more meaning out of your evaluation by considering change over time. Make one change to the programme at a time and analyse the change in response – do users rate the content better, the same, or worse?
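This kind of before-and-after comparison needs nothing more sophisticated than a mean rating per version. A minimal sketch, with invented ratings on a 1–5 scale:

```python
# Page ratings (1-5) collected before and after a single content change.
# These lists are illustrative only.
ratings_before = [3, 4, 2, 5, 3]
ratings_after = [4, 4, 5, 5, 3]

mean_before = sum(ratings_before) / len(ratings_before)
mean_after = sum(ratings_after) / len(ratings_after)

print(f"before: {mean_before:.2f}, after: {mean_after:.2f}")
```

Changing one thing at a time is what makes the comparison meaningful: if several aspects of a page change at once, a shift in the average rating can’t be attributed to any of them.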
There will always be a temptation to coax out the result you want by guiding survey respondents towards reporting satisfaction – through leading questions, or by limiting answer choices unrealistically. There are useful quick tips on formulating questions here and a more thorough analysis here.
Don’t forget to consider the ethics of asking questions. It’s important to respect the individual participant: learning is in some ways private, and users who are struggling don’t necessarily want an analysis of why they “just didn’t get it”. As a general rule, anonymous questionnaires will yield more honest opinions.
The nitty-gritty – details to look out for
In the fourth and final article on evaluating your online training and learning, we’ll run through some of the things you will want to consider before hitting the ‘publish’ button. We’ll also provide a couple of templates that cover everything in this series and wrap it up with some thoughts on what you can do with the results of your evaluation.