Online learning: if they tune in and drop out, focus on the persistent learners
The new generation of online courses is still in its infancy, so course developers are still working out how to judge the effectiveness of their content and how much learners engage with it.
Universities that run massive online open courses (MOOCs) (previous thoughts on these here) have their own priorities in this respect, one of which is to reach as many people as possible all over the world. But it’s still interesting to see how two of the largest open course providers are assessing their courses. The seventeen reports published by MIT and Harvard last week are fascinating reading. They may provide pointers to smaller organisations figuring out how to evaluate the success of their courses.
What MIT and Harvard are interested in
Naturally, the first thing MIT and Harvard focus on is registration numbers, and they're fortunate enough to have very large data sets: over the year, 597,692 unique registrants accounted for 841,687 course enrolments. Enrolments ranged from 181,000 for the Computer Science course down to 12,000 for the Chemical Structures course. We'll come back to why the courses were so instantly popular.
The next item of interest is learner persistence and engagement throughout the course. MOOCs have been criticised because huge numbers sign up with good intentions, but their participation dwindles as the weeks go by. But how should we be judging this: should we say it’s terrible that 24,493 registered for the Introduction to Chemistry course but 91.6% didn’t complete it? Or should we note that an amazing 2,052 persisted through to the end and received certification to prove it? It all depends on your expectations.
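As a back-of-the-envelope check, the two ways of reading the same figures can be reproduced with a small calculation (the numbers are those quoted above; the function name is ours, not from the reports):

```python
def completion_stats(registered, certified):
    """Return (percent completed, percent not completed) for a course."""
    completed = certified / registered * 100
    return round(completed, 1), round(100 - completed, 1)

# Figures quoted above for Introduction to Chemistry
done, not_done = completion_stats(24_493, 2_052)
print(done, not_done)  # 8.4 91.6 -- same data, framed as success or failure
```

The same pair of numbers supports both headlines, which is exactly why MIT and Harvard caution against reading certification rates in isolation.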
MIT and Harvard certainly argue the latter, particularly since they don’t regard certification and official completion as the only purpose of their courses. “Large numbers of non-certified registrants access substantial amounts of course content”. For this, and other reasons, “Course certification rates are misleading and counterproductive indicators of the impact and potential of open online courses.” Interestingly, around half of all participants who drop out do so in the first week of the course. Students who carry on after the first week stand a good chance of persisting through to the end.
Harvard and MIT are also interested in ‘events’, which they define as ‘interactions indicating when students played videos, attempted problems, browsed through text pages, read or posted entries in the forum, and other activities’. They recorded over 25 million events. Naturally, participants who later earned certificates were very active, most creating at least 1,000 events, and (estimating from the small chart in the report) on average interacting 10,000 times. But many participants who didn’t bother with the certificate also maintained their activity throughout.
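The 'events' metric lends itself to simple segmentation, for instance bucketing participants by total activity. A minimal sketch of the idea (the thresholds and sample counts are invented for illustration; only the 1,000-event figure for certificate earners comes from the reports):

```python
from collections import Counter

def activity_band(event_count):
    """Bucket a participant by total interaction events.

    The 1,000-event threshold echoes the reports' observation about
    certificate earners; the lower threshold is purely illustrative.
    """
    if event_count >= 1_000:
        return "highly active"
    if event_count >= 100:
        return "engaged"
    return "sampled"

# Hypothetical event totals for a handful of participants
events = [12_000, 9_500, 340, 45, 2, 1_100]
print(Counter(activity_band(e) for e in events))
# Counter({'highly active': 3, 'sampled': 2, 'engaged': 1})
```

Segmenting like this, rather than counting certificates, is one way a smaller course provider could capture the non-certified-but-active learners the reports describe.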
Only tentative lessons can be drawn from the MIT and Harvard data, but here are some thoughts:
- An extra effort can be made to keep learners engaged in the early stages of a course, helping them clear the hurdle of dropping out shortly after starting. Your course quality could be very high, but learners might still need help orientating themselves in an unfamiliar subject. Sunk costs may also play a part: once a learner has invested a certain amount of time in a course, they may be more motivated to carry on.
- Sign-ups in the weeks before a course starts say something about how well you're publicising it. In the case of MIT and Harvard, however, the extraordinary figures may tell us more about the novelty of the format and how news of the courses went viral. The strong reputation of the two institutions meant that social media was ablaze when the courses were announced, and registrations exploded shortly afterwards thanks to a segment on The Colbert Report, an American satire programme.
- Engagement during the course may say something about the quality or interest of the content. The Computer Science course, for example, which saw a large number of participants not certified, nevertheless continued to attract a considerable amount of exploration as the weeks progressed. This is more evidence that certification is not necessarily an indicator of utility.
- Here at Coracle, we thought the participation of learners in the Challenges of Global Poverty course was pretty interesting. The report notes that "There emerged about 15-20 very active users who posted hundreds of comments, many of which were high quality and frequent. By promoting some of these users to become community TAs, the course was able to build effective forums and shape the community discussion." The idea of social learning is at the heart of what we do, so it's good to see how other organisations are starting to take the same approach.
All in all, MIT and Harvard have provided an interesting set of reports. The challenge for any course provider now is to build on the basic metrics they discuss, identifying learner segments and adapting courses to the behaviour of individuals.