How the Pioneers of the MOOC Got It Wrong

  • 2017-03-30
  • Spectrum

In 2011, when Stanford computer scientists Sebastian Thrun and Peter Norvig came up with the bright idea of streaming their artificial-intelligence lectures over the Internet, they knew it was an inventive departure from the usual college course. For hundreds of years, professors had lectured to groups of no more than a few hundred students. But MOOCs—massive open online courses—made it possible to reach many thousands at once. Through the extraordinary reach of the Internet, learners could log on to lectures streamed to wherever they happened to be. To date, about 58 million people have signed up for a MOOC.

Familiar with the technical elements required for a MOOC—video streaming, IT infrastructure, the Internet—MOOC developers put code together to send their lectures into cyberspace. When more than 160,000 students enrolled in Thrun and Norvig's introductory artificial-intelligence MOOC, the professors thought they held a tiger by the tail. Not long after, Thrun cofounded Udacity to commercialize MOOCs. He predicted that in 50 years, streaming lectures would so subvert face-to-face education that only 10 higher-education institutions would remain. Our quaint campuses would become obsolete, replaced by star faculty streaming lectures to computer screens all over the world. Thrun and other MOOC evangelists imagined they had inspired a revolution, overthrowing a thousand years of classroom teaching.

These MOOC pioneers were therefore stunned when their online courses performed nowhere near as well as they had expected. At first, the average completion rate for MOOCs was less than 7 percent. Completion rates have since risen somewhat, to a median of about 12.6 percent, although there's considerable variation from course to course. While a number of factors contribute to the completion rate, my own observation is that students who pay a fee to enroll tend to be more committed to finishing the course.