
Drowning in Data: The Biggest Hurdle for MOOC Proliferation

At a workshop, we focused on the need for a more reliable process for assessing the effects of various aspects of MOOCs. The power of our current analytics suffers from the absence of a systematic approach, and much of the information we have to draw on exists solely in anecdotal form. For example, teachers who have taught a course in a traditional classroom style for years try implementing a MOOC and find that their students perform better, or worse, with the new format. It is difficult to compare such anecdotes, however, because of the large number of variables involved.

Generating Meaningful Analytics

If we are to truly understand the impact that MOOCs can have on learning, and how to optimize education through MOOC and traditional classroom strategies, we will need to take a more scientific approach: comparing the effects of courses that differ along just one variable. For example, if an American History teacher who has always taught through in-class lectures designs a MOOC that includes online lectures, videos, historical interviews, and interactive discussions, it is impossible to identify the specific cause of any change in student performance. If the MOOC strategy proved more successful, the improvement could have resulted from any one, or any combination, of the changes made to the original course.

At our workshop, Zhongzhou Chen described ways to more methodically determine which factors of MOOCs improve learning. In his session, titled “Researching for better online instructional methods using AB testing,” he described the A/B testing employed by his team at MIT. This technique allows for better experimental control over MOOC studies because the “A” and “B” versions of a course differ in only one respect, so any difference in engagement or performance can be directly attributed to the modified factor.
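
By way of illustration, here is a minimal sketch of how such an assignment might work in practice. The function name and hashing scheme are generic assumptions for illustration, not Chen's actual setup.

```python
import hashlib

def assign_variant(student_id: str, experiment: str) -> str:
    """Deterministically assign a student to variant "A" or "B".

    Hashing the experiment name together with the student ID keeps each
    student's assignment stable across sessions, while different
    experiments split the population independently.
    """
    digest = hashlib.sha256(f"{experiment}:{student_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("student-42", "interactive-video"))  # "A" or "B"
```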

A/B Testing

Chen explained a few of the A/B experiments his team is currently running to determine the impact of interactivity on performance. For example, would students perform better in a physics course when they are given the values of the variables needed to solve equations (“A”), or when an interactive video walks them through measuring those values themselves (“B”)? One could argue that the interactivity should provoke more participation. However, it could also be argued that students will be less motivated if extra effort is required, and will thus participate less. Large-scale studies of interactivity could help identify whether it has any positive impact on student performance, so that educators can determine whether the cost of creating interactive videos is justified.
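
Once both versions of the course have run, the comparison itself can be straightforward. Below is a minimal sketch, assuming a final-exam score for each student in each group; the numbers are invented and Welch's t-test is a generic choice, not necessarily the analysis Chen's team uses.

```python
from scipy import stats

# Invented exam scores for the two course versions.
scores_a = [72, 85, 78, 90, 66, 81, 74, 88]   # "A": variable values provided
scores_b = [79, 91, 83, 87, 70, 94, 77, 85]   # "B": interactive measurement video

# Welch's t-test does not assume equal variances between the groups.
t_stat, p_value = stats.ttest_ind(scores_a, scores_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```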

 

Direct Measurement Example, courtesy Carleton College

 

Chen also talked about ways that the design of a course and its exams could bolster learning by reducing cognitive load. Certain interfaces demand more focus and attention than necessary and thus distract from learning the targeted content. One suggestion Chen made was to use drag-and-drop questions (“A”) rather than multiple-choice questions (“B”), the idea being that multiple-choice questions are cognitively laborious because of the amount of irrelevant information that must be held in short-term memory. According to Chen, MIT has collected new data on this question that will be published later this year, so stay tuned.
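
One hypothetical way to probe the cognitive-load idea is to compare how long students spend per question in each format. The sketch below assumes invented response times and uses a Mann-Whitney U test, a common choice for skewed timing data; none of this is drawn from MIT's actual analysis.

```python
from scipy import stats

# Invented seconds-per-question for the two question formats.
seconds_drag_drop = [34, 41, 29, 38, 45, 31, 36, 40]     # format "A"
seconds_multi_choice = [52, 47, 60, 44, 58, 49, 55, 63]  # format "B"

# Mann-Whitney U compares the two distributions without assuming normality.
u_stat, p_value = stats.mannwhitneyu(
    seconds_drag_drop, seconds_multi_choice, alternative="two-sided"
)
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```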

 

Example of Drag and Drop, Courtesy Wellesley University

 

Going Forward

As we design studies to elucidate the potential influence of various course modifications, we may be aided by pre-existing data that provides insight into learning and attention. For instance, researchers in a memory lab at Harvard recently showed that interpolating online lectures with simple memory tests helped students maintain attention, take more notes, and perform better in courses.1 Another study showed that online chatting disrupted learning when students were simultaneously taking notes by hand, but that when notes were taken on the computer, the chatting had no such negative impact.2

Certain principles of learning can also be applied when designing MOOCs and blended MOOC/traditional classroom courses. For instance, studies in neuroscience have shown the importance of increasing the number and strength of synaptic connections in learning and memory. Whereas the strength of connections can be enhanced by repeating information at appropriately timed intervals,3-6 the number of connections depends largely on how many different ways the information is encoded in the brain.7 In other words, the more things that are capable of triggering recall of certain information, the less likely that information is to be lost over time.
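
As a toy illustration of the timing principle, the sketch below generates an expanding schedule of review dates. The one-day base interval and doubling factor are illustrative defaults, not values drawn from the cited studies.

```python
from datetime import date, timedelta

def review_schedule(start: date, reviews: int = 5,
                    base_days: float = 1.0, factor: float = 2.0) -> list[date]:
    """Expanding review intervals: 1, 2, 4, 8, ... days between sessions."""
    schedule, gap = [], base_days
    for _ in range(reviews):
        start = start + timedelta(days=round(gap))
        schedule.append(start)
        gap *= factor  # each successive gap doubles
    return schedule

for review_date in review_schedule(date(2015, 3, 1)):
    print(review_date)
```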

This aspect of learning represents one huge advantage of MOOCs and blended techniques. By presenting essential information in different ways and through different channels, the learning process can be significantly improved. The specifics of how to integrate content and present information in a manner that optimizes learning will continue to reveal themselves as more controlled studies are performed and more meaningful data is generated. Given the great promise of big data analytics for MOOCs, it is critical that educators focus their efforts on planning courses to enable successful data capture for meaningful analysis.

 

References

1 Szpunar, K., Khan, N. & Schacter, D. Interpolated memory tests reduce mind wandering and improve learning of online lectures. Proceedings of the National Academy of Sciences 110, 6313-6317 (2013).
2 Wei, F., Wang, Y. K. & Fass, W. An experimental study of online chatting and notetaking techniques on college students' cognitive learning from a lecture. Computers in Human Behavior 34, 148-156 (2014).
3 Giedd, J. Brain development, IX: human brain growth. The American Journal of Psychiatry 156, 4 (1999).
4 Giedd, J. N. et al. Brain development during childhood and adolescence: a longitudinal MRI study. Nature Neuroscience 2, 861-863, doi:10.1038/13158 (1999).
5 Bliss, T. V. & Collingridge, G. L. A synaptic model of memory: long-term potentiation in the hippocampus. Nature 361, 31-39, doi:10.1038/361031a0 (1993).
6 Lynch, G. AMPA receptor modulators as cognitive enhancers. Current Opinion in Pharmacology 4, 4-11, doi:10.1016/j.coph.2003.09.009 (2004).
7 Munakata, Y. & Pfaffly, J. Hebbian learning and development. Developmental Science 7, 141-148 (2004).