ManyClasses 2 is now released as a preprint

Motz, B., Chinni, A., de Leeuw, J., Jankowski, H., Aggarwal, A., Amato, M., Berlin, K., Britten, K., Brown, A., Cerchiaro, M., Evans, N., Findley, A., Gorman, R., Gregg, K., Hansen, K., Hollender, H., Hullinger, R., Larkin, P., Lion, M., Long, R., Mannarino, A., Mocko, M., Moore, K., Packowski, J., Palmer, C., Ritchie, K., Scott, J., Stanton, Talcott, L., Wagner, L., Waite, M., Yeager, R., and Fyfe, E. (2024). ManyClasses 2: The Effects of Prequestions on Media Interactions and Learning. PsyArXiv. https://doi.org/10.31234/osf.io/3xbma

ManyClasses 2 Abstract

We conducted a ManyClasses study in 30 different classes, where we randomly assigned individual student participants (n = 1,571) to answer ungraded prequestions prior to watching online lecture videos as part of their routine learning activities. In the laboratory, decades of research have observed that prequestions reliably improve learning from subsequent instruction when compared to instruction not preceded by prequestions, but clear evidence in authentic education settings has been elusive. The current analysis in authentic classes confirmed that prequestions generally improve student learning performance, on average, as measured on delayed classroom assessments. Further, we examined how prequestions affected students’ interactions with the online videos. We found that, despite the generalizable average benefit of prequestions for student learning, prequestions also reliably caused some students to disengage, skipping the assigned video entirely. Furthermore, among the students who watched the videos, there was no increase in the amount viewed after answering prequestions compared to videos with no prequestions. These findings lead us to advocate for a major revision to the theoretical mechanism underlying the prequestion effect: prequestions do not increase attention to the subsequent learning materials, but rather, prequestions cause learners to activate a relevant mental model in advance of instruction. This account is corroborated by the finding that answering prequestions correctly was a significant moderator of the benefits of prequestions, suggesting that the prequestions, while beneficial on average, may cause a rich-get-richer scenario when implemented in authentic education settings.

ManyClasses 2 Study Context

ManyClasses 2 was conducted as an entry in a national research competition, the XPRIZE Digital Learning Challenge. The competition's goal was to modernize, accelerate, and improve the ways in which we identify effective learning tools and processes. Specifically, our entry in the competition was Terracotta, a learning management system (LMS) plug-in that enables rapid experimental research on instructional practices, educational materials, and learning activities, and that yields insights into what works, for whom, to improve student performance. ManyClasses 2 was conducted as a demonstration of Terracotta's ability to support a multi-site distributed experimental research study in a relatively short amount of time.

In keeping with competition guidelines, the current study was conducted in four separate preregistered waves. In the final wave, Terracotta deployed the study across 26 classes in just 6 months (September 2022 to March 2023). Each classroom implementation of the experiment was customized to the local instructor's norms and materials, set up by the participating teacher, and completed only by students who provided informed consent. This represented a dramatic time savings when benchmarked against comparable studies. For example, ManyClasses 1 took 34 months from experiment design (May 2018) to the conclusion of data collection (March 2021), more than five times as long as the current 6-month effort. In their report, Fyfe et al. (2021) advocated bluntly, "We need technology that enables experimentation in diverse classrooms" (p. 19). Terracotta fills this gap and demonstrates corresponding benefits in the speed and efficiency of education research and discovery. Terracotta won the runner-up award in the XPRIZE Digital Learning Challenge.

Note: ManyClasses 2 had previously been planned as an examination of retrieval practice. We changed direction when the XPRIZE Digital Learning Challenge was announced.

ManyClasses 2 Details

Number of Classes: 30
Number of Participants: 1,571
Data Collection Sites: Indiana University, University of Nebraska-Lincoln, University of Florida, and South Fayette School District

ManyClasses 2 Acknowledgements

This project benefited from the input and advice of Shana Carpenter, Paulo Carvalho, and Rob Goldstone, and from the camaraderie of Steve Ritter, April Murphy, and Joseph Jay Williams during the XPRIZE Digital Learning Challenge. We thank the many people who provided assistance with this study, including Kevin Brokamp (Indiana University [IU]), Candy Buck (IU), Jeffrey Goetz (IU), Brooke Komar (IU), Adam Maksl (IU), Maggie Ricci (IU), Heather Maness (University of Florida [UFL]), Andrew McDonald (UFL), Chad Brassil (University of Nebraska Lincoln [UNL]), Amy Goodburn (UNL), Matthew Callison (South Fayette Township School District [SFY]), Lee Ann Jubas (SFY), Robert Warfield (SFY), Doug Sonaty (Unicon), Diego del Blanco Orobitg (Unicon), Patty Wolfe (Unicon), and Kim Burgas.