An iterative approach to developing, refining and validating machine-scored constructed response assessments
The Automated Analysis of Constructed Response (AACR) project seeks to develop a community of faculty who use evidence-based practices to improve instruction by providing faculty with novel platforms for written assessment. Written assessments give faculty in-depth evidence of student learning because they capture student understanding in students' own words. However, written assessments are used infrequently in undergraduate biology courses, particularly high-enrollment courses, because of the time and effort required to read responses and provide feedback.
The primary AACR goals are to (1) provide the means for faculty to gather evidence on student learning using formative written assessments and computerized analysis tools and (2) facilitate widespread use of these written assessments. The goal of the question development group within AACR is twofold: (1) to develop a suite of formative written assessments in biology, chemistry, and statistics that uncover student conceptual difficulties, and (2) to develop text analysis and machine learning models that automatically analyze student writing and provide faculty with immediate feedback.
Our approach is to use pre-existing concept inventories, the science education literature, and interviews with faculty to identify areas of biology, chemistry, and statistics where students have persistent conceptual difficulties. We then develop questions that target these conceptual difficulties. Questions are refined based on input from faculty and data from student interviews, then piloted and revised so that answers can be analyzed by computer. After a question is developed, we use two approaches to analyze student answers: text analysis and machine learning. Both methods identify and extract words and phrases from student writing, which are used to build models of human scoring. The models classify the key concepts in, or correctness of, a response in high agreement with human scoring. Finally, models are piloted in the classrooms of members of our faculty learning communities at six different institutions.
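The pipeline described above, extracting words and phrases from scored responses and building a model of human scoring, can be sketched in miniature. The example below is a toy illustration, not the project's actual lexical analysis or machine learning tooling: it stands in for those models with a simple multinomial naive Bayes classifier over word counts, trained on hypothetical human-scored responses to a photosynthesis question.

```python
import math
import re
from collections import Counter, defaultdict

def tokenize(text):
    """Lowercase a response and extract word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def train(labeled_responses):
    """Tally term frequencies per human-assigned category."""
    term_counts = defaultdict(Counter)
    for text, label in labeled_responses:
        term_counts[label].update(tokenize(text))
    totals = {label: sum(c.values()) for label, c in term_counts.items()}
    vocab = {tok for counts in term_counts.values() for tok in counts}
    return term_counts, totals, vocab

def classify(text, term_counts, totals, vocab):
    """Assign the category whose vocabulary best matches the response
    (multinomial naive Bayes with add-one smoothing)."""
    best_label, best_score = None, float("-inf")
    v = len(vocab)
    for label, counts in term_counts.items():
        score = sum(math.log((counts[tok] + 1) / (totals[label] + v))
                    for tok in tokenize(text))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical human-scored responses (labels are illustrative only).
scored = [
    ("mass comes from carbon dioxide in the air", "correct"),
    ("carbon from CO2 is fixed into glucose", "correct"),
    ("the plant gets its mass from the soil", "incorrect"),
    ("mass comes from water and soil nutrients", "incorrect"),
]
model = train(scored)
print(classify("the carbon in glucose comes from CO2", *model))  # correct
```

In practice, agreement with human scoring would be estimated by comparing the model's classifications against a held-out set of human-scored responses.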
We have developed 53 questions in biology, chemistry, chemical engineering, and statistics. We have collected responses from 7854 students and provided 123 reports to faculty. We have also improved our question development process through the use of clustering and multinomial logistic regression analyses, and we have created more interactive and user-friendly feedback reports for faculty.
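As an illustration of how clustering can support question and rubric development, the sketch below groups lexically similar responses with a greedy single-pass comparison of word overlap (Jaccard similarity). This is a hypothetical stand-in, not the project's actual statistical method, and the similarity threshold is chosen arbitrarily for the toy data.

```python
import re

def tokenize(text):
    """Lowercase a response and extract its set of word tokens."""
    return set(re.findall(r"[a-z']+", text.lower()))

def jaccard(a, b):
    """Overlap of two token sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def cluster_responses(responses, threshold=0.4):
    """Greedy single-pass clustering: each response joins the first
    cluster whose seed response it sufficiently overlaps; otherwise
    it seeds a new cluster."""
    clusters = []  # list of (seed_token_set, member_responses)
    for text in responses:
        tokens = tokenize(text)
        for seed, members in clusters:
            if jaccard(tokens, seed) >= threshold:
                members.append(text)
                break
        else:
            clusters.append((tokens, [text]))
    return [members for _, members in clusters]

responses = [
    "mass comes from carbon dioxide in the air",
    "the mass comes from CO2 in the air",
    "the plant gets mass from the soil",
]
groups = cluster_responses(responses)
print(len(groups))  # 2
```

Grouping responses this way lets a rubric developer inspect one representative per cluster instead of reading every response individually.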
Currently 19 faculty are using AACR assessments and participating in our faculty learning communities. We have also recruited 12 new faculty members across our institutions to join our FLCs and use AACR assessments and resources. Additionally, we have expanded to collaborate with faculty in physics at Michigan State University and Stony Brook University and in statistics at Grand Valley State University. We have disseminated our findings through 37 presentations and 12 journal articles. AACR products are currently available to faculty via 2 websites.
Members of our faculty learning communities would like to broaden their use of AACR questions to a wider range of biology topics. To meet this challenge, we are developing AACR questions for ecology, cell biology and other topics. We are also employing new statistical approaches to facilitate rubric development and improve our scoring models.
• Kaplan, J. J., Haudek, K. C., Ha, M., Rogness, N., & Fisher, D. (2014). Using Lexical Analysis Software to Assess Student Writing in Statistics. Technology Innovations in Statistics Education, 8(1). https://escholarship.org/uc/item/57r90703
• Urban-Lurain, M., Cooper, M. M., Haudek, K. C., Kaplan, J. J., Knight, J. K., Lemons, P. P., Lira, C. T., Merrill, J. E., Nehm, R. H., Prevost, L. B., Smith, M. K., & Sydlik, M. (2015). Expanding a National Network for Automated Analysis of Constructed Response Assessments to Reveal Student Thinking in STEM. Computers in Education Journal, 6(1), 65-81.
• Weston, M., Prevost, L. B., Haudek, K. C., Merrill, J., & Urban-Lurain, M. (2014). Examining the Impact of Question Surface Features on Students' Answers to Constructed Response Questions on Photosynthesis. CBE Life Sci Educ, 14(2). DOI: 10.1187/cbe.14-07-0110
• Moharreri, K., Ha, M., & Nehm, R. H. (2014). EvoGrader: an online formative assessment tool for automatically evaluating written evolutionary explanations. Evolution: Education and Outreach, 7(1), 1-14. DOI: 10.1186/s12052-014-0015-2