Analysis of an informed peer review matching algorithm and its impact on student work on model-eliciting activities

Matthew Alan Verleger, Purdue University

Abstract

Model-Eliciting Activities (MEAs) are realistic, open-ended, client-driven engineering problems designed to foster students' mathematical modeling abilities. Since 2005, the MEAs used in Purdue University's first-year engineering core course have included a double-blind peer review wherein individuals in the course (peers) are randomly assigned a student team's response to an MEA to review.

In 2007, a calibration exercise, whereby students evaluated a prototypical piece of student work and compared their review to that of an expert, was added to the MEA implementation sequence in an attempt to increase the quality of feedback provided during the peer review. At that time, the reviewer-reviewee assignment process remained random. The calibration exercise's value was limited to the self-reflective knowledge a student gained from comparing their responses on the MEA Rubric to those of the expert.

This research investigated the impact of informed peer review matching algorithms on the quality of teams' final MEA responses. The algorithms use data from the calibration exercise and Teaching Assistant (TA) marks on a team's first-draft response as measurements of the reviewer's accuracy and the reviewee's degree of assistance needed in order to make more informed matches. Three informed assignment methods were developed, and one was thoroughly investigated to determine its impact. The violation of multiple critical assumptions underlying the assignment method resulted in no apparent differences between the selected informed assignment method and the blind random assignment method. The failure of those assumptions indicates that the existing training methods and/or the rubric are inadequate for producing sufficiently valid TA and student marks on MEAs. Details of how the assumptions were violated and what must be done to resolve them to better investigate the research question are discussed.
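The abstract does not specify the assignment procedure itself. As a minimal illustrative sketch (the function name, score scales, and greedy strategy below are assumptions, not the dissertation's actual algorithms), an informed assignment might pair the most accurate calibrated reviewers with the teams whose first drafts indicate the greatest need:

```python
# Hypothetical sketch of an informed reviewer-to-team matching.
# Assumptions (not from the abstract): reviewer accuracy is a score
# derived from the calibration exercise (higher = more accurate), and
# team need is a score derived from TA marks on the first draft
# (higher = more assistance needed).

def informed_match(reviewer_accuracy, team_need, reviews_per_team=1):
    """Greedily pair the most accurate reviewers with the neediest teams.

    reviewer_accuracy: dict mapping reviewer id -> calibration accuracy
    team_need: dict mapping team id -> assistance needed
    Returns a list of (reviewer, team) assignments.
    """
    # Rank reviewers from most to least accurate, teams from most to
    # least in need of assistance.
    reviewers = sorted(reviewer_accuracy, key=reviewer_accuracy.get, reverse=True)
    teams = sorted(team_need, key=team_need.get, reverse=True)

    assignments = []
    pool = iter(reviewers)
    for team in teams:
        for _ in range(reviews_per_team):
            try:
                assignments.append((next(pool), team))
            except StopIteration:
                # No reviewers left to assign.
                return assignments
    return assignments

pairs = informed_match(
    {"r1": 0.9, "r2": 0.4, "r3": 0.7},
    {"tA": 2.0, "tB": 5.0},
)
# The most accurate reviewer (r1) is paired with the neediest team (tB).
```

A blind random baseline, by contrast, would simply shuffle the reviewer list before assignment, ignoring both scores.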

Degree

Ph.D.

Advisors

Heidi Diefes-Dux, Purdue University.

Subject Area

Education, Mathematics|Engineering, General|Education, Curriculum and Instruction

