Background Peer review is a beneficial pedagogical tool. Despite the abundance of data instructors often have about their students, most peer review matching is done by simple random assignment. In fall 2008, a study was conducted to investigate the impact of an informed algorithmic assignment method, called Un‐weighted Overall Need (UON), in a course involving Model‐Eliciting Activities (MEAs). The algorithm showed no statistically significant impact on MEA Final Response scores. A follow‐up study was then conducted to examine the assumptions underlying the algorithm.
Purpose (Hypothesis) This research addressed the question: To what extent do the assumptions used in making informed peer review matches (using the Un‐weighted Overall Need algorithm) for the peer review of solutions to Model‐Eliciting Activities decay?
Design/Method An expert rater evaluated 147 teams' responses to a particular implementation of MEAs in a first‐year engineering course at a large Midwestern research university. The expert's evaluations were then used to analyze the UON algorithm's assumptions relative to a randomly assigned control group.
Results Weak support was found for the UON algorithm's five assumptions: students complete assigned work; teaching assistants can grade MEAs accurately; accurate feedback in peer review is perceived by the reviewed team as more helpful than inaccurate feedback; teaching assistant scores on the first draft of an MEA can accurately predict where teams will need assistance on their second draft; and the error a peer reviewer makes in evaluating a sample MEA solution is an accurate indicator of the error they will make while subsequently evaluating a real team's MEA solution.
Conclusions Conducting informed peer review matching requires significant alignment between evaluators and experts; without it, matches deviate from the algorithm's designed purpose.
Journal of Engineering Education
Wiley-Blackwell Publishing Inc.
Grant or Award Name
National Science Foundation grants DUE 0535678 and EEC 0835873
Required Publisher’s Statement
This is the peer reviewed version of the following article: Verleger, M., Diefes‐Dux, H., Ohland, M. W., Besterfield‐Sacre, M., & Brophy, S. (2010). Challenges to informed peer review matching algorithms. Journal of Engineering Education, 99(4), 397-408, which has been published in final form at https://doi.org/10.1002/j.2168-9830.2010.tb01070.x. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions.
Scholarly Commons Citation
Verleger, M., Diefes-Dux, H., Ohland, M. W., Besterfield-Sacre, M., & Brophy, S. (2010). Challenges to Informed Peer Review Matching Algorithms. Journal of Engineering Education, 99(4). https://doi.org/10.1002/j.2168-9830.2010.tb01070.x