Submitting Campus
Daytona Beach
Department
Engineering Fundamentals
Document Type
Article
Publication/Presentation Date
10-2016
Abstract/Description
Background: Students conducting peer review on authentic artifacts require training. In the training studied here, individual students reviewed (scored and provided feedback on) a randomly selected prototypical solution to a problem. Afterwards, they were shown a side-by-side comparison of their review and an expert's review, along with prompts to reflect on the differences and similarities. Individuals were then assigned a peer team's solution to review.
Purpose: This paper explores how the characteristics of five different prototypical solutions used in training (and their associated expert evaluations) impacted students' abilities to score peer teams' solutions.
Design/Method: An expert rater used an eight-item rubric to score the prototypical solutions and the 147 student teams' solutions that were peer reviewed. The differences between the scores assigned by the expert and by a student to a prototypical solution and to an actual team solution were used to compute a measure of the student's improvement as a peer reviewer from training to actual peer review. ANOVA testing with Tukey's post-hoc analysis was conducted to identify statistical differences in improvement based on the prototypical solutions students saw during the training phase.
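For readers unfamiliar with this style of analysis, a minimal sketch of the difference-based error measure followed by one-way ANOVA and Tukey's HSD might look like the following. This is not the authors' actual code or instrument; the file name, column names, and the definition of "improvement" are hypothetical placeholders.

```python
# Minimal sketch (hypothetical data layout): each row holds one student's scoring
# error during training (student score minus expert score on the prototypical
# solution), their error during actual peer review, and which prototypical
# solution they saw during training.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.read_csv("peer_review_scores.csv")  # hypothetical file

# Improvement from training to peer review: reduction in absolute scoring error.
df["improvement"] = df["training_error"].abs() - df["review_error"].abs()

# One-way ANOVA: does improvement differ by the prototypical solution seen in training?
groups = [g["improvement"].values for _, g in df.groupby("prototype_id")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Tukey's HSD post-hoc test: which pairs of prototypical solutions differ?
tukey = pairwise_tukeyhsd(endog=df["improvement"], groups=df["prototype_id"], alpha=0.05)
print(tukey.summary())
```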
Results: Statistically significant differences in students' scoring error during peer review were found between high- and low-quality prototypical solutions seen during training. Specifically, a lower-quality training solution (and its associated expert evaluation) resulted in more accurate scoring during peer review.
Conclusions: While students typically ask to see exemplars of "good" solutions, this research suggests that, for the purpose of preparing students to score peers' solutions, there is likely greater value in their seeing a low-quality solution and its corresponding expert review.
Publication Title
Journal of Engineering Education
DOI
https://doi.org/10.1002/jee.20148
Publisher
Wiley-Blackwell Publishing Inc.
Grant or Award Name
National Science Foundation EEC 0835873 and EEC 1264005
Required Publisher’s Statement
This is the peer reviewed version of the following article: Verleger, M.A., Rodgers, K.J. and Diefes‐Dux, H.A. (2016), Selecting Effective Examples to Train Students for Peer Review of Open‐Ended Problem Solutions. J. Eng. Educ., 105: 585-604, which has been published in final form at https://doi.org/10.1002/jee.20148. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions.
Scholarly Commons Citation
Verleger, M., Rodgers, K. J., & Diefes-Dux, H. (2016). Selecting Effective Examples to Train Students for Peer Review of Open-Ended Problem Solutions. Journal of Engineering Education, 105(4), 585-604. https://doi.org/10.1002/jee.20148