Comparison of two peer evaluation instruments for project teams

Version 3 2025-01-07, 06:17
Version 2 2024-10-29, 03:05
Version 1 2017-12-06, 00:00
Conference contribution
Posted on 2025-01-07, 06:17; authored by K Meyers, Matthew Ohland, Stephen Silliman, Leo McWilliams, Tracy Kijewski-Correa

The College of Engineering at the University of Notre Dame has used a paper-pencil instrument for peer evaluations since 2005 as part of the assessment of project team efforts (typically 4-5 students per team) in its First Year Engineering Course. The College was considering moving from paper-pencil peer evaluations to an online, behaviorally based evaluation instrument, CATME. The instructors at Notre Dame conducted a comparative study of student feedback on these two instruments during fall 2007. During the fall semester, the students (~380) in the first year course were divided into two groups of approximately equal size, one using the paper-pencil instrument and the other using CATME. After completing peer evaluations for a seven-week course project, the students were required to complete a survey giving their reaction to the instrument they used in terms of perceived simplicity, comfort, confidentiality, usefulness of feedback, and overall experience. Comparison of the survey results provided insight into the relative merits and drawbacks of the two administrations. Several of the follow-up survey questions comparing the instruments did not show statistically significant differences in the sample means. In spite of the confounding of instrument design and administration method, useful results emerged. The biggest differences in student survey results were seen in the areas of feedback and overall experience, both of which were rated higher for CATME. Student confidence in instructor confidentiality (keeping their comments confidential) was high for both instruments, but slightly higher for the paper-pencil instrument. Because student perception of the quality of the feedback is critical to both rater accuracy and the student learning experience, this study enabled the College to make a data-driven decision to use the CATME instrument in future offerings of the first year course.

History

Start Page

13.315.1

End Page

13.315.19

Number of Pages

19

Start Date

2008-06-22

Finish Date

2008-06-25

ISSN

2153-5965

Publisher

2008 Annual Conference & Exposition Proceedings

Publisher License

© 2008 American Society for Engineering Education

Peer Reviewed

  • Yes

Open Access

  • No

Name of Conference

2008 Annual Conference & Exposition

Presentation Date

2008-06-22
