(379s) Technical Writing Rubric Design Based on Inter-Rater Reliability

Authors 

Wettstein, S. - Presenter, Montana State University
Brown, J., Montana State University
Creating rubrics that multiple instructors can use to grade technical writing consistently is challenging. The purpose of this work was to evaluate the inter-rater reliability (IRR) of a rubric used to grade technical reports in a senior-level chemical engineering laboratory course in which multiple instructors grade deliverables. The rubric consisted of fifteen constructs that provided students with detailed guidance on instructor expectations with respect to report sections, formatting, and technical writing aspects such as audience, context, and purpose. Four student reports from previous years were scored using the rubric, and IRR was assessed using a two-way mixed, consistency, average-measures intra-class correlation (ICC) for each construct. Constructs were rated from poor to excellent based on their ICC, and the instructors then met as a group to discuss their scoring and reasoning. Revisions were made to the rubric and the IRR assessment was repeated. A key learning from this process was the importance of instructor discussion of the reasoning behind their scores, and of an ‘instructor orientation’ involving discussion and practice with the rubrics whenever multiple instructors grade or an instructor changes. Additionally, the fewer constructs that instructors interpreted as having potential overlap (e.g., double-counting errors), the more reliable the grading. The developed rubric has the potential for broad applicability to engineering laboratory courses with heavy technical writing components and could be adapted to other technical writing genres.
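
The ICC variant named above (two-way mixed, consistency, average measures) corresponds to ICC(3,k) in the Shrout–Fleiss taxonomy. The sketch below shows one way such an ICC could be computed per construct; it assumes Python with the pingouin library, and the report IDs, rater labels, and scores are illustrative placeholders, not the study's data or the authors' code.

```python
# Minimal sketch (not the authors' code): two-way mixed, consistency,
# average-measures ICC -- i.e., ICC(3,k) -- for one rubric construct.
import pandas as pd
import pingouin as pg

# Long-format ratings: one row per (report, rater) pair for a single construct.
# Four hypothetical reports scored by three hypothetical raters.
scores = pd.DataFrame({
    "report": ["R1", "R2", "R3", "R4"] * 3,
    "rater":  ["A"] * 4 + ["B"] * 4 + ["C"] * 4,
    "rating": [4, 3, 5, 2,   4, 3, 4, 2,   5, 3, 5, 3],  # illustrative scores
})

# pingouin returns all six Shrout-Fleiss ICC forms; keep the ICC3k row,
# which is the two-way mixed, consistency, average-of-k-raters estimate.
icc = pg.intraclass_corr(data=scores, targets="report",
                         raters="rater", ratings="rating")
icc3k = icc.loc[icc["Type"] == "ICC3k", ["ICC", "CI95%"]]
print(icc3k)

# Mapping the resulting ICC onto a poor-to-excellent band (as described in
# the abstract) would be a separate step using whatever cut-offs the
# instructional team adopts.
```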