
(366h) Increase Recycle to Reduce the Purge – a System to Improve Curriculum Retention

Authors 

Wagner, J. - Presenter, Trine University
Malefyt, A. P., Trine University
Hersel, A., Trine University
Borden, J., Northwestern University
Students’ ability to remember concepts from one course to the next is shockingly low. This is evident when students are asked to recall information, and even more so when they are asked to apply concepts from previous courses. In fact, some students who successfully complete engineering programs carry over little knowledge from one course to the next. The current educational system, in which both students and faculty are evaluated on individual courses, is an integral part of this problem. Little time or effort is spent addressing this course-to-course loss of concepts and knowledge. While there have been proposals to address this issue, e.g., spiral curricula or project-based learning, the linear course-to-course structure of engineering programs is unlikely to change at most universities in the foreseeable future.

A novel review system is being developed that attempts to address this issue. The review system will fit into any academic program curriculum and consists of two major parts. The first part is a concept question distribution system, and the second is a student-centered mechanism for creating those questions.

The distribution system is initiated by first selecting concept areas for review and then scheduling the timing and frequency with which questions are distributed. The instructor can pick individual questions for specific time slots or let the system randomly select questions from the chosen concept areas for the remaining slots. For each day of scheduled questions, the question in the first time slot is emailed to all students in the class. When a student responds to the email, they are taken to the QRQuestion web site, where they receive feedback and any additional questions for that day. If the student answers a question incorrectly, the system asks that same question again after a few days have passed. If the student answers correctly, the question is not asked again for at least a couple of weeks. After a student answers a question correctly a couple of times in a row, that question is retired from the student's backlog and the student's score in that concept area improves. By tracking performance in this way, a concept "heat map" can be developed for each student. The question delivery system incorporates repetition, spacing, and interleaving, techniques that have all been shown to be important for long-term retention. The instructor is emailed the questions and the students' performance on them, making it possible to review problematic questions in the next class and provide additional explanation. A minimal sketch of this spacing logic is given below.
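The following sketch illustrates the spacing behavior described above, assuming illustrative interval and streak values (re-ask after three days on a miss, after two weeks on a correct answer, retire after two consecutive correct responses); the actual QRQuestion parameters are not stated in this abstract.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Hypothetical intervals; the actual QRQuestion settings are not given in the abstract.
RETRY_DAYS_AFTER_MISS = 3        # "ask ... that same question again after a few days"
RETRY_DAYS_AFTER_CORRECT = 14    # "not asked again for at least a couple of weeks"
STREAK_TO_RETIRE = 2             # "correctly a couple of times in a row ... retired"

@dataclass
class QuestionState:
    question_id: str
    concept: str
    correct_streak: int = 0
    retired: bool = False
    next_due: date = field(default_factory=date.today)

def record_answer(state: QuestionState, correct: bool, today: date) -> QuestionState:
    """Update one student's backlog entry after a response."""
    if correct:
        state.correct_streak += 1
        if state.correct_streak >= STREAK_TO_RETIRE:
            state.retired = True   # drops out of the backlog; concept score improves
        else:
            state.next_due = today + timedelta(days=RETRY_DAYS_AFTER_CORRECT)
    else:
        state.correct_streak = 0
        state.next_due = today + timedelta(days=RETRY_DAYS_AFTER_MISS)
    return state

def due_questions(backlog: list[QuestionState], today: date) -> list[QuestionState]:
    """Questions eligible for today's email: not retired and past their due date."""
    return [q for q in backlog if not q.retired and q.next_due <= today]
```

Aggregating each student's per-concept streaks and retirements is what would drive the concept "heat map" mentioned above.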

It is important to have a mechanism to remove ambiguous or otherwise ineffective questions from the system. The first time a student sees a question, but before they answer it, they rate the question on clarity and relevance. Questions that receive low ratings, or that prove too easy or too difficult as measured by student performance, are eliminated from the question bank. In addition to being categorized by concept, questions are also classified by the author as "Basic Knowledge", "Basic Concept", "More Advanced Concepts", or "Questions Involving Calculations". "Basic Concept" and "More Advanced Concepts" questions provide an explanation, and "Questions Involving Calculations" include a worked-out solution as part of the feedback.
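As a sketch, the pruning rule might look like the following, with placeholder rating and difficulty thresholds, since the abstract does not state the actual cutoffs or rating scale.

```python
from dataclasses import dataclass
from enum import Enum

class QuestionType(Enum):
    BASIC_KNOWLEDGE = "Basic Knowledge"
    BASIC_CONCEPT = "Basic Concept"
    ADVANCED_CONCEPT = "More Advanced Concepts"
    CALCULATION = "Questions Involving Calculations"

@dataclass
class QuestionStats:
    avg_clarity: float      # mean of student clarity ratings (1-5 scale assumed)
    avg_relevance: float    # mean of student relevance ratings (1-5 scale assumed)
    pct_correct: float      # fraction of first attempts answered correctly

# Placeholder thresholds; the actual cutoffs are not specified in the abstract.
MIN_RATING = 3.0
MIN_CORRECT, MAX_CORRECT = 0.2, 0.95   # "too difficult" / "too easy" bounds

def keep_question(stats: QuestionStats) -> bool:
    """Return True if the question stays in the bank, False if it is eliminated."""
    if stats.avg_clarity < MIN_RATING or stats.avg_relevance < MIN_RATING:
        return False        # ambiguous or irrelevant per student ratings
    if not (MIN_CORRECT <= stats.pct_correct <= MAX_CORRECT):
        return False        # too difficult or too easy per student performance
    return True
```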

The point system for student participation is set up to reduce the incentive for cheating. The first time a student sees a question, the points received are the same whether they answer it correctly or not. When a student receives the question again, points are deducted for an incorrect answer.
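In code, this asymmetric scoring rule is simple; the point values below are illustrative only, as the abstract gives the rule but not the magnitudes.

```python
# Illustrative point values; the abstract specifies the rule, not the magnitudes.
FIRST_ATTEMPT_POINTS = 1   # awarded regardless of correctness on the first exposure
REPEAT_PENALTY = 1         # deducted for an incorrect answer on a repeat exposure

def participation_points(first_attempt: bool, correct: bool) -> int:
    """Points earned (or lost) for a single response."""
    if first_attempt:
        return FIRST_ATTEMPT_POINTS   # same credit whether right or wrong
    return 0 if correct else -REPEAT_PENALTY
```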

The second part of the review system is the question creation process. Questions can be entered by the instructor, or students can use the system to create questions and then review and edit questions submitted by other students. In this way, students interact with the course content at a deeper level than simply answering questions, aiding content mastery. Students receive points for both authoring and reviewing questions. Student-generated questions that have been reviewed and approved by two other students serve as candidate questions for entry into the distribution system described above. Candidate questions are then reviewed by a subject matter expert, typically the instructor, and promoted, rejected, or sent back to the student author. If a question is rejected or sent back, the instructor can decide whether the editors and/or author should lose points for errors that should have been caught.
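This authoring workflow amounts to a small approval state machine; the sketch below uses hypothetical state names and assumes the two-peer-review-then-expert-review flow described above.

```python
from enum import Enum, auto
from typing import Optional

class QuestionStatus(Enum):
    DRAFT = auto()        # written by a student author
    PEER_REVIEW = auto()  # awaiting review and approval by two other students
    CANDIDATE = auto()    # approved by two peers, awaiting the subject matter expert
    PROMOTED = auto()     # accepted into the distribution system
    RETURNED = auto()     # sent back to the student author for revision
    REJECTED = auto()

def advance(status: QuestionStatus, peer_approvals: int,
            expert_decision: Optional[str] = None) -> QuestionStatus:
    """Move a question one step through the approval workflow."""
    if status is QuestionStatus.DRAFT:
        return QuestionStatus.PEER_REVIEW
    if status is QuestionStatus.PEER_REVIEW and peer_approvals >= 2:
        return QuestionStatus.CANDIDATE
    if status is QuestionStatus.CANDIDATE and expert_decision is not None:
        return {"promote": QuestionStatus.PROMOTED,
                "reject": QuestionStatus.REJECTED,
                "return": QuestionStatus.RETURNED}[expert_decision]
    return status
```

Point adjustments for authors and reviewers on rejected or returned questions would hang off the RETURNED and REJECTED transitions.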

The chemical engineering program developing this system is well suited to evaluating its effect, since the department has many years of data for several key performance indicators of student retention of program content. The curriculum assessment exam (CAE), developed by the department's faculty over 20 years ago, is a six-hour, two-part multiple-choice examination covering the required curriculum, taken by ChE students in their final semester. For the last four years, the department has also required students to register for the NCEES Fundamentals of Engineering Examination as a graduation requirement. In addition, an objective mathematics assessment examination has been taken by senior chemical engineering students for over a decade.

The review system will be implemented in designated required courses throughout the program, so students are constantly reviewing previous course content from the sophomore through the senior year. The combination of the two parts of the system should help students "recycle" previous course content to reduce content "purge" and provide a mechanism to monitor personalized student progression through our program. Aggregate improvements can be monitored longitudinally using the department's key performance indicators. If found effective, this system could serve as a template for similar systems in other academic disciplines.
