(3fs) An Evidence-Based Approach to Chemical Engineering Education

Authors 

Burkholder, E. - Presenter, Stanford University
Research Interests: We tell undergraduate students that chemical engineers can do anything because we teach them to be good problem-solvers and prepare them for future learning. Indeed, our students go into all sectors, from finance to academia to pharmaceuticals and beyond. No curriculum could be expected to train students in all of these areas, so how do we equip students to transfer the skills they learn to solve novel problems in drastically different domains? More fundamentally: what does transferable expertise in problem-solving even look like, and how would we begin to measure it? Using scientific methods to answer such questions is at the core of the emerging field of chemical engineering education research. To begin to answer these questions, we first identified a set of decisions made by expert scientists and engineers as they solve authentic problems. We then developed an assessment to measure how well students are able to make decisions and reason like experts in the context of chemical engineering design. Pilot testing of this assessment has revealed key expert/student differences that we are using to design instructional interventions aimed at teaching problem-solving more effectively. In designing these interventions, we also draw on our studies of demographic performance gaps (which, in reality, are gaps in students' preparation) in large introductory STEM courses to ensure that we are teaching these skills equitably and preparing all students to succeed in their future careers.

Teaching Interests: As a scholar of engineering and physics education research, I view the classroom as a laboratory: a place to develop more effective and targeted instructional tools that improve how we teach. Though research in the learning sciences provides a strong framework from which to construct good teaching practices, much work remains to be done in designing targeted instructional tools. The process is iterative: design an instructional activity based on our research, test it in courses, measure the outcomes, refine the activity, and test again. In a physics course I developed, we created a problem-solving template to guide students through the expert problem-solving process as they worked on in-class activities, homework, and exams. Though this practice improved the quality of these students' problem solving, including their ability to frame a problem and write a solution plan, they struggled to translate their plans into effective action. In the coming iteration of this course, we have devised a set of materials that provide targeted practice in doing just that. Because my teaching experience spans a variety of institutions, disciplines, and student backgrounds, from tutoring community college veterans in statistics and chemistry to teaching transport phenomena to Caltech graduate students, I have an unusually broad perspective on how students learn. This perspective allows me to compare how various instructional practices work (or don't) for different groups of students in different disciplines, and it gives me a more diverse array of teaching tools from which to choose.
