
(19f) Hands-On, Remote, and Simulated Labs: Is There a Productive Synergy?

Authors 

DiBiasio, D. - Presenter, Worcester Polytechnic Institute
Henry, J. - Presenter, University of Tennessee at Chattanooga
Ozkaya, M. O. - Presenter, University of Tennessee at Chattanooga
Henderson, J. - Presenter, University of Tennessee at Chattanooga
Miletic, M. - Presenter, University of Illinois Urbana-Champaign
Clark, W. - Presenter, Worcester Polytechnic Institute


David DiBiasio¹, Jim Henry², Murat Ozkaya², Jerrod A. Henderson², William M. Clark¹, and Marina Miletic³; ¹Chemical Engineering Department, Worcester Polytechnic Institute, 100 Institute Road, Worcester, MA 01609; ²Chemical Engineering, University of Tennessee at Chattanooga, 615 McCallie Avenue, Chattanooga, TN 37403; ³Department of Chemical and Biomolecular Engineering, University of Illinois Urbana-Champaign, 206 Roger Adams Lab, Urbana, IL 61801

Background

Student learning outside the classroom may be complemented by a variety of structures; laboratory and project work are typical modes. Lab work, in turn, can be hands-on, remote, or simulation-based. Ideally, remote experiments can be conducted at any time from any place. They are particularly useful for students at universities where resources are severely limited and there is no access to significant experimental equipment. Simulations are likewise accessible 24/7, require few resources (unless the software is expensive), and provide efficient ways to study multiple variables. Multiple runs are frequently impossible or too costly with real lab equipment, whether remotely operated or not.

Although this work does not directly address the Unit Operations course, it represents an attempt to integrate UO equipment appropriately into the sophomore year. In addition, it engages students in issues related to critical thinking (ambiguous problem statements, multiple solutions, teamwork, and difficulties understanding differences between theory and reality).

In all contexts, understanding student learning is critical. Each mode has unique advantages and disadvantages that contribute to learning, and it is not always clear which format is optimal for student learning. For example, do the challenges of simulating a batch distillation column enhance or confuse the acquisition of basic concepts?

At WPI, distillation is taught in the sophomore year through a project-based spiral curriculum. All projects are team-based. We introduce basic concepts early in the sophomore year and then revisit distillation throughout the year with successively more complex assignments and projects, including at least two lab experiences. The first experiment uses a batch column operated at total reflux to introduce students to multistage distillation, including efficiency and energy balances at total reflux. A follow-on course explores pressure-swing distillation without a lab component. Near the end of the year, the batch column, operated at a constant external reflux ratio, is used again in a lab project. This project engages students in process dynamics and challenges them to compare theory to reality using the Rayleigh analysis and to compare simulated dynamics (using Aspen BatchSep®) to observations.
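For reference, the total-reflux experiment rests on standard textbook relations such as the Fenske equation and the overall column efficiency. The following is a minimal sketch in generic notation, not necessarily the exact formulation used in the course.

```latex
% Fenske equation: minimum number of equilibrium stages at total reflux
% for a binary mixture with (assumed constant) relative volatility \alpha;
% x_D and x_B are light-component mole fractions at the top and bottom.
N_{\min} = \frac{\ln\!\left[\left(\frac{x_D}{1 - x_D}\right)\left(\frac{1 - x_B}{x_B}\right)\right]}{\ln \alpha}

% Overall column efficiency: theoretical stages required vs. actual trays.
E_O = \frac{N_{\mathrm{theoretical}}}{N_{\mathrm{actual}}}
```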

Large enrollments forced us to “outsource” some lab work by taking advantage of the remotely operated column available at UTC. The column is the same as WPI’s but is accessible and controllable through a web-based interface.
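The abstract does not describe the UTC interface internals. Purely as a hypothetical illustration of the general pattern (a student script driving real equipment over HTTP), the sketch below shows what such access can look like; the URL, endpoint paths, and field names are invented and do not describe the actual UTC system.

```python
# Hypothetical sketch of a web-based remote-lab client. The actual UTC
# interface is browser-based and is not specified in this abstract; the
# URL, endpoints, and field names below are invented for illustration.
import time

import requests  # standard third-party HTTP client


BASE_URL = "https://example.edu/remote-lab/column"  # placeholder address


def set_reflux_ratio(ratio: float) -> None:
    """Send a new external reflux ratio setpoint to the (hypothetical) lab server."""
    resp = requests.post(f"{BASE_URL}/setpoint", json={"reflux_ratio": ratio}, timeout=10)
    resp.raise_for_status()


def log_tray_temperatures(interval_s: float, n_samples: int) -> list:
    """Poll tray temperatures periodically, as a student script might during a run."""
    samples = []
    for _ in range(n_samples):
        resp = requests.get(f"{BASE_URL}/temperatures", timeout=10)
        resp.raise_for_status()
        samples.append(resp.json())  # e.g. {"tray_1": 78.4, ..., "reboiler": 92.1}
        time.sleep(interval_s)
    return samples


if __name__ == "__main__":
    set_reflux_ratio(3.0)
    print(log_tray_temperatures(interval_s=30.0, n_samples=5))
```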

Methodology

At recent AIChE meetings we presented results showing that student learning in the remote and hands-on environments was essentially equivalent, and that students who did simulation-only (no lab work) “experiments” may have demonstrated a slight edge in learning concepts related to process dynamics. This past year all student teams conducted experiments on real equipment (remote or local), and all were supplemented with simulation runs. We compared learning among these cohorts.

Teams of four students were self-selected and then randomly assigned to cohorts. Six teams ran a local bubble-cap column (hands-on), six teams ran a local sieve-tray column (hands-on), six teams ran the UTC column (remote), and all teams used the simulation. All teams conducted identical experiments with identical assignments. Simulations required a base-case run, with additional runs varying reflux ratio, heat input, and initial compositions. Analysis and project reporting requirements were the same for all cohorts.

Evaluation had three components with direct and indirect assessments, giving us some degree of triangulation. All cohorts completed surveys managed by a third partner, UIUC. The online survey included closed-ended, Likert-scaled responses and open-ended questions. The open-ended questions probed student attitudes as well as team process, since we wanted to learn more about how each team handled roles, logistics, scheduling, data collection, and analysis.

Course instructors compared final reports from all teams. A pre/post in-class quiz compared individual learning quantitatively for students in all cohorts. Qualitative measures included evaluating student attitudes about the experience and assessing students' improvement in ABET outcomes such as: (b) design, analysis, and interpretation of data; (d) functioning on multidisciplinary teams; (g) effective remote communication; (h) the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context; and (k) using the techniques, skills, and modern engineering tools (such as computers and web interfaces) necessary for engineering practice.

Results

Logistics

As previously reported, having completed two cycles of this pilot, we have a fairly good understanding of what it takes to put 20 teams through this lab project. It would have been essentially impossible to run all teams through either the local WPI lab or the UTC lab alone during the time allotted for the course. However, splitting the effort among the two WPI columns and the UTC column makes the logistics workable. We learned that running either the remote or the simulated column presents different, but no more burdensome, scheduling issues than running the local column.

Student Attitudes

Survey results will not be available until sometime after May 2011. Previous results indicated few, if any, serious issues. Students did not mind being assigned to cohorts, nor was there any indication that teams felt they lost out on something because they were not in a different cohort. New results examining whether students perceived the simulations as helpful will be available over the summer and will be analyzed in time for the conference.

Student Learning—Analysis and Critical Thinking

All teams produced a significant final report that was graded by the WPI instructor using the same rubrics. We evaluated presentation quality, demonstration of appropriate concepts, and the ability to critically analyze and explain results, particularly system dynamics and differences between theory and observations. We examined learning issues as they specifically related to two important concepts: distillation column dynamics and column analysis using the Rayleigh method. We also gave a pre/post quiz on simple concepts in these areas to compare differences among cohorts. The WPI semester ends on May 3, so results from this analysis are not available as of this writing but will be presented at the conference.

Our previous content analysis of the reports showed common deficiencies that were independent of cohort. These included mistakes in energy balance calculations and a typical confusion over the quantitative comparison of experiment to the Rayleigh analysis: students expect exact agreement, yet reality is never that predictable. In too many cases, report presentation quality was marginal; this result is not unusual given end-of-semester time demands.
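For context, the comparison in question is against the Rayleigh material balance for batch distillation. The form below is the standard textbook version in generic notation, not necessarily the course's exact formulation.

```latex
% Rayleigh equation for batch distillation (generic textbook notation):
%   W    = moles of liquid remaining in the still,
%   x_W  = light-component mole fraction in the still,
%   x_D  = instantaneous distillate composition (for operation with reflux;
%          for simple distillation x_D is replaced by the equilibrium vapor y*).
\ln\frac{W_0}{W} = \int_{x_W}^{x_{W,0}} \frac{dx_W}{x_D - x_W}
```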

Our report analysis is ongoing, and details will not be available by the abstract deadline. However, a preliminary observation is that difficulties with Aspen BatchSep® may have caused unnecessary frustration for some teams. The impact on learning is not known at this time.

The authors note that this is very preliminary and will only be confirmed (or not) when the content analysis is done.

Previously we have used the psychology of “presence” (Ma and Nickerson, ACM Computing Surveys, v. 38, no. 3, article 7, 2006) as a framework for interpretation. Student learning in lab environments is connected to students' perception of presence and to the process of collaboration. Presence can be physical, telepresence, or virtual (for example, a simulation). In our case the remote teams experienced telepresence, the local teams experienced physical presence, and the simulation adds a virtual presence for all teams. Hence a hands-on lab with minimal physical interaction may not be significantly different from a remote one where the interaction is mediated via a computer. In fact, the UTC column has three cameras that students can manipulate, bringing an element of physical presence into the mix. However, the Aspen simulation is viewable only through the standard Aspen interface. It is not clear whether students made synergistic connections between the simulation and the lab, or whether they segmented the two experiences, resulting in less than optimal learning. By meeting time we will understand these issues and know more about students' collaboration processes and how those dynamics affected interactions, and hence learning, within each cohort.
