(2gu) Designing Dynamic Materials for Selective Reactions at Ultra-Low Substrate Concentrations, Enabling Direct Air Carbon Capture and Utilization
AIChE 2022 Annual Meeting
Meet the Faculty and Post-Doc Candidates Poster Session
Sunday, November 13, 2022 - 1:00pm to 3:00pm
Increasing CO2 emissions continue to raise global temperatures, threatening ecosystems' stability and future generations' health. Many climate models suggest that negative emissions technologies, such as carbon capture, may be a key strategy for reaching zero global emissions. While technologies to capture and store carbon are relatively well-developed for concentrated sources, such as power plants and chemical facilities, capture and utilization technologies are in their infancy for the low concentrations present in the atmosphere. Recent advances in electrochemical CO2 separations show the promise of dynamic materials in efficiently capturing carbon at concentrations much lower than traditional industrial sources. However, methodological and modeling challenges still prevent widespread carbon capture and use at ultra-low concentrations.
My future research group will develop new methodologies for designing dynamic materials that efficiently capture and upgrade molecules at ultra-low concentrations. These methodologies will consist of physics-based models that are parameterized with quantum chemistry, and data-based methods with physics-imposed constraints.
First, my research group will develop quantum-based descriptors that capture substrate-surface interactions and can help determine whether materials will selectively bind target molecules, such as CO2. These quantum-based descriptors should allow us to identify classes of materials that are ideally suited for capturing different substrates before performing high-throughput computations. Such descriptor design will also aid in understanding how to upgrade captured molecules with minimal added thermal energy. Opportunities to minimize thermal input include improving photocatalysis, taking advantage of thermal distributions, and applying room-temperature spontaneous endothermic reactions.
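As a minimal sketch of what a quantum-derived descriptor can look like, the snippet below computes a d-band center from a d-projected density of states, a descriptor closely related to the d-band theory used in [2]. The energy grid and density of states here are placeholder inputs; in practice they would come from a DFT calculation.

```python
import numpy as np

def d_band_center(energies, dos, e_fermi=0.0):
    """First moment of the d-projected density of states, referenced to the
    Fermi level; a classic descriptor of how strongly a transition-metal
    surface binds small adsorbates such as CO2."""
    e = np.asarray(energies, dtype=float) - e_fermi
    g = np.asarray(dos, dtype=float)
    return np.trapz(e * g, e) / np.trapz(g, e)

# Hypothetical example: a Gaussian-shaped d band centered 2 eV below the Fermi level.
e_grid = np.linspace(-10.0, 5.0, 500)               # energy grid (eV)
d_dos = np.exp(-0.5 * ((e_grid + 2.0) / 1.5) ** 2)  # placeholder projected DOS
print(f"d-band center: {d_band_center(e_grid, d_dos):.2f} eV")
```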
The second research focus of my group is to overcome the computational complexity constraints associated with modeling carbon capture. Specifically, we will develop faster methods to explore conformers of important complexes, compute reaction rates, and design optimal catalytic structures via screening and inverse design paradigms. We will do this by using transfer learning to adapt existing machine learning methods for property prediction and molecule generation to these systems, which have limited data. Examples of these adaptations include co-training, embedding physical models into the machine learning architectures, and adding physical constraints.
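As a toy illustration of one such adaptation, adding a physical constraint, the sketch below appends a penalty to the training loss so that predicted activation energies cannot go negative. The network, data, and specific constraint are hypothetical stand-ins rather than the actual architectures under development.

```python
import torch
import torch.nn as nn

# Placeholder network mapping molecular features to an activation energy (eV).
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))

x, y = torch.randn(40, 32), torch.rand(40, 1)   # hypothetical small training set
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

for _ in range(200):
    opt.zero_grad()
    pred = model(x)
    # Data-fit term plus a soft physical constraint:
    # activation energies of elementary steps should not be negative.
    loss = mse(pred, y) + 10.0 * torch.relu(-pred).mean()
    loss.backward()
    opt.step()
```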
Finally, in conjunction with methodology development for predictive computational models, I am interested in developing new methods for quantifying uncertainty. In particular, I aim to use experimental data indirectly to bound model error and to develop methods for quantifying error in weighted ensembles.
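A minimal sketch of the weighted-ensemble idea: given predictions from several models and weights on those models, the weighted mean gives the ensemble estimate and the weighted spread gives a simple measure of model-form uncertainty. The predictions and weights below are purely illustrative.

```python
import numpy as np

def weighted_ensemble_stats(predictions, weights):
    """Weighted mean and standard deviation across an ensemble of model
    predictions; the spread serves as a crude model-form error estimate."""
    p = np.asarray(predictions, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    mean = np.sum(w * p)
    var = np.sum(w * (p - mean) ** 2)
    return mean, np.sqrt(var)

# Illustrative example: five models predicting an adsorption energy in eV.
preds = [-0.52, -0.47, -0.61, -0.55, -0.50]
wts = [0.30, 0.25, 0.10, 0.20, 0.15]   # hypothetical model weights
mu, sigma = weighted_ensemble_stats(preds, wts)
print(f"prediction: {mu:.2f} +/- {sigma:.2f} eV")
```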
The new computational methods that my lab develops for direct air carbon capture will also apply broadly to designing materials that selectively capture, and catalyze reactions of, other substrates at ultra-low concentrations, such as methane, nitrous oxides, fluorinated gases, and nanoplastics.
Postdoctoral Project:
"Property Prediction and Generative Modeling for Small Chemical Datasets via Transfer Learning"
Advised by Brian C. Barnes (Army Research Laboratory) and Klavs F. Jensen (Massachusetts Institute of Technology)
PhD Dissertation:
"Enabling Predictive Science for Catalysis under Uncertainty"
Advised by Dionisios G. Vlachos (University of Delaware)
Research Experience:
My research experience primarily lies in theoretical and computational methodology development. Specifically, I developed surrogate models for predictive catalysis and designed transfer learning approaches for applying machine learning to small chemical datasets.
During graduate school, the methods I designed required new physics-based models grounded in quantum-mechanical principles and parameterized with quantum chemistry calculations [2, 3, 7], as well as data-based models for solving inverse design problems via machine learning [3, 5]. Merging physics- and data-based models was necessary because my past research often spanned multiple length or time scales. For example, I developed a site-specific mean-field model for optimizing the surface structure of oxygen reduction reaction catalysts, which computes rates for thousands of different structures in aqueous environments from just a small number of quantum chemistry calculations [6]. I also developed an inverse model that characterizes surface microstructure from infrared spectra of probe molecules via machine learning, which required simulating ordered overlayers of small molecules at variable coverages [5]. I later derived new theory to determine the best probe molecule for characterizing a specific material [3].
My postdoctoral research has primarily focused on building directed message-passing neural networks (D-MPNNs) for property prediction with small experimental datasets. The advantage of D-MPNNs is that they learn molecular features during training. Because D-MPNNs were designed with training sets of tens of thousands of data points in mind, applying them to smaller datasets is challenging. To overcome this data limitation, I have worked on several transfer learning approaches. One approach has been co-training a model on a small number of experimentally measured properties simultaneously with larger datasets of related, computationally derived properties, sharing the burden of feature and parameter learning [1]. Another approach has been embedding physical models and constraints into the network architecture.
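As an illustration of the co-training strategy, the sketch below jointly trains a shared encoder with two regression heads, one on a large computed-property dataset and one on a small experimental dataset, so the scarce task shares feature learning with the abundant one. It uses generic fixed-length inputs rather than an actual D-MPNN, and all sizes and data are placeholders.

```python
import torch
import torch.nn as nn

# Stand-in for a learned molecular encoder (an actual D-MPNN would operate on graphs).
encoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU())
head_comp = nn.Linear(128, 1)   # head for the large computed-property task
head_expt = nn.Linear(128, 1)   # head for the small experimental task

# Placeholder data: many computed labels, few experimental labels.
x_comp, y_comp = torch.randn(5000, 64), torch.randn(5000, 1)
x_expt, y_expt = torch.randn(40, 64), torch.randn(40, 1)

params = list(encoder.parameters()) + list(head_comp.parameters()) + list(head_expt.parameters())
opt = torch.optim.Adam(params, lr=1e-3)
mse = nn.MSELoss()

for _ in range(200):
    opt.zero_grad()
    # Both tasks backpropagate through the same encoder each step, so the
    # scarce experimental task benefits from features learned on the abundant task.
    loss = mse(head_comp(encoder(x_comp)), y_comp) + mse(head_expt(encoder(x_expt)), y_expt)
    loss.backward()
    opt.step()
```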
In almost every model I have worked on, I have incorporated experimental data throughout model design. This includes parameterizing certain model components when experiments are superior to computation and quantifying uncertainty in model predictions. Quantifying model uncertainty and developing new methods to do so, either independently or in collaboration with mathematicians, have been an integral part of my graduate and postdoctoral projects.
Teaching Interests:
ChemE Core Courses: Chemical Kinetics, Reaction Engineering, Thermodynamics, Statistical Mechanics, Numerical Methods, Organic Chemistry, Separations
ChemE Electives: Quantum Chemistry, Machine Learning, Molecular Modeling
I am open to teaching other traditional chemical engineering courses, as well as designing new courses. My background in teaching spans many different formats and age groups. As a graduate student at the University of Delaware, I mentored undergraduate and graduate students on senior theses and graduate research projects, respectively. While working as a teaching assistant for Delaware's senior process design course, I initiated, developed, and taught a new in-person team-building course for ChemE honors students. As president of the chemical engineering honor society at my undergraduate institution, I tutored freshmen and sophomores.
While working as a postdoctoral scholar at MIT, I have also advised Harvard undergraduate upperclassmen on research and graduate school. As both an informal mentor for students at the University of Delaware and a formal graduate school advisor for students at Harvard, I have developed a philosophy for teaching and advising that focuses on developing scientific curiosity. The first step is understanding the student, knowing their strengths, and tracking their progress. From there, I can more effectively teach core material in a way that captures their interest. On the advising side, I find that providing a list of key research papers, ordered as they should be read, helps students understand how current methods evolved and where advances are still needed to fill the gaps.
I am interested in maintaining involvement in introductory science and engineering education. Based on personal experience volunteering at under-resourced elementary schools, I think that hands-on experiments are a great way for the youngest learners to engage with science. Looking forward, I think coding platforms designed for young students are also a good way to help them develop their own ideas about how things work.
Selected Publications:
[1] J. L. Lansford, B. C. Barnes, B. M. Rice, and K. F. Jensen, Building Chemical Property Models for Energetic Materials from Small Datasets using a Transfer Learning Approach (submitted).
[2] J. L. Lansford, S. Kurdziel, and D. G. Vlachos, Scaling of Transition State Vibrational Frequencies and Application of d-Band Theory to the Brønsted-Evans-Polanyi Relationship on Surfaces. J. Phys. Chem. C (2021).
[3] J. L. Lansford and D. G. Vlachos, Spectroscopic Probe Molecule Selection Using Quantum Theory, First-Principles Calculations, and Machine Learning. ACS Nano 14, 17295 (2020).
[4] J. Feng, J. L. Lansford, M. A. Katsoulakis, and D. G. Vlachos, Explainable and trustworthy artificial intelligence for correctable modeling in chemical sciences. Sci. Adv. 6, eabc3204 (2020).
[5] J. L. Lansford and D. G. Vlachos, Infrared Spectroscopy Data- and Physics-driven Machine Learning for Characterizing Surface Microstructure of Complex Materials. Nat. Commun. 11, 1513 (2020).
[6] M. Núñez, J. L. Lansford, and D. G. Vlachos, Optimization of the facet structure of transition-metal catalysts applied to the oxygen reduction reaction. Nat. Chem. 11, 449-456 (2019).
[7] J. L. Lansford, A. V. Mironenko, and D. G. Vlachos, Scaling relationships and theory for vibrational frequencies of adsorbates on transition metal surfaces. Nat. Commun. 8, 1842 (2017).