(342ae) Accelerating on-the-Fly Active Learning of Catalyst Simulations Using Large Scale Pretrained Models
AIChE Annual Meeting
2021 Annual Meeting
Computational Molecular Science and Engineering Forum
CoMSEF Poster Session
Tuesday, November 9, 2021 - 3:30pm to 5:00pm
Given the current energy and climate crises, new technologies must be developed to enable sustainable harvesting and use of energy. Many proposed solutions, such as hydrogen fuel cells or the reduction of carbon dioxide to fuels, require the discovery of new heterogeneous electrocatalysts to be viable. Since physical experiments cost significant time and materials, high-throughput computational screening methods are required to accelerate discovery. The Open Catalyst 2020 (OC20) dataset and community challenges have set forth tasks for models to improve upon if they are to serve as effective screening tools. Recently, sophisticated models have been developed to perform the initial structure to relaxed structure (IS2RS) task at higher throughput than is possible with traditional density functional theory (DFT) methods. However, while these models are fast, they are not yet accurate enough to be used reliably for IS2RS. We take an alternative approach to this task by transferring the knowledge captured by these pretrained models into an active learning framework for accelerating DFT relaxations. Active learning frameworks have been shown to accelerate molecular relaxations effectively, and by combining one with the knowledge learned by a pretrained model we show that its performance can be further improved, achieving greater accuracy than the underlying model.
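The abstract describes the approach only at a high level. The sketch below is a minimal, hypothetical illustration (not the authors' code) of an on-the-fly active learning relaxation loop in the spirit described: a surrogate model, standing in for a pretrained force field, supplies forces during the relaxation; DFT is queried only when the surrogate's uncertainty is high, and each query is used to fine-tune the surrogate. The toy potential, the linear ensemble surrogate, the uncertainty threshold, and all names are assumptions made for illustration.

```python
import numpy as np

# Hypothetical stand-in for a DFT single-point call: a harmonic well
# with equilibrium at 0.7 (not a real electronic-structure calculation).
def dft_singlepoint(x):
    e = 0.5 * 3.0 * np.sum((x - 0.7) ** 2)
    f = -3.0 * (x - 0.7)              # forces = -dE/dx
    return e, f

class EnsembleSurrogate:
    """Tiny linear ensemble surrogate; the spread across members is used
    as an uncertainty estimate. Its initial weights loosely stand in for
    a 'pretrained' model that is fine-tuned on new DFT data."""
    def __init__(self, n_members=5, seed=0):
        rng = np.random.default_rng(seed)
        # Each member predicts per-component forces as f ~ -k*x + b.
        self.k = rng.normal(2.0, 0.5, n_members)
        self.b = rng.normal(0.0, 0.1, n_members)
        self.data = []                # (positions, DFT forces) pairs

    def predict(self, x):
        preds = np.stack([-k * x + b for k, b in zip(self.k, self.b)])
        return preds.mean(axis=0), preds.std(axis=0).max()

    def retrain(self):
        # Least-squares refit of the members on all accumulated DFT data,
        # with a small jitter so the ensemble keeps some spread.
        X = np.concatenate([x for x, _ in self.data])
        F = np.concatenate([f for _, f in self.data])
        A = np.stack([-X, np.ones_like(X)], axis=1)
        coef, *_ = np.linalg.lstsq(A, F, rcond=None)
        rng = np.random.default_rng(1)
        self.k = coef[0] + rng.normal(0.0, 0.05, self.k.size)
        self.b = coef[1] + rng.normal(0.0, 0.05, self.b.size)

def relax_on_the_fly(x, surrogate, tol=0.05, unc_thresh=0.2,
                     step=0.05, max_steps=200):
    """Steepest-descent relaxation driven by the surrogate; DFT is queried
    only when the surrogate is uncertain, and every query is added to the
    training set and used to fine-tune the surrogate (the active learning step)."""
    n_dft = 0
    for _ in range(max_steps):
        f, unc = surrogate.predict(x)
        if unc > unc_thresh:          # surrogate unsure -> fall back to DFT
            _, f = dft_singlepoint(x)
            surrogate.data.append((x.copy(), f.copy()))
            surrogate.retrain()
            n_dft += 1
        if np.abs(f).max() < tol:     # force-convergence criterion
            break
        x = x + step * f              # descend along the forces
    return x, n_dft

x0 = np.array([1.5, -1.2, 0.8])
x_relaxed, n_dft = relax_on_the_fly(x0, EnsembleSurrogate())
print(f"relaxed positions: {x_relaxed}, DFT calls used: {n_dft}")
```

In this toy, the better the initial (pretrained) surrogate, the fewer DFT single points the relaxation needs, which is the intuition behind seeding the active learning loop with knowledge transferred from large-scale pretrained models.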