
(244c) CatTSunami: Accelerating Transition State Energy Calculations with Pre-Trained Graph Neural Networks

Authors 

Kitchin, J., Carnegie Mellon University
Zitnick, C. L., Facebook AI Research
Ulissi, Z., Carnegie Mellon University
Direct access to transition state energies at low cost unlocks the possibility of accelerating catalyst discovery. We show that the top-performing graph neural network potential trained on the OC20 dataset, a related but different task, is able to find transition states energetically similar (within 0.1 eV) to density functional theory (DFT) 91% of the time with a 28x speedup. This speaks to the generalizability of the models: having never been explicitly trained on reactions, the machine-learned potential approximates the potential energy surface well enough to perform well on this auxiliary task. We introduce the Open Catalyst 2024 Nudged Elastic Band (OC24NEB) dataset, comprising 932 DFT nudged elastic band calculations, to benchmark machine-learned model performance on transition state energies. To demonstrate the efficacy of this approach, we replicated a well-known, large reaction network with 61 intermediates and 174 dissociation reactions at DFT resolution (40 meV). In this case of dense enumeration, we realized even greater cost savings, using just 12 GPU days of compute where DFT would have taken 52 GPU years, a 1500x speedup. Similar searches for complete reaction networks could become routine using the approach presented here. Additionally, we replicated an ammonia synthesis activity volcano and systematically found lower-energy configurations of the transition states and intermediates on six stepped unary surfaces. This is a scalable approach that drives faster insights with a more complete treatment of configurational space, accelerating catalyst discovery.
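As a rough illustration of the workflow described above, the sketch below shows how a pre-trained machine-learned potential can stand in for DFT inside a climbing-image nudged elastic band relaxation using ASE. This is a minimal sketch, not the authors' actual pipeline: `MLCalculator` and its module are hypothetical placeholders for any ASE-compatible pre-trained GNN potential (e.g., one trained on OC20), and the file names, image count, and optimizer settings are illustrative.

```python
from ase.io import read
from ase.neb import NEB
from ase.optimize import BFGS

# Hypothetical import: substitute any ASE-compatible calculator
# wrapping a pre-trained GNN potential trained on OC20.
from my_ml_potential import MLCalculator

# Relaxed endpoints of the reaction (initial and final states)
initial = read("initial.traj")
final = read("final.traj")

# Build the band: endpoints plus linearly interpolated interior images
n_images = 8
images = [initial] + [initial.copy() for _ in range(n_images)] + [final]

neb = NEB(images, climb=True)  # climbing-image NEB targets the saddle point
neb.interpolate()

# Attach the ML potential so every force evaluation avoids a DFT SCF cycle;
# each image gets its own calculator instance, as ASE's NEB expects
for image in images:
    image.calc = MLCalculator()

# Relax the band; the highest-energy image approximates the transition state
BFGS(neb, trajectory="neb.traj").run(fmax=0.05)
```

The transition state energy is then read from the highest-energy image of the converged band; in the abstract's workflow, such ML-relaxed bands are benchmarked against full DFT NEB calculations, which is where the reported 0.1 eV agreement and 28x speedup come from.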
