(197bf) Transition State Searches on Neural Network Potential Energy Surfaces

Authors 

Gomes, J. S., Stanford University

Computational mechanistic studies of organic chemical reactions combine a model chemistry, which describes the potential energy surface on which chemical transformations take place, with algorithms for exploring that surface. The overall goal is to produce reaction pathways, including the relative energies of all minima and of the potential energy barriers that interconnect them. Secondary goals are to minimize the computer time required to identify these pathways and to minimize user intervention in the identification process. This is typically achieved by combining local, surface-walking optimization techniques with a nonlocal path-finding method such as the freezing string method.

Here, we describe the use of a neural network potential energy function together with nonlocal path-finding methods to guess transition state structures connecting two minima. The neural network potential is trained on the ANI-1 dataset of energies computed at the ωB97X/6-31G* level of theory for minimum-energy structures of small organic molecules and for off-equilibrium conformations obtained by normal mode sampling. We find that, although energies along reaction pathways lie outside this training distribution, the resulting model qualitatively describes the potential surface along the coordinates of bond-making and bond-breaking reactions.
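
As an illustration of how such energies and gradients can be obtained in practice, the sketch below evaluates an ANI-style potential and its Cartesian gradient by automatic differentiation using the open-source TorchANI package. This is a minimal stand-in, not the model used in this work, and the methane geometry is chosen only for demonstration.

```python
# A minimal sketch (not the authors' implementation): evaluate an ANI-style
# neural network potential and its Cartesian gradient via automatic
# differentiation, using the open-source TorchANI package as a stand-in.
import torch
import torchani

model = torchani.models.ANI1x(periodic_table_index=True)

# Illustrative tetrahedral methane geometry (Angstrom).
species = torch.tensor([[6, 1, 1, 1, 1]])  # atomic numbers: C, H, H, H, H
coordinates = torch.tensor([[[ 0.000,  0.000,  0.000],
                             [ 0.629,  0.629,  0.629],
                             [-0.629, -0.629,  0.629],
                             [-0.629,  0.629, -0.629],
                             [ 0.629, -0.629, -0.629]]],
                           requires_grad=True)

energy = model((species, coordinates)).energies             # Hartree
gradient, = torch.autograd.grad(energy.sum(), coordinates)  # Hartree / Angstrom
forces = -gradient

print(f"E = {energy.item():.6f} Ha")
print(forces.squeeze(0))
```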

We benchmark this methodology on a set of classic organic reactions and difficult cases. We show that, with some exceptions, transition state guess structures obtained with this method perform similarly to those obtained from electronic structure theory calculations when used to seed local surface-walking saddle-point searches, as measured by the number of electronic energy gradient calls required to locate the transition state. Significant computational cost savings are realized during the path-finding optimization by replacing electronic energy gradient calls with gradients of the neural network potential energy function obtained by automatic differentiation.
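
The path-based guess step can be sketched as follows, assuming a TorchANI-style model: neural network energies are evaluated over images interpolated between the reactant and product, and the highest-energy image is returned as the transition state guess used to seed a local saddle-point search. The helper name `ts_guess_from_path`, the linear interpolation, and the image count are illustrative simplifications of the freezing string procedure, and, as noted above, energies along such paths are outside the training distribution, so the guess is only qualitative.

```python
# Hedged sketch: select a transition state guess as the highest-energy image
# along an interpolated reactant -> product path evaluated with a neural
# network potential. The linear interpolation and the helper name
# ts_guess_from_path are illustrative simplifications of the nonlocal
# path-finding step (e.g., the freezing string method).
import torch
import torchani

def ts_guess_from_path(model, species, reactant, product, n_images=21):
    """Return (guess_coordinates, image_index) for the maximum-energy image.

    species:            (1, n_atoms) tensor of atomic numbers
    reactant, product:  (n_atoms, 3) Cartesian coordinates in Angstrom
    """
    alphas = torch.linspace(0.0, 1.0, n_images).view(-1, 1, 1)
    images = (1.0 - alphas) * reactant + alphas * product     # (n_images, n_atoms, 3)
    energies = model((species.repeat(n_images, 1), images)).energies
    i_max = int(torch.argmax(energies))
    return images[i_max], i_max

# Usage (illustrative): the endpoints would come from optimized minima, and the
# returned guess would seed a local surface-walking saddle-point search driven
# by electronic structure gradients.
# model = torchani.models.ANI1x(periodic_table_index=True)
# guess, i_max = ts_guess_from_path(model, species, reactant_xyz, product_xyz)
```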