(607e) Transformer-Based Hybrid Modeling and Control of Evolving, Nonlinear Processes
2024 AIChE Annual Meeting
Nanoscale Science and Engineering Forum
Machine Learning for Nanomaterials for Energy Applications
Wednesday, October 30, 2024 - 4:50pm to 5:10pm
Our hybrid modeling framework incorporates a recent innovation: attention-based time-series transformers (TSTs) coupled with positional encoding. This marks a pioneering application of the transformer algorithm, a cornerstone of ChatGPT's success, to nonlinear, time-varying processes. By analyzing data across both current and preceding time steps, the TST captures both immediate and historical changes in process states, providing contextual insight into process dynamics much as ChatGPT draws on textual context. The TST-based hybrid model identifies correlations between process parameters and state variables. Its versatility is evident in its ability to adapt to a spectrum of models, from density functional theory to computational fluid dynamics, and to scales ranging from laboratory setups to large industrial environments. We will present applications of this hybrid modeling and control architecture, showcasing its utility from laboratory studies to industrial processes, made possible through partnerships with leading chemical process enterprises.
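To make the architecture concrete, the following is a minimal sketch, in PyTorch, of the kind of attention-based time-series transformer with sinusoidal positional encoding described above: a window of past process inputs and states is embedded, tagged with positional information, passed through self-attention layers, and mapped to a one-step-ahead state prediction. All layer sizes, feature counts, and names (e.g., TimeSeriesTransformer, n_features, n_states) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a time-series transformer (TST) for process state prediction.
# Hypothetical dimensions and names; not the authors' actual model.
import math
import torch
import torch.nn as nn


class PositionalEncoding(nn.Module):
    """Adds fixed sinusoidal position information to each time step."""

    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(max_len, dtype=torch.float32).unsqueeze(1)
        div_term = torch.exp(
            torch.arange(0, d_model, 2, dtype=torch.float32)
            * (-math.log(10000.0) / d_model)
        )
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)  # (max_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]


class TimeSeriesTransformer(nn.Module):
    """Maps a window of past process inputs/states to the next state."""

    def __init__(self, n_features: int, n_states: int,
                 d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        self.pos_enc = PositionalEncoding(d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_states)

    def forward(self, window: torch.Tensor) -> torch.Tensor:
        # window: (batch, seq_len, n_features) of past manipulated inputs + measured states
        z = self.pos_enc(self.embed(window))
        z = self.encoder(z)          # self-attention over current and preceding time steps
        return self.head(z[:, -1])   # predict the next state from the final token


if __name__ == "__main__":
    model = TimeSeriesTransformer(n_features=6, n_states=3)
    past_window = torch.randn(8, 20, 6)  # batch of 8 trajectories, 20 past time steps
    next_state = model(past_window)      # shape: (8, 3)
    print(next_state.shape)
```

In a hybrid setting, a model of this form would typically supply a data-driven correction or surrogate alongside a first-principles component (e.g., a DFT- or CFD-derived model), with the attention weights providing the link between historical process behavior and the current prediction.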