(126d) Dow2Vec - How Text Embeddings Enhance NLP Models for Manufacturing | AIChE

Authors 

Dessauer, M. - Presenter, Dow Chemical
Over the past few years, several language embedding model architectures have been shared with the open-source community, achieving top accuracy across many natural language processing (NLP) benchmarks. One of these techniques, Bidirectional Encoder Representations from Transformers (BERT), can provide either out-of-the-box text features for downstream NLP tasks or a fine-tuned model trained on a smaller set of labelled data. This talk will give the audience an overview of how to apply this model to a manufacturing-specific document classification task and share accuracy results compared with traditional NLP feature extraction methods.
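To make the comparison concrete, the sketch below shows a traditional feature-extraction baseline of the kind the abstract contrasts BERT against: sparse TF-IDF vectors feeding a linear classifier. The documents, labels, and category names are illustrative assumptions, not Dow's actual data or pipeline; in the embedding-based approach described in the talk, the TF-IDF step would be replaced by dense vectors from a pretrained BERT model.

```python
# Hedged sketch of a traditional NLP document-classification pipeline
# (TF-IDF features + linear classifier). All documents and labels below
# are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy manufacturing-style maintenance notes (hypothetical)
docs = [
    "pump seal failure on line 3",
    "reactor temperature deviation during startup",
    "pump cavitation noise reported by operator",
    "exotherm alarm in reactor bay",
]
labels = [
    "rotating_equipment",
    "process",
    "rotating_equipment",
    "process",
]

# Traditional feature extraction: sparse bag-of-words TF-IDF vectors.
# A BERT-based variant would instead supply dense sentence embeddings
# as the classifier's input features.
baseline = make_pipeline(TfidfVectorizer(), LogisticRegression())
baseline.fit(docs, labels)

print(baseline.predict(["pump seal leak observed"])[0])
```

A fine-tuned BERT classifier, by contrast, updates the embedding weights themselves on the labelled task data rather than treating the text features as fixed.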