Multimodal NLP Systems in Healthcare


At Winterlight Labs, we develop novel multimodal AI technology that quickly and accurately quantifies speech and language patterns to help detect and monitor cognitive and mental health conditions, such as Alzheimer's disease.

With my colleagues at Winterlight, I analyze the value of the acoustic and linguistic modalities of speech, and we have proposed a framework that detects cognitive impairment by reaching agreement between the modalities. To tackle the interpretability problem of black-box deep neural language models, we use probing tasks to identify linguistic information that is important for detecting cognitive impairment but not well represented in the various layers of BERT. We also investigate how robust deep neural language models are at classifying cognitive diseases and associated mental health issues, since such an understanding is a crucial part of model evaluation and is necessary for the successful deployment of NLP models in the real world.
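The probing approach mentioned above can be illustrated with a minimal sketch: a simple linear classifier is trained on frozen per-layer representations, and layers where probe accuracy stays near chance are taken to represent the target linguistic property poorly. This is a hypothetical, self-contained illustration, not our actual pipeline; it uses synthetic features in place of real BERT hidden states (which one would normally extract with a pretrained encoder).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for per-layer encoder activations: in a real probing
# study these would be BERT hidden states extracted for each input.
rng = np.random.default_rng(0)
n_samples, n_layers, dim = 200, 4, 16
labels = rng.integers(0, 2, n_samples)  # binary linguistic property


def layer_features(layer: int) -> np.ndarray:
    """Simulate activations; here later layers carry a stronger signal."""
    signal = labels[:, None] * (layer / n_layers)
    return rng.normal(size=(n_samples, dim)) + signal


def probe_accuracy(features: np.ndarray, y: np.ndarray) -> float:
    """Cross-validated accuracy of a linear probe on frozen features."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, features, y, cv=5).mean()


# One probe per layer; near-chance accuracy at a layer suggests the
# property is not well represented there.
scores = [probe_accuracy(layer_features(l), labels) for l in range(n_layers)]
```

Because the probe itself is deliberately simple, differences in its accuracy across layers can be attributed to the representations rather than to the probe's capacity.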

Additional details are available in the presentations below:



Solving Real-life Challenges in Detecting Cognitive Diseases from Speech using ML, by Jekaterina Novikova, PhD

Multi-modal NLP Systems in Healthcare, by Jekaterina Novikova, PhD