AI Seminar

The Role of Context in Natural Language Processing

Soujanya Poria, Postdoc, School of Computer Science & Engineering, NTU, Singapore

Context is at the core of Natural Language Processing (NLP) research. Newly introduced contextual word embedding models like ELMo and BERT have obtained state-of-the-art results on several NLP tasks. As with the common use of static embeddings like GloVe, contextual word embeddings can be used to initialize the word vectors at the lowest layer of a downstream NLP model. The notion of context varies from problem to problem. For example, when computing word representations, the surrounding words carry contextual information; likewise, when classifying a sentence in a document, its neighboring sentences serve as its context. In this talk, I will present a number of approaches that leverage contextual information to tackle four NLP tasks: aspect-level sentiment analysis, utterance-level sentiment analysis in monologues, affective dialogue, and sarcasm detection. I will argue that awareness of context is a critical component for building NLP models that understand and generate language the way humans do, and I will outline short- and medium-term prospects for NLP research that leverages context at different levels.
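To make the "lowest layer" idea concrete, here is a minimal sketch, not taken from the talk, of extracting contextual token vectors from a pretrained BERT model and treating them as input features for a downstream task. It assumes the Hugging Face `transformers` and `torch` libraries and the `bert-base-uncased` checkpoint, none of which are named in the abstract.

```python
# Minimal sketch: contextual word embeddings as downstream input features.
# Assumes the Hugging Face `transformers` library (not part of the talk).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
bert.eval()  # frozen here: used only to produce word representations

sentence = "The battery life is great but the screen is dim."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = bert(**inputs)

# One context-dependent vector per (sub)word token. Unlike a static
# GloVe lookup, the same word receives different vectors in different
# sentences, because each vector depends on the surrounding words.
token_embeddings = outputs.last_hidden_state  # shape: (1, seq_len, 768)

# These vectors would then initialize the lowest layer of a
# task-specific model, e.g. a classifier for aspect-level sentiment.
print(token_embeddings.shape)
```

In this setup BERT is kept frozen and only supplies representations; fine-tuning the encoder end to end on the downstream task is an equally common alternative.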

Sponsored by

CSE