Contextualized Word Embeddings

Contextualized word embeddings are the latest development in word embedding research. They capture not only the semantic meaning of a word but also its context within the sentence. Some words have multiple meanings depending on the sentence they appear in: for example, the board game "go" vs. the verb "go". The embedding for "go" changes depending on the context in which it is used.

There are various methods to get contextualized word embeddings:

  1. BERT Embeddings
  2. ELMo Embeddings
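
As a minimal sketch of the idea, the snippet below uses the Hugging Face `transformers` library (an assumed dependency, not mentioned above) with the `bert-base-uncased` checkpoint to embed the word "go" in two different sentences. Because BERT conditions each token's vector on the whole sentence, the two vectors differ even though the surface word is identical:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for the first occurrence of `word`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        # last_hidden_state: (1, seq_len, 768) for bert-base-uncased
        hidden = model(**enc).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    return hidden[tokens.index(word)]

# Same word "go", two different senses.
v_game = embed_word("she plays go against a grandmaster", "go")
v_verb = embed_word("we go to the market every morning", "go")

cos = torch.nn.functional.cosine_similarity(v_game, v_verb, dim=0)
print(v_game.shape, cos.item())
```

A static embedding (e.g. word2vec) would assign both occurrences the same 768-dimensional vector; here the cosine similarity between the two "go" vectors is below 1, reflecting the different senses.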

Related Notes