Contextualized Word Embeddings
Contextualized word embeddings are the latest development in word embedding research (as of 2024). Rather than assigning a single fixed vector to each word, these embeddings also take into account the context in which the word appears in a sentence. Some words have multiple meanings depending on how they are used; for example, the board game "Go" vs. the verb "go". Which meaning applies depends entirely on the context in which "go" occurs.
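As a quick illustration of this context sensitivity, the sketch below compares the contextual vectors a pretrained model produces for "go" in different sentences. This is a minimal example assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; the sentences and the embedding_of helper are made up for illustration, not taken from any particular method described here.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence, word):
    # Hypothetical helper: return the contextual embedding of `word` in `sentence`.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, hidden_size)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

board_game = embedding_of("she plays go against a strong opponent", "go")
verb       = embedding_of("we go to the market every sunday", "go")
same_verb  = embedding_of("they go to school by bus", "go")

cos = torch.nn.functional.cosine_similarity
print(cos(board_game, verb, dim=0))  # expected to be lower: different senses of "go"
print(cos(verb, same_verb, dim=0))   # expected to be higher: same verb sense
```

A static embedding would return the identical vector for "go" in all three sentences; a contextualized model gives each occurrence its own vector shaped by the surrounding words.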
There are various methods to get contextualized word embeddings:
Analogy:
Think of each token as a person in a meeting. Initially, each person arrives with their own prepared statement (the static embedding). During the discussion (self-attention), everyone listens to the others (computes attention scores), weighs how much each contribution matters (the softmax weights), and then updates their own viewpoint (the contextual embedding) based on what they heard.
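To make the analogy concrete, here is a minimal NumPy sketch of single-head scaled dot-product self-attention. The matrices, dimensions, and random inputs are assumptions chosen for illustration; real models learn the projection weights and stack many such layers.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention.

    X          : (seq_len, d_model) static token embeddings ("prepared statements")
    Wq, Wk, Wv : (d_model, d_k) projection matrices
    Returns    : (seq_len, d_k) contextual embeddings ("updated viewpoints")
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # how much each token "listens" to the others
    weights = softmax(scores, axis=-1)       # softmax weights over the sequence
    return weights @ V                       # weighted mix of values = contextual embedding

# Toy example: 4 tokens with 8-dimensional embeddings and random weights
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
context = self_attention(X, Wq, Wk, Wv)
print(context.shape)  # (4, 8): one contextual vector per token
```

Each row of the output depends on every token in the sequence, which is exactly what makes the resulting embeddings "contextual" rather than fixed per word.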