Word2Vec Embedding

Intuition

Words that appear in similar contexts tend to have similar meanings. Word2Vec learns dense word vectors by training a shallow neural network either to predict a word from its surrounding context (CBOW) or to predict the context from a word (skip-gram); the trained weights become the embeddings.
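
As a minimal sketch of this idea (assuming the gensim library and an illustrative toy corpus), the following trains a skip-gram model and inspects the learned vectors:

```python
# Minimal sketch: train a skip-gram Word2Vec model on a toy corpus (gensim assumed).
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["dogs", "and", "cats", "are", "pets"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimension
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

print(model.wv["cat"].shape)                 # (50,)
print(model.wv.most_similar("cat", topn=3))  # nearest neighbors by cosine similarity
```
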
Issues:

  1. Single vector per word
    1. Homonyms and polysemous words (e.g. "bank") are represented with the same vector; see the sketch after this list
  2. Not contextualized
  3. Context window limitation
    1. captures only local co-occurrence information rather than global corpus statistics
  4. Out-of-vocabulary (OOV) words have no vector (see the sketch after this list)
  5. Phrase representation: multi-word expressions are not modeled directly
  6. The large vocabulary makes the softmax layer very expensive: the model must compute a probability for every word in the vocabulary even though there is only one target
    1. This is mitigated by hierarchical softmax (and, alternatively, by negative sampling)
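
A short sketch of issues 1, 4, and 6 (gensim assumed; the corpus and words are illustrative): "bank" gets one vector regardless of sense, an unseen word has no vector at all, and the hs/negative parameters choose between hierarchical softmax and negative sampling for the output layer:

```python
# Sketch of issues 1, 4, and 6 (illustrative corpus; gensim assumed).
from gensim.models import Word2Vec

sentences = [
    ["she", "deposited", "cash", "at", "the", "bank"],
    ["they", "fished", "from", "the", "river", "bank"],
]

# hs=1 enables hierarchical softmax (negative=0 turns off negative
# sampling); the gensim default is hs=0 with negative sampling.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, hs=1, negative=0)

# Issue 1: a single vector for "bank", whether financial or river sense.
print(model.wv["bank"][:5])

# Issue 4: out-of-vocabulary lookup fails.
try:
    model.wv["riverbank"]
except KeyError:
    print("OOV word has no vector")
```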

References

  1. https://jalammar.github.io/illustrated-word2vec/
  2. https://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
  3. https://aman.ai/primers/ai/word-vectors/#count-based-techniques-tf-idf-and-bm25

Related Notes