Bag of Words

Bag of words is a count-based word representation where each sentence/document is represented by the frequencies of its words. The dimension of the embedding is the total number of words in the vocabulary, and the value at each position is the frequency of that word in the specific sentence/document.

It's a variation of the one-hot vector, but stores the count instead of just 0/1.
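A minimal sketch of the idea above, using a hypothetical `bag_of_words` helper: build a shared vocabulary from all documents, then map each document to a vector of word frequencies.

```python
from collections import Counter

def bag_of_words(documents):
    """Return a shared vocabulary and one count-vector per document."""
    # Vocabulary: every unique word across all documents, sorted for a stable order
    vocab = sorted({word for doc in documents for word in doc.lower().split()})
    vectors = []
    for doc in documents:
        counts = Counter(doc.lower().split())
        # One position per vocabulary word; value = frequency in this document
        vectors.append([counts[word] for word in vocab])
    return vocab, vectors

docs = ["the cat sat on the mat", "the dog sat"]
vocab, vectors = bag_of_words(docs)
print(vocab)    # ['cat', 'dog', 'mat', 'on', 'sat', 'the']
print(vectors)  # [[1, 0, 1, 1, 1, 2], [0, 1, 0, 0, 1, 1]]
```

Note how "the" appears twice in the first document, so its position holds 2 rather than the 1 a one-hot scheme would give. In practice a library implementation such as scikit-learn's `CountVectorizer` does the same thing with proper tokenization.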


Related Notes