Continuous Bag of Words
Continuous bag of words (CBOW) is another variation of Word2Vec embedding training, where given the surrounding context the model has to predict the current word.
- Unlike the Skip-Gram model, which predicts the context from the current word, CBOW predicts the current word based on its neighboring words.
- For the sentence "a brown fox went to home" and a window size of 2:
- for the word "brown", the training pair is context (a, fox, went) with target (brown)
- for the word "fox", the training pair is context (a, brown, went, to) with target (fox)
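The context/target pairs above can be generated with a short sketch like the following (a minimal illustration; the function name `cbow_pairs` is just for this example, not part of any library):

```python
def cbow_pairs(sentence, window=2):
    """Build CBOW (context, target) training pairs from a sentence."""
    tokens = sentence.split()
    pairs = []
    for i, target in enumerate(tokens):
        # Collect up to `window` words on each side of the target word.
        context = [
            tokens[j]
            for j in range(max(0, i - window), min(len(tokens), i + window + 1))
            if j != i
        ]
        pairs.append((context, target))
    return pairs

pairs = cbow_pairs("a brown fox went to home", window=2)
print(pairs[1])  # (['a', 'fox', 'went'], 'brown')
print(pairs[2])  # (['a', 'brown', 'went', 'to'], 'fox')
```

In actual CBOW training, the context words' embeddings are averaged (or summed) and fed into a softmax layer that predicts the target word.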