- Contrastive learning can be used in supervised, unsupervised (self-supervised), and semi-supervised settings
- Its main advantage is that it can learn useful representations from unlabeled data
- In contrastive learning, each training example consists of an anchor, a positive, and one or more negatives (often the other samples in the batch)
- The main goal is to pull the anchor closer to the positive in embedding space
- And to push the anchor away from the negative(s)
- In the process, the model learns representations for the anchor, the positive, and the negative(s), as sketched below
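
A minimal PyTorch sketch of the anchor/positive/negative setup. The encoder architecture, input sizes, and variable names are illustrative assumptions, not from these notes; the point is that one shared encoder embeds all three roles, and the objective raises the anchor-positive similarity while lowering the anchor-negative similarities:

```python
import torch
import torch.nn.functional as F

# Hypothetical shared encoder mapping inputs to one embedding space.
encoder = torch.nn.Sequential(
    torch.nn.Linear(32, 16), torch.nn.ReLU(), torch.nn.Linear(16, 8)
)

anchor_x   = torch.randn(1, 32)   # anchor sample
positive_x = torch.randn(1, 32)   # e.g. an augmented view of the anchor
negative_x = torch.randn(4, 32)   # other samples in the batch act as negatives

# The same encoder produces representations for all three roles.
z_a = F.normalize(encoder(anchor_x), dim=-1)
z_p = F.normalize(encoder(positive_x), dim=-1)
z_n = F.normalize(encoder(negative_x), dim=-1)

sim_pos = (z_a * z_p).sum(-1)     # training would push this similarity up
sim_neg = z_a @ z_n.T             # ...and push these similarities down
```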
Things to remember:
- Batch size is a very important hyperparameter in contrastive learning: larger batches are generally better because they supply more, and more diverse, in-batch negatives
- We want hard negatives (truly different samples that sit close to the anchor in embedding space), not false negatives (samples that actually match the anchor but get treated as negatives); see the mining sketch below
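
A hedged sketch of the hard-negative vs. false-negative distinction. The helper below is hypothetical and assumes a supervised setting where labels are available; masking out same-label samples removes false negatives, and taking the most similar remaining sample per anchor yields a hard negative:

```python
import torch
import torch.nn.functional as F

def hardest_negative_indices(embeddings: torch.Tensor,
                             labels: torch.Tensor) -> torch.Tensor:
    """For each anchor, return the index of the most similar sample
    that has a *different* label (hypothetical mining helper)."""
    z = F.normalize(embeddings, dim=-1)
    sim = z @ z.T                                  # pairwise cosine similarities
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    sim = sim.masked_fill(same, float("-inf"))     # drop self and same-class pairs
    return sim.argmax(dim=1)                       # hardest valid negative per anchor
```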
Common loss functions:
- Standard Contrastive Loss: pairwise; pulls positive pairs together and pushes negative pairs apart beyond a margin
- Triplet Loss: operates on (anchor, positive, negative) triplets with a margin
- InfoNCE Loss: softmax cross-entropy over one positive and many (in-batch) negatives; used in SimCLR, MoCo, and CLIP
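
A minimal sketch of two of these losses in PyTorch: a small InfoNCE implementation using in-batch negatives, plus PyTorch's built-in `torch.nn.TripletMarginLoss` for the triplet case. The temperature, margin, and tensor shapes here are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def info_nce(anchor: torch.Tensor, positive: torch.Tensor,
             temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE with in-batch negatives: row i of `positive` is the positive
    for row i of `anchor`; every other row in the batch acts as a negative."""
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    logits = a @ p.T / temperature            # (B, B) similarity matrix
    targets = torch.arange(a.size(0))         # diagonal pairs are the positives
    return F.cross_entropy(logits, targets)

# Triplet loss ships with PyTorch directly:
triplet_loss = torch.nn.TripletMarginLoss(margin=1.0)

# Usage with batches of embedding vectors:
B, D = 8, 16
za, zp, zn = torch.randn(B, D), torch.randn(B, D), torch.randn(B, D)
print(info_nce(za, zp))                       # scalar InfoNCE loss
print(triplet_loss(za, zp, zn))               # scalar triplet loss
```

Note how InfoNCE reuses every other sample in the batch as a negative, which is why larger batch sizes help (see the note above).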