Entropy
- Entropy is the Expected Value of Surprise
- The lower the probability of an outcome,
- the greater the surprise (surprise = log2(1/p)),
- and averaging the surprise over all outcomes, weighted by probability, gives the entropy (see the sketch below)
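A minimal Python sketch of surprise for a single outcome; the function name `surprise` and the example probabilities are illustrative choices, not from the original notes:

```python
import math

def surprise(p):
    """Self-information of an outcome with probability p, in bits: log2(1/p)."""
    return math.log2(1 / p)

# Lower probability -> greater surprise
print(surprise(0.9))  # ~0.15 bits: a likely outcome is barely surprising
print(surprise(0.1))  # ~3.32 bits: an unlikely outcome is very surprising
```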
Entropy Formula
- Used a base-2 log, as there are 2 classes: H = -p log2(p) - (1 - p) log2(1 - p)
- For N classes, we will use a base-N log: H = -Σᵢ pᵢ logN(pᵢ), which keeps the maximum entropy at 1
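A short Python sketch of this formula; the helper name `entropy` and its `base` parameter (defaulting to 2 for the binary case) are assumptions for illustration:

```python
import math

def entropy(probs, base=2):
    """Entropy as expected surprise: H = -sum(p_i * log_base(p_i))."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# 2 classes with a base-2 log: maximum entropy is 1 bit
print(entropy([0.5, 0.5]))          # 1.0

# N classes with a base-N log: maximum entropy is normalized to 1
print(entropy([0.25] * 4, base=4))  # 1.0
```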
How can entropy be used?
Entropy is used to quantify how similar or how different the class probabilities are
- Low entropy means a large difference in the class probabilities, e.g., 0.01 class A, 0.99 class B
- so Low Entropy means High Confidence
- High entropy means a small difference in the class probabilities, e.g., 0.49 class A, 0.51 class B
- so High Entropy means Low Confidence (compare the two in the sketch below)
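To make the contrast concrete, a self-contained Python check of the two distributions above (the `entropy` helper mirrors the sketch in the formula section):

```python
import math

def entropy(probs):
    """Entropy in bits: -sum(p * log2(p)) over the class probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.01, 0.99]))  # ~0.08 bits: low entropy -> high confidence
print(entropy([0.49, 0.51]))  # ~1.00 bits: high entropy -> low confidence
```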