Entropy

$$E(\text{surprise}) = \sum_x P(x)\log_2\frac{1}{P(x)} = \sum_x P(x)\left(\log_2 1 - \log_2 P(x)\right) = \sum_x P(x)\left(0 - \log_2 P(x)\right) = -\sum_x P(x)\log_2 P(x) = \text{Entropy}$$
Entropy Formula

$$\text{Entropy} = -\sum_x P(x)\log_2 P(x)$$
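The formula translates directly into code. A minimal sketch using only Python's standard `math` module (the function name `entropy` is my own choice):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(entropy([0.5, 0.5]))  # → 1.0
```

Zero probabilities are skipped because the term p·log₂(p) tends to 0 as p → 0, while `math.log2(0)` would raise an error.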

How can entropy be used?

Entropy quantifies the uncertainty (impurity) of a probability distribution over classes

  • Low entropy means a large difference in the class probabilities, e.g., 0.01 class A, 0.99 class B
    • so Low Entropy means High Confidence
  • High entropy means a small difference in the class probabilities, e.g., 0.49 class A, 0.51 class B
    • so High Entropy means Low Confidence
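The two distributions in the bullets above can be compared numerically (a minimal sketch; `entropy` is a hand-rolled helper, not a library function):

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

confident = entropy([0.01, 0.99])  # skewed distribution -> low entropy
uncertain = entropy([0.49, 0.51])  # near-uniform distribution -> high entropy
print(f"confident: {confident:.4f} bits, uncertain: {uncertain:.4f} bits")
```

The skewed distribution comes out near 0.08 bits, while the near-uniform one is almost at the 1-bit maximum for two classes.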

Related Notes