Entropy

$$E(\text{surprise}) = \sum_x \log_2\frac{1}{P(x)} \cdot P(x) = \sum_x P(x)\left(\log_2 1 - \log_2 P(x)\right) = \sum_x P(x)\left(0 - \log_2 P(x)\right) = -\sum_x P(x)\log_2 P(x) = \text{Entropy}$$
Entropy Formula

$$\text{Entropy} = -\sum_x P(x)\log_2 P(x)$$
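The formula above can be sketched directly in code. This is a minimal illustration (the function name `entropy` is mine, not from the note), computing Shannon entropy in bits over a probability distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(entropy([0.5, 0.5]))  # → 1.0

# A certain outcome carries no surprise at all.
print(entropy([1.0]))  # → 0.0
```

Zero-probability terms are skipped because $0 \cdot \log_2 0$ is taken as 0 by convention.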

How can entropy be used?

Entropy can be used to quantify how balanced or imbalanced the distribution of classes is

  • Low entropy means a highly imbalanced class distribution, e.g., 1 sample of class A vs. 100 of class B
  • High entropy means a nearly balanced class distribution, e.g., 49 of class A vs. 51 of class B
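The two bullet examples can be checked numerically. This sketch (the helper name `entropy_from_counts` is an assumption, not from the note) converts raw class counts into probabilities and applies the entropy formula:

```python
import math

def entropy_from_counts(counts):
    """Entropy in bits of the empirical class distribution given raw counts."""
    total = sum(counts)
    probs = [c / total for c in counts]
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Imbalanced split (1 class A vs. 100 class B) → low entropy
print(entropy_from_counts([1, 100]))  # ≈ 0.080 bits
# Near-balanced split (49 class A vs. 51 class B) → high entropy, near the 1-bit maximum
print(entropy_from_counts([49, 51]))  # ≈ 0.9997 bits
```

For two classes, entropy peaks at 1 bit when the split is exactly 50/50 and falls toward 0 as the split becomes more lopsided.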

Related Notes