
Entropy

As a general term, entropy refers to the level of disorder in a system.

The entropy of a random variable is the average level of information (equivalently, uncertainty) inherent in its possible outcomes:

entropy = uncertainty = randomness = average level of information
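In symbols, this is Shannon's entropy: the expected information content of an outcome, where the base-2 logarithm measures information in bits:

```latex
H(X) = -\sum_{x} p(x) \log_2 p(x)
```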

One use of entropy in Machine Learning is in Cross Entropy Loss.
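As a rough sketch of that idea (not any particular library's implementation), cross entropy penalizes a predicted distribution q for diverging from a true distribution p; the eps guard against log(0) here is an illustrative choice:

```python
import math

def cross_entropy(true_probs, pred_probs, eps=1e-12):
    """H(p, q) = -sum over x of p(x) * log(q(x)), in nats."""
    return -sum(p * math.log(q + eps)
                for p, q in zip(true_probs, pred_probs))

# One-hot target (class 1 is correct) vs. a model's predicted distribution.
target = [0.0, 1.0, 0.0]
prediction = [0.1, 0.7, 0.2]
print(cross_entropy(target, prediction))  # ~0.357 nats
```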

An Example

As an example, consider tossing a fair coin two times. The tosses are independent, and there are four mutually exclusive outcomes: HH, HT, TH, and TT.

Each toss of the coin can be represented by one binary digit (bit) of information, with two possible values:

  • 1: heads

  • 0: tails

Each fair toss carries one bit of entropy, and the entropies of independent events add, so two tosses of the coin carry two bits of information, i.e., two bits of entropy. Equivalently, with four equally likely outcomes, H = -4 · (1/4) · log₂(1/4) = 2 bits.
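A minimal sketch confirming this arithmetic (the entropy helper is hypothetical, not from any library):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One fair toss: two outcomes, each with probability 1/2 -> 1 bit.
print(entropy([0.5, 0.5]))                # 1.0

# Two independent fair tosses: four outcomes, each 1/4 -> 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```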
