Entropy
General
The term entropy is used in several fields of science, including thermodynamics, statistical mechanics, and information theory.
Information theory
In information theory, a message is sent from a transmitter to a receiver. The entropy of this message is the expected (average) amount of information it contains. The information content, in contrast, is the actual amount of information a particular message carries. The more probable an outcome is, the lower its information content; a source whose outcomes are highly predictable therefore has low entropy. In a fair coin toss the outcome cannot be predicted, so the entropy of the result is high (one bit). When using a coin with heads on both sides, the entropy of the outcome is zero, since the result will always be heads (example from Wikipedia).
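To make the coin example concrete, here is a minimal Python sketch (the function name `shannon_entropy` is our own, not from the project) that computes the Shannon entropy H(X) = -Σ p(x) · log2 p(x) of a probability distribution, measured in bits:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H(X) = -sum over x of p(x) * log2(p(x)), in bits.

    Outcomes with probability 0 are skipped (convention: 0 * log2 0 = 0).
    """
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# Fair coin toss: both outcomes equally likely -> maximal uncertainty.
print(shannon_entropy([0.5, 0.5]))   # 1 bit

# Coin with heads on both sides: the outcome is certain -> no uncertainty.
print(shannon_entropy([1.0, 0.0]))   # 0 bits

# Biased coin: mostly predictable -> entropy between 0 and 1.
print(shannon_entropy([0.9, 0.1]))   # about 0.47 bits
```

The biased-coin case illustrates the relationship described above: as one outcome becomes more probable, the average surprise per toss, and thus the entropy, decreases toward zero.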