Complexity Explorer, Santa Fe Institute

entropy

Entropy, in the thermodynamic sense, is a measure of a system's disorder; isolated systems tend to evolve from more ordered states to less ordered states, so their entropy tends to increase. In Boltzmann's statistical mechanics, "order" and "disorder", and hence entropy, are defined by the number of possible microstates consistent with a given macrostate. In information theory, Shannon entropy and Hartley entropy measure how the discrete states of a system are distributed; a uniform distribution has maximum entropy. Shannon entropy takes the frequency of each state into account, while Hartley entropy ignores frequency and counts only which states are present (out of all possible states).
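To make the two information-theoretic definitions concrete, here is a minimal Python sketch (the function names and example counts are illustrative, not taken from Complexity Explorer). Shannon entropy is H = -sum(p_i * log2(p_i)) over the observed frequencies p_i, while Hartley entropy is log2 of the number of distinct states that occur; as the example shows, a uniform distribution maximizes the Shannon measure, while the Hartley measure is unchanged as long as the same states are present.

```python
import math

def shannon_entropy(counts):
    # Shannon entropy in bits: H = -sum(p_i * log2(p_i)),
    # where p_i is the observed frequency of state i.
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def hartley_entropy(counts):
    # Hartley entropy in bits: log2 of the number of distinct states
    # that occur at all, ignoring how often each occurs.
    present = sum(1 for c in counts if c > 0)
    return math.log2(present) if present > 0 else 0.0

# A uniform distribution over 4 states maximizes Shannon entropy (2 bits).
# A skewed distribution over the same 4 states has lower Shannon entropy
# but the same Hartley entropy, since the same states are present.
uniform = [25, 25, 25, 25]
skewed = [97, 1, 1, 1]
print(shannon_entropy(uniform), hartley_entropy(uniform))  # 2.0   2.0
print(shannon_entropy(skewed), hartley_entropy(skewed))    # ~0.24 2.0
```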


Topics: Information Theory, Entropy
Difficulty: 1