A notion of entropy.
In the context of probability theory, the max-entropy or Hartley entropy of a probability distribution on a finite set is the logarithm of the number of outcomes with non-zero probability.
Correspondingly, in quantum probability theory the max-entropy of a state is the logarithm of the rank of its density matrix.
The ordinary Shannon entropy of a probability distribution is never greater than the max-entropy.
The Hartley entropy of a finite probability distribution $(p_i)_{i = 1}^n$ is the special case of the Rényi entropy for $\alpha = 0$.
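Explicitly, the Rényi entropy of order $\alpha \neq 1$ is

$$
  H_\alpha\big((p_i)_{i=1}^n\big)
  \;=\;
  \frac{1}{1 - \alpha}
  \log
  \left(
    \sum_{i : p_i \gt 0} p_i^{\alpha}
  \right)
  \,,
$$

and setting $\alpha = 0$ makes every summand equal to $1$, so that

$$
  H_0\big((p_i)_{i=1}^n\big)
  \;=\;
  \log \,\#\big\{ i \,\big\vert\, p_i \gt 0 \big\}
  \,,
$$

the logarithm of the number of outcomes with non-zero probability, as above.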
| order | $\phantom{\to} 0$ | | $\to 1$ | | $\phantom{\to} 2$ | | $\to \infty$ |
|---|---|---|---|---|---|---|---|
| Rényi entropy | Hartley entropy | $\geq$ | Shannon entropy | $\geq$ | collision entropy | $\geq$ | min-entropy |
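The chain of inequalities in the table can be checked numerically. The following is a minimal sketch (the function name `renyi_entropy` and the sample distribution are illustrative choices, not from the original page), using the natural logarithm throughout:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha of a finite probability distribution p."""
    p = [x for x in p if x > 0]          # restrict to the support
    if alpha == 0:                        # Hartley / max-entropy: log of support size
        return math.log(len(p))
    if alpha == 1:                        # limit alpha -> 1: Shannon entropy
        return -sum(x * math.log(x) for x in p)
    if alpha == math.inf:                 # limit alpha -> infinity: min-entropy
        return -math.log(max(p))
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
hartley   = renyi_entropy(p, 0)
shannon   = renyi_entropy(p, 1)
collision = renyi_entropy(p, 2)
minent    = renyi_entropy(p, math.inf)

# Hartley >= Shannon >= collision >= min-entropy
assert hartley >= shannon >= collision >= minent
```

For this distribution the Hartley entropy is $\log 4$, strictly larger than the Shannon entropy, since not all outcomes are equiprobable.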
Review:
Scholarpedia, Quantum entropies – Hartley entropy
J. Aczél, B. Forte and C. T. Ng, Advances in Applied Probability, Vol. 6, No. 1 (Mar., 1974), pp. 131-146 (jstor:1426210)
Last revised on June 1, 2021 at 10:47:20.