# nLab max-entropy


# Contents

## Idea

A notion of entropy.

In the context of probability theory, the max-entropy or Hartley entropy of a probability distribution on a finite set is the logarithm of the number of outcomes with non-zero probability.
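For illustration, this definition can be sketched directly in code (an informal sketch, not part of the entry proper; the function name `max_entropy` is ad hoc):

```python
import math

def max_entropy(p):
    """Hartley/max-entropy: logarithm of the number of outcomes
    with non-zero probability (the size of the support)."""
    support_size = sum(1 for pi in p if pi > 0)
    return math.log(support_size)

# A distribution on 4 outcomes, one of which has probability zero;
# only the 3 outcomes in the support count:
p = [0.5, 0.25, 0.25, 0.0]
print(max_entropy(p))  # log(3) ≈ 1.0986
```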

In quantum probability theory this means that the max-entropy is the logarithm of the rank of the density matrix.
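A minimal numerical sketch of the quantum case (not part of the entry; the function name is ad hoc):

```python
import numpy as np

def quantum_max_entropy(rho):
    """Max-entropy of a density matrix: log of its rank."""
    rank = np.linalg.matrix_rank(rho)
    return np.log(rank)

# A rank-2 density matrix on a 3-dimensional system:
rho = np.diag([0.7, 0.3, 0.0])
print(quantum_max_entropy(rho))  # log(2) ≈ 0.6931
```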

## Properties

### Relation to Shannon entropy

The ordinary Shannon entropy of a probability distribution is never greater than the max-entropy, with equality precisely when the distribution is uniform on its support.
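This inequality is easy to check numerically (an informal sketch, not part of the entry; the helper names are ad hoc):

```python
import math

def shannon_entropy(p):
    """Shannon entropy: -sum p_i log p_i over the support."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def max_entropy(p):
    """Hartley/max-entropy: log of the support size."""
    return math.log(sum(1 for pi in p if pi > 0))

p = [0.9, 0.05, 0.05]
assert shannon_entropy(p) <= max_entropy(p)

# Equality for the uniform distribution on the support:
u = [1/3, 1/3, 1/3]
assert abs(shannon_entropy(u) - max_entropy(u)) < 1e-12
```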

### Relation to Rényi entropy

The Hartley entropy of a finite probability distribution $(p_i)_{i = 1}^n$ is the special case of Rényi entropy

$S_\alpha(p) \;=\; \frac{1}{1 - \alpha} \ln \left( \sum_{i = 1}^{n} (p_i)^\alpha \right)$

for $\alpha = 0$.
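This special case can be checked numerically by evaluating the Rényi entropy at a small $\alpha$ (an informal sketch, not part of the entry; the function name is ad hoc):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy S_alpha = (1/(1-alpha)) * ln(sum_i p_i^alpha),
    for alpha != 1, summing over the support."""
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
# As alpha -> 0 this tends to the log of the support size,
# i.e. the Hartley entropy:
print(renyi_entropy(p, 1e-9), math.log(3))  # both ≈ 1.0986
```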

| order | $\phantom{\to} 0$ | | $\to 1$ | | $\phantom{\to} 2$ | | $\to \infty$ |
|---|---|---|---|---|---|---|---|
| Rényi entropy | Hartley entropy | $\geq$ | Shannon entropy | $\geq$ | collision entropy | $\geq$ | min-entropy |


Last revised on June 1, 2021 at 10:47:20.