
Mathlib.Analysis.SpecialFunctions.BinaryEntropy

Properties of Shannon q-ary entropy and binary entropy functions #

The binary entropy function binEntropy p := - p * log p - (1 - p) * log (1 - p) is the Shannon entropy of a Bernoulli random variable with success probability p.

More generally, the q-ary entropy function is the Shannon entropy of the random variable with possible outcomes {1, ..., q}, where outcome 1 has probability 1 - p and all other outcomes are equally likely.

qaryEntropy (q : ℕ) (p : ℝ) := p * log (q - 1) - p * log p - (1 - p) * log (1 - p)
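
To see that this formula is the Shannon entropy of the distribution described above (outcome 1 with probability 1 - p, each of the other q - 1 outcomes with probability p / (q - 1)), one can expand the definition of Shannon entropy; this is a sketch of the computation, not a statement from this file:

- (1 - p) * log (1 - p) - (q - 1) * (p / (q - 1)) * log (p / (q - 1))
  = - (1 - p) * log (1 - p) - p * (log p - log (q - 1))
  = p * log (q - 1) - p * log p - (1 - p) * log (1 - p)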

This file assumes that entropy is measured in Nats, hence the use of natural logarithms. Most lemmas are also valid using a logarithm in a different base.
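
For example, to convert to bits one divides by log 2; writing logb 2 for Mathlib's base-2 logarithm Real.logb, the corresponding identity (an illustration, not a lemma of this file) is:

binEntropy p / log 2 = - p * logb 2 p - (1 - p) * logb 2 (1 - p)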

Main declarations #

Real.binEntropy: the binary entropy function.
Real.qaryEntropy: the q-ary entropy function.

Main results #

Real.qaryEntropy_strictMonoOn: the q-ary entropy function is strictly increasing on [0, 1 - q⁻¹].
Real.qaryEntropy_strictAntiOn: the q-ary entropy function is strictly decreasing on [1 - q⁻¹, 1].

The functions are also defined outside the interval Icc 0 1 due to log x = log |x|.
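
As a concrete illustration of such a junk value (not a lemma of this file), evaluating at p = -1 gives

binEntropy (-1) = -(-1) * log (-1) - 2 * log 2 = 0 - 2 * log 2 < 0,

since log (-1) = log |-1| = log 1 = 0.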

Tags #

entropy, Shannon, binary, nit, nepit

Binary entropy #

noncomputable def Real.binEntropy (p : ℝ) : ℝ

The binary entropy function binEntropy p := - p * log p - (1-p) * log (1 - p) is the Shannon entropy of a Bernoulli random variable with success probability p.

Equations
Instances For
    theorem Real.binEntropy_eq_negMulLog_add_negMulLog_one_sub' :
    Real.binEntropy = fun (p : ℝ) => p.negMulLog + (1 - p).negMulLog
    @[simp]

    binEntropy is symmetric about 1/2.
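
    The symmetry can be read off from the definition, since replacing p by 1 - p merely swaps the two summands; a sketch of the computation, not the Lean proof:

    binEntropy (1 - p) = -(1 - p) * log (1 - p) - p * log p = binEntropy p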

    theorem Real.binEntropy_pos {p : ℝ} (hp₀ : 0 < p) (hp₁ : p < 1) :
    0 < Real.binEntropy p
    theorem Real.binEntropy_nonneg {p : ℝ} (hp₀ : 0 ≤ p) (hp₁ : p ≤ 1) :
    0 ≤ Real.binEntropy p
    theorem Real.binEntropy_neg_of_neg {p : ℝ} (hp : p < 0) :
    Real.binEntropy p < 0

    Outside the usual range of binEntropy, it is negative. This is due to log p = log |p|.

    theorem Real.binEntropy_neg_of_one_lt {p : ℝ} (hp : 1 < p) :
    Real.binEntropy p < 0

    Outside the usual range of binEntropy, it is negative. This is due to log p = log |p|.

    For probability p ≠ 0.5, binEntropy p < log 2.
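
    The bound log 2 is attained at p = 1/2, as a direct evaluation shows (a sketch; no particular Mathlib lemma name is assumed here):

    binEntropy (1/2) = -(1/2) * log (1/2) - (1/2) * log (1/2) = -log (1/2) = log 2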

    Binary entropy is continuous everywhere. This is due to the definition of Real.log for negative numbers.
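
    Continuity also holds at the endpoints 0 and 1, where the formula involves log 0: each summand has the form x * log x up to sign and substitution, and (a sketch of the relevant limit)

    x * log x → 0 as x → 0⁺,

    which agrees with the junk value 0 * log 0 = 0.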

    Binary entropy has derivative log (1 - p) - log p. It's not differentiable at 0 or 1 but the junk values of deriv and log coincide there.
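
    For 0 < p < 1 this derivative follows from a routine computation (a sketch, not the Mathlib proof):

    d/dp (-p * log p - (1 - p) * log (1 - p))
      = -(log p + 1) + (log (1 - p) + 1)
      = log (1 - p) - log p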

    q-ary entropy #

    noncomputable def Real.qaryEntropy (q : ℕ) (p : ℝ) : ℝ

    Shannon q-ary Entropy function (measured in Nats, i.e., using natural logs).

    It's the Shannon entropy of a random variable with possible outcomes {1, ..., q} where outcome 1 has probability 1 - p and all other outcomes are equally likely.

    The usual domain of definition is p ∈ [0,1], i.e., input is a probability.

    This is a generalization of the binary entropy function binEntropy.
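
    Two instructive special cases, obtained by plugging into the defining formula (sketched computations, not necessarily stated as lemmas in this file): q = 2 recovers binEntropy because log (2 - 1) = 0, and p = 1 - q⁻¹ gives the value log q, the maximum on [0, 1] by the monotonicity results below.

    qaryEntropy 2 p = p * log 1 - p * log p - (1 - p) * log (1 - p) = binEntropy p

    qaryEntropy q (1 - q⁻¹)
      = (1 - q⁻¹) * log (q - 1) - (1 - q⁻¹) * log ((q - 1) / q) - q⁻¹ * log q⁻¹
      = (1 - q⁻¹) * log q + q⁻¹ * log q
      = log q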

    Equations
    Instances For
      @[simp]
      @[simp]
      theorem Real.qaryEntropy_one (q : ℕ) :
      Real.qaryEntropy q 1 = Real.log (q - 1)
      theorem Real.qaryEntropy_pos {q : ℕ} {p : ℝ} (hp₀ : 0 < p) (hp₁ : p < 1) :
      0 < Real.qaryEntropy q p
      theorem Real.qaryEntropy_nonneg {q : ℕ} {p : ℝ} (hp₀ : 0 ≤ p) (hp₁ : p ≤ 1) :
      0 ≤ Real.qaryEntropy q p
      theorem Real.qaryEntropy_neg_of_neg {q : ℕ} {p : ℝ} (hp : p < 0) :
      Real.qaryEntropy q p < 0

      Outside the usual range of qaryEntropy, it is negative. This is due to log p = log |p|.

      theorem Real.qaryEntropy_nonpos_of_nonpos {q : ℕ} {p : ℝ} (hp : p ≤ 0) :
      Real.qaryEntropy q p ≤ 0

      Outside the usual range of qaryEntropy, it is nonpositive. This is due to log p = log |p|.

      The q-ary entropy function is continuous everywhere. This is due to the definition of Real.log for negative numbers.

      theorem Real.differentiableAt_qaryEntropy {q : ℕ} {p : ℝ} (hp₀ : p ≠ 0) (hp₁ : p ≠ 1) :
      DifferentiableAt ℝ (Real.qaryEntropy q) p
      theorem Real.deriv_qaryEntropy {q : ℕ} {p : ℝ} (hp₀ : p ≠ 0) (hp₁ : p ≠ 1) :
      theorem Real.hasDerivAt_binEntropy {p : ℝ} (hp₀ : p ≠ 0) (hp₁ : p ≠ 1) :
      HasDerivAt Real.binEntropy (Real.log (1 - p) - Real.log p) p

      Binary entropy has derivative log (1 - p) - log p.

      theorem Real.hasDerivAt_qaryEntropy {q : ℕ} {p : ℝ} (hp₀ : p ≠ 0) (hp₁ : p ≠ 1) :
      theorem Real.deriv2_qaryEntropy {q : ℕ} {p : ℝ} :
      deriv^[2] (Real.qaryEntropy q) p = -1 / (p * (1 - p))

      Second derivative of q-ary entropy.

      theorem Real.deriv2_binEntropy {p : ℝ} :
      deriv^[2] Real.binEntropy p = -1 / (p * (1 - p))
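
      The value is the same for every q: the p * log (q - 1) term of qaryEntropy is affine in p, so it contributes only the constant log (q - 1) to the first derivative, which vanishes on differentiating again. Differentiating the first derivative log (q - 1) + log (1 - p) - log p of the defining formula once more gives (a sketch, valid for p ≠ 0 and p ≠ 1):

      d/dp (log (1 - p) - log p) = -1 / (1 - p) - 1 / p = -1 / (p * (1 - p))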

      Strict monotonicity of entropy #

      theorem Real.qaryEntropy_strictMonoOn {q : ℕ} (qLe2 : 2 ≤ q) :

      The q-ary entropy function is strictly increasing on the interval [0, 1 - q⁻¹].

      theorem Real.qaryEntropy_strictAntiOn {q : ℕ} (qLe2 : 2 ≤ q) :

      The q-ary entropy function is strictly decreasing on the interval [1 - q⁻¹, 1].

      Binary entropy is strictly increasing on the interval [0, 1/2].

      Binary entropy is strictly decreasing on the interval [1/2, 1].
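
      These ranges match the sign of the first derivative of the defining formula, log (q - 1) + log (1 - p) - log p (with the binary case being q = 2, where 1 - q⁻¹ = 1/2). A sketch of the sign computation for 0 < p < 1:

      log (q - 1) + log (1 - p) - log p > 0
        ↔ (q - 1) * (1 - p) > p      (log is strictly monotone on positive reals)
        ↔ q - 1 > q * p
        ↔ p < 1 - q⁻¹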

      Strict concavity of entropy #