Zulip Chat Archive
Stream: new members
Topic: Comparing the sum over the whole with an element
Asei Inoue (Feb 25 2024 at 13:34):
import Mathlib

structure DiscreteRandomVariable where
  /-- all possible events that may happen -/
  events : Type*
  /-- the events are discrete -/
  discrete : ∃ f : events → ℕ, Function.Injective f
  /-- the probability mass function -/
  prob : events → ℝ
  /-- the probability is non-negative -/
  prob_nneg : ∀ x, 0 ≤ prob x
  /-- the sum of all probabilities is 1 -/
  sum_eq_one : ∑' (x : events), prob x = 1

open DiscreteRandomVariable

@[simp]
theorem prob_leq (X : DiscreteRandomVariable) (x : X.events) : X.prob x ≤ 1 := by
  have nneg := prob_nneg X x
  have eq_one := sum_eq_one X
  by_contra h; simp at h
  have := calc
    1 = ∑' (y : X.events), X.prob y := by rw [← eq_one]
    _ ≥ prob X x := by sorry -- I want to show this!
  replace : ¬ (1 < prob X x) := by exact not_lt.mpr this
  contradiction
Asei Inoue (Feb 25 2024 at 13:35):
I am trying to formalise information theory in Lean and have defined discrete random variables. I want to prove a basic property of them, but I don't know how to prove it.
If it is already in the library, please let me know. Thanks.
Matt Diamond (Feb 25 2024 at 17:56):
you can use docs#le_tsum
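(For reference, a minimal sketch of how le_tsum is used, assuming its Mathlib form le_tsum (hf : Summable f) (a) (h : ∀ b ≠ a, 0 ≤ f b) : f a ≤ ∑' b, f b; the f and n below are just illustrative names:)
example {f : ℕ → ℝ} (hf : Summable f) (hf0 : ∀ m, 0 ≤ f m) (n : ℕ) :
    f n ≤ ∑' m, f m :=
  -- one term is bounded by the whole sum; only the other terms need to be nonnegative
  le_tsum hf n (fun m _ => hf0 m)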
Matt Diamond (Feb 25 2024 at 18:03):
in fact you can prove the whole thing with le_tsum, no need for a proof by contradiction:
lemma prob_summable {X : DiscreteRandomVariable} : Summable X.prob := by
  by_contra h
  have := tsum_eq_zero_of_not_summable h
  simp [sum_eq_one] at this

@[simp]
theorem prob_leq (X : DiscreteRandomVariable) (x : X.events) : X.prob x ≤ 1 := by
  rw [← X.sum_eq_one]
  apply le_tsum prob_summable
  exact (fun _ _ => X.prob_nneg _)
Matt Diamond (Feb 25 2024 at 18:06):
there's also docs#PMF
Matt Diamond (Feb 25 2024 at 18:20):
if you switch to using PMF, prob_leq is equivalent to docs#PMF.coe_le_one
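A minimal sketch of the PMF version, assuming Mathlib's PMF.coe_le_one:
import Mathlib.Probability.ProbabilityMassFunction.Basic

-- a PMF never assigns probability greater than 1 to any outcome
example {Ω : Type*} (X : PMF Ω) (x : Ω) : X x ≤ 1 :=
  X.coe_le_one x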
Asei Inoue (Feb 26 2024 at 10:51):
@Matt Diamond Thank you!!
Asei Inoue (Feb 27 2024 at 13:45):
PMF allows probability values to take the value ∞, but I don't want to allow that...
Asei Inoue (Feb 27 2024 at 13:47):
I cannot prove entropy's non-negativity...
import Mathlib.Tactic
import Mathlib.Probability.ProbabilityMassFunction.Basic
import Mathlib.Analysis.SpecialFunctions.Log.Basic

/-- `PMF` returns a value in `ENNReal`, which is equal to `[0, ∞]`. -/
example {Ω : Type*} (X : PMF Ω) (x : Ω) : ENNReal := X x

-- an example of a discrete random variable
namespace PMF.Example

/-- A six-sided die is an example of a discrete random variable. -/
def Die : PMF (Fin 6) := by
  dsimp [PMF]
  use ![1/6, 1/6, 1/6, 1/6, 1/6, 1/6]
  -- I want to show that the probabilities sum to 1
  sorry

end PMF.Example

open Real

variable {Ω : Type*}

def entropy (X : PMF Ω) : ℝ :=
  - ∑' (x : Ω), (X x).toReal * log (X x).toReal

private abbrev H := @entropy

-- I want to show this!
theorem entropy_nonneg (X : PMF Ω) : 0 ≤ H X := by sorry
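For intuition on the definition: a uniform PMF on two outcomes (probability 1/2 each) has entropy -(1/2 · log(1/2) + 1/2 · log(1/2)) = log 2 ≈ 0.693, which is non-negative as expected.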
Asei Inoue (Feb 27 2024 at 13:50):
Because ENNReal.toReal
send ∞
to 0.
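A one-line check of that fact, using ENNReal.top_toReal from Mathlib:
-- toReal maps the top element ∞ of ENNReal to the real number 0
example : (⊤ : ENNReal).toReal = 0 := ENNReal.top_toReal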
Yaël Dillies (Feb 27 2024 at 13:54):
Do you know we have a definition of entropy in PFR?
Asei Inoue (Feb 27 2024 at 13:55):
@Yaël Dillies Thanks for telling me. I did not know that ...
Asei Inoue (Feb 27 2024 at 14:04):
where is the definition of entropy? I dont find it...
Yaël Dillies (Feb 27 2024 at 14:32):
It's all in here: https://github.com/teorth/pfr/tree/master/PFR/ForMathlib/Entropy
Last updated: May 02 2025 at 03:31 UTC