
Mathlib.Probability.Distributions.TwoValued

Distributions on two values #

This file proves a few lemmas about random variables that take at most two values.

theorem MeasureTheory.integral_of_ae_eq_zero_or_one {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {μ : Measure Ω} (hXmeas : AEMeasurable X μ) (hX : ∀ᵐ (ω : Ω) ∂μ, X ω = 0 ∨ X ω = 1) :
∫ (x : Ω), X x ∂μ = μ.real {ω : Ω | X ω = 1}

If an AEMeasurable function is a.e. equal to 0 or 1, then its integral is equal to the measure of the set where it equals 1.
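The underlying computation (not part of the Mathlib source, just the reason the lemma holds): a.e., $X$ is the indicator of the set $\{X = 1\}$, so the integral reduces to a two-term sum.

```latex
\int_\Omega X \, d\mu
  = 0 \cdot \mu\{\omega \mid X(\omega) = 0\}
  + 1 \cdot \mu\{\omega \mid X(\omega) = 1\}
  = \mu\{\omega \mid X(\omega) = 1\}.
```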

theorem MeasureTheory.integral_one_sub_of_ae_eq_zero_or_one {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {μ : Measure Ω} (hXmeas : AEMeasurable X μ) (hX : ∀ᵐ (ω : Ω) ∂μ, X ω = 0 ∨ X ω = 1) :
∫ (ω : Ω), 1 - X ω ∂μ = μ.real {ω : Ω | X ω = 0}

If a random variable is a.e. equal to 0 or 1, then one minus its expectation is equal to the probability that it equals 0.

theorem ProbabilityTheory.condVar_of_ae_eq_zero_or_one {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {m₀ : MeasurableSpace Ω} (hm : m ≤ m₀) {μ : MeasureTheory.Measure Ω} [MeasureTheory.IsFiniteMeasure μ] (hXmeas : AEMeasurable X μ) (hX : ∀ᵐ (ω : Ω) ∂μ, X ω = 0 ∨ X ω = 1) :
condVar m X μ =ᵐ[μ] μ[X | m] * μ[1 - X | m]

If a random variable is a.e. equal to 0 or 1, then its conditional variance is the product of the conditional probabilities that it's equal to 0 and that it's equal to 1.

theorem ProbabilityTheory.variance_of_ae_eq_zero_or_one {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {μ : MeasureTheory.Measure Ω} [MeasureTheory.IsZeroOrProbabilityMeasure μ] (hXmeas : AEMeasurable X μ) (hX : ∀ᵐ (ω : Ω) ∂μ, X ω = 0 ∨ X ω = 1) :
variance X μ = μ.real {ω : Ω | X ω = 0} * μ.real {ω : Ω | X ω = 1}

If a random variable is a.e. equal to 0 or 1, then its variance is the product of the probabilities that it's equal to 0 and that it's equal to 1.
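A sketch of why this is the familiar Bernoulli variance $p(1-p)$ (this derivation is not in the Mathlib source): since $X$ takes only the values 0 and 1 a.e., we have $X^2 = X$ a.e., so with $p = \mu\{X = 1\}$ (the expectation of $X$, by the first lemma above):

```latex
\operatorname{Var}(X)
  = \mathbb{E}[X^2] - \mathbb{E}[X]^2
  = \mathbb{E}[X] - \mathbb{E}[X]^2
  = \mathbb{E}[X]\,(1 - \mathbb{E}[X])
  = \mu\{X = 1\} \cdot \mu\{X = 0\}
  = p\,(1 - p).
```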