Variance of random variables #
We define the variance of a real-valued random variable as `Var[X] = 𝔼[(X - 𝔼[X])^2]` (in the `probability_theory` locale).
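As background for the results listed below (this expansion is standard and not taken from the file itself), for a probability measure and a square-integrable `X` the definition unfolds to the familiar identity

\[
  \operatorname{Var}[X]
    = \mathbb{E}\!\left[(X - \mathbb{E}[X])^2\right]
    = \mathbb{E}[X^2] - 2\,\mathbb{E}[X]^2 + \mathbb{E}[X]^2
    = \mathbb{E}[X^2] - \mathbb{E}[X]^2,
\]

which in particular gives `Var[X] ≤ 𝔼[X^2]`, the inequality recorded as `probability_theory.variance_le_expectation_sq` below.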
Main definitions #
* `probability_theory.evariance`: the variance of a real-valued random variable as an extended non-negative real.
* `probability_theory.variance`: the variance of a real-valued random variable as a real number.
Main results #
* `probability_theory.variance_le_expectation_sq`: the inequality `Var[X] ≤ 𝔼[X^2]`.
* `probability_theory.meas_ge_le_variance_div_sq`: Chebyshev's inequality, i.e., `ℙ {ω | c ≤ |X ω - 𝔼[X]|} ≤ ennreal.of_real (Var[X] / c ^ 2)`.
* `probability_theory.meas_ge_le_evariance_div_sq`: Chebyshev's inequality formulated with `evariance`, without requiring the random variables to be L².
* `probability_theory.indep_fun.variance_add`: the variance of the sum of two independent random variables is the sum of the variances.
* `probability_theory.indep_fun.variance_sum`: the variance of a finite sum of pairwise independent random variables is the sum of the variances.
`probability_theory.evariance`: The `ℝ≥0∞`-valued variance of a real-valued random variable, defined as the Lebesgue integral of `(X - 𝔼[X])^2`.
`probability_theory.variance`: The `ℝ`-valued variance of a real-valued random variable, defined by applying `ennreal.to_real` to `evariance`.
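To make the relationship between the two definitions concrete, here is a minimal Lean sketch that follows the descriptions above; the names `evariance_sketch` and `variance_sketch` are placeholders, not the actual mathlib declarations, and the exact form of the mathlib definitions may differ.

```lean
import probability.variance

open measure_theory
open_locale ennreal

variables {Ω : Type*} [measurable_space Ω]

/-- Sketch: the `ℝ≥0∞`-valued variance as the Lebesgue integral of the
squared (non-negative) distance of `X` from its expectation. -/
noncomputable def evariance_sketch (X : Ω → ℝ) (μ : measure Ω) : ℝ≥0∞ :=
∫⁻ ω, (‖X ω - ∫ x, X x ∂μ‖₊ : ℝ≥0∞) ^ 2 ∂μ

/-- Sketch: the `ℝ`-valued variance, obtained by applying `ennreal.to_real`. -/
noncomputable def variance_sketch (X : Ω → ℝ) (μ : measure Ω) : ℝ :=
(evariance_sketch X μ).to_real
```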
`probability_theory.meas_ge_le_evariance_div_sq`: Chebyshev's inequality for the `ℝ≥0∞`-valued variance.
`probability_theory.meas_ge_le_variance_div_sq`: Chebyshev's inequality: one can control the deviation probability of a real random variable from its expectation in terms of the variance.
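For orientation, the classical textbook derivation of this bound goes through Markov's inequality applied to `(X - 𝔼[X])^2` (the standard argument, stated for a probability measure, `c > 0`, and square-integrable `X`; it is not a quotation of the mathlib proof):

\[
  \mathbb{P}\bigl(|X - \mathbb{E}[X]| \ge c\bigr)
    = \mathbb{P}\bigl((X - \mathbb{E}[X])^2 \ge c^2\bigr)
    \le \frac{\mathbb{E}\bigl[(X - \mathbb{E}[X])^2\bigr]}{c^2}
    = \frac{\operatorname{Var}[X]}{c^2}.
\]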
`probability_theory.indep_fun.variance_add`: The variance of the sum of two independent random variables is the sum of the variances.
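The computation underlying this statement (again the standard argument, assuming `X` and `Y` are square-integrable; not a quotation of the mathlib proof) expands the square and uses independence to remove the cross term:

\[
  \operatorname{Var}[X + Y]
    = \operatorname{Var}[X] + \operatorname{Var}[Y]
      + 2\,\mathbb{E}\bigl[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])\bigr],
\]

and independence gives \( \mathbb{E}\bigl[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])\bigr] = \mathbb{E}[X - \mathbb{E}[X]]\,\mathbb{E}[Y - \mathbb{E}[Y]] = 0 \). The same cross-term argument, applied to each pair, yields the finite-sum version `probability_theory.indep_fun.variance_sum` stated next.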
`probability_theory.indep_fun.variance_sum`: The variance of a finite sum of pairwise independent random variables is the sum of the variances.