Zulip Chat Archive
Stream: new members
Topic: Should "sampling" be non computable?
Dominic Steinitz (Oct 15 2024 at 10:47):
I expected that "sampling" from a normal random variable would be noncomputable, but it seems that Lean tries to unroll the definition. I am assuming that increasing maxHeartbeats would just take longer before giving me an error.
import Mathlib
open MeasureTheory ProbabilityTheory
def I : Set ℝ := {x | 0 ≤ x ∧ x ≤ 1}
variable (ν : Measure I)  -- ν is used below but was not declared in the original snippet
variable {Y : I → ℝ} (hY : Measure.map Y ν = gaussianReal 0 1)
def x : I := ⟨0.5, ⟨by norm_num, by norm_num⟩⟩
#reduce Y x
-- Y
-- ⟨{
-- cauchy :=
-- Quot.mk (fun f g ↦ (f - g).LimZero) ⟨fun x ↦ { num := Int.ofNat 1, den := 2, den_nz := ⋯, reduced := ⋯ }, ⋯⟩ },
-- x.proof_1⟩
#reduce (proofs := true) Y x
-- (deterministic) timeout at `whnf`, maximum number of heartbeats (200000) has been reached
-- Use `set_option maxHeartbeats <num>` to set the limit.
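A minimal sketch (not from the thread, reusing the imports and opens above): the expected noncomputability does show up once one writes a concrete definition rather than a section variable; it just never blocks #reduce, which performs kernel reduction and ignores the noncomputable marker.
-- hypothetical name `stdGaussian`, for illustration only
noncomputable def stdGaussian : Measure ℝ := gaussianReal 0 1
#check stdGaussian   -- stdGaussian : Measure ℝ
-- #eval stdGaussian   -- fails: the definition is noncomputable, there is no compiled code to run
-- #reduce stdGaussian -- kernel reduction would still try to unfold everything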
Dominic Steinitz (Oct 15 2024 at 11:00):
It seems that the problem is with 0.5, as #eval x gives
Real.ofCauchy (sorry /- (1 : Rat)/2, (1 : Rat)/2, (1 : Rat)/2, (1 : Rat)/2, (1 : Rat)/2, (1 : Rat)/2, (1 : Rat)/2, (1 : Rat)/2, (1 : Rat)/2, (1 : Rat)/2, ... -/)
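That output comes from Mathlib's unsafe Repr instance for ℝ, which prints a prefix of the underlying Cauchy sequence; the sorry is part of the printed string, not a proof obligation. A small check (not from the thread), assuming that instance:
#eval (0.5 : ℝ)
-- prints the constant Cauchy sequence 1/2, 1/2, …, much like the output above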
Eric Wieser (Oct 15 2024 at 11:06):
Lean is only unrolling the definition because #reduce tells Lean to do exactly that.
Eric Wieser (Oct 15 2024 at 11:07):
And you can't #eval Y x because Y is a variable.
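A small illustration of that point (hypothetical names f and g, not from the thread): #eval needs a definition with compiled code behind it, whereas a section variable is just a hypothesis, so there is nothing to run.
variable (f : ℕ → ℕ)
-- #eval f 3   -- fails: `f` is a free variable, there is no code to execute
def g (n : ℕ) : ℕ := n + 1
#eval g 3      -- 4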
Kyle Miller (Oct 15 2024 at 14:51):
All you're seeing is that it's unfolding the definition of x. Notice the output is Y followed by something that looks like the Cauchy sequence definition of 0.5.
This reduction isn't aware of the hY variable. With #reduce, think "unfold definitions, apply funs".
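A sketch of that mental model (hypothetical definition double, not from the thread): #reduce unfolds definitions and β-reduces applications in the kernel, which is why it digs all the way into the Cauchy-sequence construction of a real literal.
def double (n : ℕ) : ℕ := n + n
#reduce double 3             -- 6: unfolds `double`, applies the fun, reduces in the kernel
#reduce (fun n => n + n) 3   -- 6
-- #reduce (0.5 : ℝ)         -- would unfold the literal into its Cauchy-sequence construction, as earlier in the thread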
Last updated: May 02 2025 at 03:31 UTC