Zulip Chat Archive
Stream: new members
Topic: Lazy infinite lists in Lean
Anthony Wang (May 10 2025 at 01:57):
Hi all,
I'm an undergraduate who recently started learning Lean. I have some experience with the Dafny verification language, but I somehow found a soundness bug in Dafny a few months ago, so I guess I'm trying Lean now. Coming from a Haskell background, I tried implementing the lazy power series trick from Haskell using Lean's thunk feature instead. My code currently works, but I have a few questions about it:
inductive LazyList (α : Type u) where
| nil
| cons : α → Thunk (LazyList α) → LazyList α
def LazyList.take : Nat → LazyList α → List α
| 0, _ => .nil
| _, nil => .nil
| n + 1, cons x xs => .cons x <| xs.get.take n
def LazyList.map (f: α → β) : LazyList α → LazyList β
| nil => nil
| cons x xs => cons (f x) <| xs.get.map f
def LazyList.zipWith (f: α → β → γ) : LazyList α → LazyList β → LazyList γ
| nil, _ => nil
| _, nil => nil
| cons x xs, cons y ys => cons (f x y) <| zipWith f xs.get ys.get
unsafe def ints := LazyList.cons 1.0 <| ints.map (· + 1)
#eval ints.take 10
-- Doesn't work?
-- unsafe def integrate s (c : Float) := LazyList.cons c <| LazyList.zipWith (· / ·) s ints
notation s "integrate" c => LazyList.cons c <| LazyList.zipWith (· / ·) s ints
unsafe def expSeries := expSeries integrate 1.0
#eval expSeries.take 10
def evalAt n (s : LazyList Float) x := (s.take n).foldr (λ a acc => a + acc * x) 0
mutual
unsafe def sine := cosine integrate 0.0
unsafe def cosine := (sine integrate -1.0).map (-·)
end
#eval sine.take 10
#eval evalAt 100 sine 2.0
#eval 2.0.sin
-- Yay they match!
- Why does the integrate function cause infinite recursion, but the integrate macro works fine?
- Is there a way to do this without unsafe, since my program doesn't actually run infinitely? My friend figured out a way to do this with partial functions, but the code is less elegant and a lot slower since it isn't memoized (see the note after the code below):
inductive LazyList (α : Type u)
| nil
| cons (head : α) (tail : Unit → LazyList α)
deriving Inhabited
def LazyList.map (f: α → β) : LazyList α → LazyList β
| nil => nil
| cons x xs => cons (f x) <| λ () => (xs ()).map f
partial def ints : Unit → LazyList Float := λ () => LazyList.cons 1.0 <| λ () => (ints ()).map (· + 1)
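A note on the memoization point: Lean's Thunk caches its result after the first .get, whereas a Unit → LazyList α tail re-runs its closure every time it is forced, which is why this partial version recomputes so much. A minimal sketch of the difference (illustrative code, not from the thread):
-- A Thunk's closure runs at most once; later .get calls reuse the cached value.
#eval
  let t : Thunk Nat := Thunk.mk fun _ => (List.range 100000).foldl (· + ·) 0
  t.get + t.get  -- the fold is evaluated only once; the second .get reuses the cached result
-- A plain Unit → Nat closure is re-run on every call; nothing is cached.
#eval
  let f : Unit → Nat := fun _ => (List.range 100000).foldl (· + ·) 0
  f () + f ()  -- the fold is evaluated on each call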
Aaron Liu (May 10 2025 at 02:03):
Do note that all elements of LazyList are provably finite
Anthony Wang (May 10 2025 at 02:10):
What do you mean by that? It seems to me that if I did def ints := LazyList.cons 1.0 <| ints.map (· + 1) and then wrote a function that recursed on ints until reaching a nil list, it would run infinitely, so that means ints in some sense is infinitely large. However, I only ever take a finite number of elements from it in my code.
Niels Voss (May 10 2025 at 02:30):
You might be interested in this long thread, which is about representing possibly non-terminating computations (which is not the same as what you are trying to do but is similar): #Is there code for X? > Divergence monad
The conclusion of the thread is that there's a bit of a sore spot in Lean right now where you have to choose between logical reasoning capabilities and good performance. Right now, you seem to have chosen performance, since you're using unsafe to define infinite lists when you should only be able to define finite lists. This problem might be resolved eventually, and if you're extremely clever about the way you use Thunk you might be able to work around it.
Anthony Wang (May 10 2025 at 02:45):
Thanks, I'll take a look at that thread.
Aaron Liu (May 10 2025 at 02:53):
Anthony Wang said:
What do you mean by that? It seems to me that if I did def ints := LazyList.cons 1.0 <| ints.map (· + 1) and then wrote a function that recursed on ints until reaching a nil list, it would run infinitely, so that means ints in some sense is infinitely large. However, I only ever take a finite number of elements from it in my code.
I can define a function LazyList.length which forces the whole list and returns its length (which will be a finite number). I can define an append function and prove that appending a nonempty list will always result in a different list. I can define an accessor and prove that for every lazy list, there's always a large enough index such that trying to access that index will be out of bounds. Basically, anything you can think of when I say "provably finite".
Your def ints := ... errors with a "failed to show termination" error, so it doesn't count. There are ways to get around this, but they make the provable results not reflect the actual generated code. If you make it unsafe, then that's completely opting out of the safety you get from being able to prove stuff, and for sufficiently unsound definitions you can run unreachable code. If you make it partial, then it's opaque to proofs, and the behavior in the code generator could deviate from the proof model.
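For instance, the accessor described above could look like the following sketch (get? is a name chosen here for illustration; it reuses the LazyList from earlier in the thread):
def LazyList.get? : LazyList α → Nat → Option α
| .nil, _ => none
| .cons x _, 0 => some x
| .cons _ xs, n + 1 => xs.get.get? n
-- Every LazyList value is finite, so some index is always out of bounds;
-- for example, l.get? l.length is always none.
#eval (LazyList.cons 1.0 <| LazyList.cons 2.0 <| LazyList.nil).get? 5 -- none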
Anthony Wang (May 10 2025 at 02:56):
Anyway, I was just talking about this with my friend, and they refined their idea of using partial to also use a Thunk, so that it's fast and doesn't need unsafe.
inductive LazyList (α : Type u)
| nil
| cons : α → Thunk (LazyList α) → LazyList α
deriving Inhabited
def LazyList.take : Nat → LazyList α → List α
| 0, _ => .nil
| _, nil => .nil
| n + 1, cons x xs => .cons x <| xs.get.take n
def LazyList.map (f: α → β) : LazyList α → LazyList β
| nil => nil
| cons x xs => cons (f x) <| xs.get.map f
def LazyList.zipWith (f: α → β → γ) : LazyList α → LazyList β → LazyList γ
| nil, _ => nil
| _, nil => nil
| cons x xs, cons y ys => cons (f x y) <| zipWith f xs.get ys.get
partial def ints _ := LazyList.cons 1.0 <| (ints ()).map (· + 1)
#eval (ints ()).take 10
def integrate (s : Unit → LazyList Float) (c : Float) := LazyList.cons c <| LazyList.zipWith (· / ·) (s ()) (ints ())
partial def expSeries _ := integrate expSeries 1.0
#eval (expSeries ()).take 10
def evalAt n (s : LazyList Float) x := (s.take n).foldr (λ a acc => a + acc * x) 0
mutual
partial def sine _ := integrate cosine 0.0
partial def cosine _ := (integrate sine (-1.0)).map (-·)
end
#eval (sine ()).take 10
#eval evalAt 1000 (sine ()) 2.0
#eval 2.0.sin
Aaron Liu (May 10 2025 at 03:13):
Just for fun, here's a proof that ints can't actually equal its definition:
inductive LazyList (α : Type u)
| nil
| cons : α → Thunk (LazyList α) → LazyList α
def LazyList.map (f: α → β) : LazyList α → LazyList β
| nil => nil
| cons x xs => cons (f x) <| xs.get.map f
def LazyList.length {α} : LazyList α → Nat
| .nil => 0
| .cons a b => b.get.length + 1
@[simp]
theorem LazyList.length_map {α β} (l : LazyList α) (f : α → β) : (l.map f).length = l.length := by
match l with
| .nil => simp [LazyList.length, LazyList.map]
| .cons x xs => simp only [LazyList.length, LazyList.map, Thunk.get]; rw [LazyList.length_map]
theorem not_exists_ints : ¬∃ ints : Unit → LazyList Float, ints = (fun _ => LazyList.cons 1.0 <| (ints ()).map (· + 1)) := by
intro ⟨ints, h⟩
have ss := congrArg (fun f => (f ()).length) h
simp [LazyList.length, Thunk.get] at ss
Vasilii Nesterov (May 10 2025 at 08:48):
There is docs#Stream'.Seq for infinite lists that we can reason about, but it has poor computational performance.
Last updated: Dec 20 2025 at 21:32 UTC