Zulip Chat Archive

Stream: new members

Topic: Whether to increase recursion depth


J. J. Issai (project started by GRP) (Feb 17 2026 at 19:34):

I asked Aristotle for code to find some primes in various intervals. It responded with some code that did some initial sieving, some Miller-Rabin tests, and some #eval statements.

One of the statements was

#eval Nat.Prime 3003000000000000037

which in the Lean playground takes some time to evaluate.
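(Editor's aside, not from the post: `#eval` on a proposition goes through its `Decidable` instance, so this effectively runs `decide`. With Mathlib's instance for `Nat.Prime`, deciding is roughly trial division up to the square root of n via `Nat.minFac`, which would explain why a 19-digit input takes a while. A minimal sketch, assuming `import Mathlib` as in the playground:)

import Mathlib

-- `#eval` on a decidable Prop evaluates its Decidable instance.
#eval Nat.Prime 97       -- true; small inputs decide almost instantly
#eval Nat.Prime 1000003  -- still quick; cost grows with the square root of n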

Another was part of a routine, namely

def all_primes : List (Nat × Nat) :=
  (List.range 1000).map fun i =>
    let n := i + 1
    (n, solution_fast n)

#eval all_primes[77]

At the line '#eval all_primes[77]', this gives the response:

maximum recursion depth has been reached
use `set_option maxRecDepth <num>` to increase limit
use `set_option diagnostics true` to get diagnostic information

Changing 1000 to 100 gets me a response: a pair whose second component is a 20-digit number that is likely prime, which is part of my goal.

Based on this, it seems the recursion limit is hit while forming the list, not while accessing an element at a large index. Is there a better method to generate and store the pairs for access? Would Array be useful here? (For a proof of concept, I don't even need to store them if the playground can print some of them to the console.) Most importantly, do I need to increase the recursion depth just to generate the list?
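(Editor's sketch, not from the post: one plausible cause of the error is that the checked access `all_primes[77]` asks the elaborator to prove `77 < all_primes.length`, which can unfold the whole list at elaboration time; the panicking access `[77]!` needs no such proof and does a runtime bounds check instead. A stub stands in for the post's `solution_fast`, which is assumed to have type `Nat → Nat`:)

-- Placeholder for the post's `solution_fast` (assumed type: Nat → Nat).
def solution_fast (n : Nat) : Nat := n

-- An Array avoids a long chain of List cells, and `[77]!` checks the
-- bound at runtime rather than demanding a compile-time proof.
def all_primes : Array (Nat × Nat) :=
  (Array.range 1000).map fun i =>
    let n := i + 1
    (n, solution_fast n)

#eval all_primes[77]!

(If the List version is preferred, putting `set_option maxRecDepth 10000 in` before the `#eval`, or indexing with `all_primes[77]!`, are ways to sidestep the same limit.)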

The actual goal is to see how fast Lean can generate large (probable) primes and then verify that the outputs are prime using something other than Nat.Prime.
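(Editor's sketch of that verification step, not from the post: a self-contained Miller–Rabin test that bypasses `Nat.Prime`'s decidability instance. The helper names, the deterministic base set {2, ..., 37}, and the use of `partial def` to skip termination proofs are all my own choices; this is only meant for `#eval`, not for proving anything.)

-- Modular exponentiation by repeated squaring.
partial def powMod (b e m : Nat) : Nat :=
  if m ≤ 1 then 0
  else if e == 0 then 1
  else
    let h := powMod b (e / 2) m
    let h2 := h * h % m
    if e % 2 == 1 then h2 * (b % m) % m else h2

-- Write n as d * 2^s with d odd; returns (d, s).
partial def oddPart (n : Nat) : Nat × Nat :=
  if n != 0 && n % 2 == 0 then
    let (d, s) := oddPart (n / 2)
    (d, s + 1)
  else (n, 0)

-- One Miller–Rabin round for base a, where n - 1 = d * 2^s.
def mrRound (n d s a : Nat) : Bool := Id.run do
  let mut x := powMod a d n
  if x == 1 || x == n - 1 then return true
  for _ in [1:s] do
    x := x * x % n
    if x == n - 1 then return true
  return false

-- With these bases the test is reportedly deterministic for
-- n < 3317044064679887385961981, which covers 20-digit candidates.
def isProbablePrime (n : Nat) : Bool :=
  if n < 2 then false
  else if n % 2 == 0 then n == 2
  else
    let (d, s) := oddPart (n - 1)
    [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37].all fun a =>
      a % n == 0 || mrRound n d s a

#eval isProbablePrime 3003000000000000037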


Last updated: Feb 28 2026 at 14:05 UTC