Zulip Chat Archive

Stream: general

Topic: noncomputable theory


Kevin Buzzard (Apr 03 2020 at 08:36):

I remember @Mario Carneiro once saying something of the form "I wish people wouldn't start files with noncomputable theory". It now occurs over 100 times in mathlib. Is what is going on the following? In a file with no noncomputable theory in it, if I incorrectly do or don't mark a definition as noncomputable then Lean will typically tell me. But if I start a file with noncomputable theory then this check is just switched off. Is it then the case that various possibly-computable functions get marked as noncomputable? Does this change anything? Could a combination of a linter and a way of temporarily switching off noncomputable theory do something about this, or have I just misunderstood what noncomputable theory is doing? Of course I'm not suggesting that topology/metric_space/* be made computable -- that clearly shouldn't be done in mathlib, which has made the decision to be resolutely classical (thank goodness!) -- but I'm just wondering whether we have computable functions incorrectly tagged as noncomputable.

Alex J. Best (Apr 03 2020 at 08:42):

I just made a file, put noncomputable theory at the start, made a def, and eval'd it; it worked fine. So it seems adding noncomputable theory doesn't stop basic functions from being computable, it just removes the warning when you "accidentally" make something noncomputable.
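
For reference, a minimal Lean 3 sketch of that experiment (nothing mathlib-specific is assumed):

```lean
noncomputable theory

-- an ordinary definition: it still compiles to bytecode even though
-- the file-wide `noncomputable theory` is in force
def double (n : ℕ) : ℕ := 2 * n

#eval double 21  -- 42: evaluation still works, so the def stayed computable
```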

Kevin Buzzard (Apr 03 2020 at 08:59):

I wonder why Mario was complaining then? Maybe I misremembered?

Alex J. Best (Apr 03 2020 at 09:02):

https://leanprover.zulipchat.com/#narrow/stream/113488-general/topic/computability/near/168198880

Kevin Buzzard (Apr 03 2020 at 10:05):

Me : "I just don't get why computability is important to me if I just want to make mathematical objects and state and prove theorems about them."

Mario : "It's not, unless you are Anrdej Bauer"

This is not right. I might not be interested in it, but if I want to do mathematics in Lean "the way mathematicians do it" then I am absolutely relying on it.

Kevin Buzzard (Apr 03 2020 at 10:05):

because I absolutely need high-powered tactics like ring

Kevin Buzzard (Apr 03 2020 at 10:06):

I need details_left_to_the_interested_reader; that's the kind of tactic which will get generic mathematicians interested. simp on steroids.

Mario Carneiro (Apr 03 2020 at 10:15):

I don't recall making this specific complaint

Mario Carneiro (Apr 03 2020 at 10:16):

In a file with no noncomputable theory in it, if I incorrectly do or don't mark a definition as noncomputable then Lean will typically tell me. But if I start a file with noncomputable theory then this check is just switched off.

Actually, the check isn't switched off; the noncomputable marking is inferred rather than checked. If you look at defs made in a file with noncomputable theory, you will see that they are still being marked noncomputable; you just don't have to write the keyword on the def yourself.

Mario Carneiro (Apr 03 2020 at 10:17):

and defs that are actually computable will not be unnecessarily marked noncomputable
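
A small Lean 3 sketch of the inference Mario describes (the names pick and twice are made up for illustration):

```lean
noncomputable theory

-- this def genuinely uses choice, so the `noncomputable` marking is
-- inferred silently; without `noncomputable theory` at the top, Lean
-- would reject it unless it were written `noncomputable def pick ...`
def pick (α : Type) [h : nonempty α] : α := classical.choice h

-- #eval pick ℕ   -- fails: `pick` really was marked noncomputable

-- a computable def in the same file is left unmarked and still evaluates
def twice (n : ℕ) : ℕ := n + n
#eval twice 3   -- 6
```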

Mario Carneiro (Apr 03 2020 at 10:33):

Kevin Buzzard said:

Me : "I just don't get why computability is important to me if I just want to make mathematical objects and state and prove theorems about them."

Mario : "It's not, unless you are Anrdej Bauer"

This is not right. I might not be interested in it, but if I want to do mathematics in Lean "the way mathematicians do it" then I am absolutely relying on it.

This is an argument for why you should care that tactics, i.e. programs meant to be run, are computable. It is not an argument for why e.g. real numbers, algebraic varieties, or multivariate polynomials should be computable. Tactics are already meta, which means they have essentially opted out of the "proofy" part of lean, which means that if they didn't fit in the "programming" part of lean either they would be useless. So these functions are all expected to be computable. For non-meta functions, there is a proofy aspect, and you can opt out of the "programming" part by using noncomputable. My point is that in most or all of these cases, you the mathematician don't have to ever care about that marking; it only concerns programming applications.

It may turn out that there is some part of lean that is in the intersection of both of these worlds, such as nat or int or list A, where we want the functions to be both non-meta and computable, because the types serve both purposes. However, it is often the case that the best way to represent something for programming is not the best way for proofs, so even in these cases having a distinction is useful. In fact, even in these three cases, this is exactly what happens. nat and int are only good for computation because of a dirty hack in the lean VM to replace the incredibly inefficient unary representation with a bignum arithmetic package. If you had to do this in lean the type gmp_nat would be significantly more complicated and would certainly not look like an inductive type with zero and succ. The computational content of list A is a singly linked list, which is well known not to be the best way to store lists of values in a computer by a pretty significant margin. Here we have the buffer A type as an alternative, but it's not as nice for proving theorems.
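
To make the list/buffer contrast concrete, here is a hedged Lean 3 sketch, assuming the usual conversions from the core buffer file (list.to_buffer and buffer.to_list):

```lean
import data.buffer

-- the same data in two representations: a linked `list` (nice to prove
-- about) and an array-backed `buffer` (nicer to compute with)
def xs : list ℕ := [1, 2, 3, 4]

#eval xs.to_buffer.size      -- 4
#eval xs.to_buffer.to_list   -- [1, 2, 3, 4]: round-tripping recovers the list
```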

So I think that to a first approximation there is little overlap between "lean for proofs" and "lean for programming". Having the ability to do both at once is nice, but usually this requires a trade off between the computationally efficient thing and the easy to prove thing. So when you are doing maths just forget about computability, and when you are doing programming choose the best types for the job.

Kevin Buzzard (Apr 03 2020 at 11:40):

Re list A and buffer A: we can prove the theorems about lists, and you can use the transfer tactic to port them into theorems about buffers.

Kevin Buzzard (Apr 03 2020 at 11:43):

The ring tactic works with the real numbers and I really need it on the real numbers for pedagogical purposes. The real numbers aren't computable, but if there were no ring tactic I would have left long ago. Scott's tactics obviously make it clear that this is what we mathematicians are aiming for -- a system that can do the obvious bits.

Kevin Buzzard (Apr 03 2020 at 11:44):

But for that all the computability stuff is super important

Kevin Buzzard (Apr 03 2020 at 11:45):

I'm just putting it all together, that's all. Why doesn't a transfer tactic solve your buffer/list woes?

Mario Carneiro (Apr 03 2020 at 12:25):

Re list A and buffer A: we can prove the theorems about lists, and you can use the transfer tactic to port them into theorems about buffers.

Right, I think this should be the preferred method. This is how all the theorems about num are proved, and if we wrote a gmp_nat type it would presumably also be isomorphic to nat, and hence we could do the same trick. But this only underscores the importance of separating the implementation type from the abstract mathematical type.

Mario Carneiro (Apr 03 2020 at 12:27):

Kevin Buzzard said:

The ring tactic works with the real numbers and I really need it on the real numbers for pedagogical purposes. The real numbers aren't computable, but if there were no ring tactic I would have left long ago. Scott's tactics obviously make it clear that this is what we mathematicians are aiming for -- a system that can do the obvious bits.

The ring tactic does not work on real numbers; it works on expr, which is a perfectly reasonable computable type. It just so happens that the exprs it manipulates are talking about real numbers, but by moving to meta land even the most complicated noncomputable mathematical types become computable again, because everything is an expr.
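
For example, standard mathlib usage along these lines (Lean 3):

```lean
import data.real.basic
import tactic

-- the goal mentions ℝ, a noncomputable type, but `ring` only manipulates
-- the syntax tree (`expr`) of the goal, which is perfectly computable
example (x y : ℝ) : (x + y) ^ 2 = x ^ 2 + 2 * x * y + y ^ 2 := by ring
```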

Mario Carneiro (Apr 03 2020 at 12:29):

Kevin Buzzard said:

But for that all the computability stuff is super important

Nope, ring should demonstrate this fairly well - for tactics it doesn't matter at all if the type that is being discussed is computable

Mario Carneiro (Apr 03 2020 at 12:30):

Kevin Buzzard said:

Why doesn't a transfer tactic solve your buffer/list woes?

I think it does, and in the proofs in data.num.lemmas you will actually see a poor man's transfer tactic at work. This kind of thing is one of the primary motivating examples for the transfer tactic.

Anton Lorenzen (Apr 04 2020 at 13:37):

Kevin Buzzard said:

Re list A and buffer A: we can prove the theorems about lists, and you can use the transfer tactic to port them into theorems about buffers.

That sounds super useful for doing computations in Lean! Could it be used to transfer the lemmas about data.finset to lemmas about a red-black tree? Mathlib matrices to sparse matrices? Reals to floating point numbers up to some error margin?

Kevin Buzzard (Apr 04 2020 at 13:48):

I am not sure about the reals because those types are not isomorphic

Alex J. Best (Apr 04 2020 at 13:55):

As Kevin says, you need two equivalent types, so I'm not sure what the first example you mention means. The second is totally the sort of thing transfer should do. And as for the third, fixed-precision floating point numbers are not the same as the reals, but perhaps the equivalence (cauchy sequences) <-> (cauchy sequences with a fixed convergence rate) is another example in the direction you are thinking.

Mario Carneiro (Apr 04 2020 at 15:33):

Actually transfer doesn't require an isomorphism. It requires a couple of things depending on which operators it sees; for example, you can enter a forall binder if the relation is bi-total, which means (∀ b, ∃ a, R a b) ∧ (∀ a, ∃ b, R a b).
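
Written out as a self-contained Lean definition (mathlib has an equivalent notion in its relator file; the name bi_total below is just for the sketch, specialised to Type for simplicity):

```lean
-- a relation `R : α → β → Prop` is bi-total when every `b` is related
-- to some `a` and every `a` is related to some `b`
def bi_total {α β : Type} (R : α → β → Prop) : Prop :=
(∀ b, ∃ a, R a b) ∧ (∀ a, ∃ b, R a b)
```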

Mario Carneiro (Apr 04 2020 at 15:34):

The proof of properties of int by transfer from nat × nat very much does not go through an isomorphism.
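
A hedged sketch of that relation (the name rep is made up for illustration; a pair of naturals represents its formal difference):

```lean
-- a pair of naturals represents the integer given by its formal difference
def rep (p : ℕ × ℕ) : ℤ := (p.1 : ℤ) - p.2

-- the induced relation `λ p n, rep p = n` is bi-total, but far from
-- injective: distinct pairs can represent the same integer
example : rep (3, 1) = rep (4, 2) := rfl
```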


Last updated: Dec 20 2023 at 11:08 UTC