Zulip Chat Archive

Stream: general

Topic: Lessons from History


Thomas Eckl (Jul 18 2020 at 15:45):

When rereading Quinn's AMS article I realized that it does not contain a very extensive description of the "definition-and-proof technology" introduced around the turn of the last century, or of why Quinn thinks it was (and is!) so successful with mathematicians. You can find more in his (unpublished) book at https://intranet.math.vt.edu/people/fquinn/history_nature/nature0.pdf .

Heather Macbeth (Jul 18 2020 at 16:18):

Regarding the new logical foundations of the early 20th century, Quinn (in his AMS article linked by @Thomas Eckl in the earlier thread) suggests an interesting advantage of rigour:

this approach has unexpected advantages. Trying to fix gaps in first approximations to proofs can lead to conclusions we could not have imagined and would not have dared conjecture. They also make research more accessible: rank-and-file mathematicians can use the new methods confidently and effectively, while success with older methods was mostly limited to the elite.

Thomas Eckl (Jul 18 2020 at 18:45):

In his book, Quinn sketches a "science of contemporary mathematics" which aims to describe the practice of core mathematics, where rigour plays an essential role, from many different angles: historical, sociological, pedagogical, philosophical, ... I particularly like the remarks drawing on cognitive neuroscience, which explain that mathematicians package mathematical concepts into (equivalent) definitions, examples, and first consequences because this allows them to build internal models, as humans do for all sorts of perceptions. These models can be more or less useful, and that is why theories are so fluid in the beginning: mathematicians are looking for the right way to package them in the brain. In the relevant chapter Quinn extensively discusses an algebraic hierarchy - a hint at why Lean is so successful at formalizing maths. (-;
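To illustrate what I mean (a minimal sketch of my own, with hypothetical `my_monoid`/`my_group` names rather than mathlib's actual classes):

```lean
-- Each layer of the hierarchy packages the data and axioms of the
-- previous one, plus a little more: Quinn's "packaging" of concepts
-- into definitions, made literal.
class my_monoid (α : Type) :=
(mul : α → α → α)
(one : α)
(mul_assoc : ∀ a b c : α, mul (mul a b) c = mul a (mul b c))
(one_mul : ∀ a : α, mul one a = a)
(mul_one : ∀ a : α, mul a one = a)

-- A group is a monoid with inverses; `extends` reuses the packaged
-- monoid structure wholesale.
class my_group (α : Type) extends my_monoid α :=
(inv : α → α)
(mul_left_inv : ∀ a : α, mul (inv a) a = one)
```

The typeclass mechanism then lets every theorem proved at the monoid level apply automatically to any group - the same economy of thought Quinn attributes to the hierarchy.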

Thomas Eckl (Jul 18 2020 at 19:02):

I find work like Quinn's so important because it gives clues about how the next revolution in mathematics can and should be shaped. There is a revolution ahead, since the methods of 20th-century mathematics are reaching their limits, as shown by all the entertaining examples of published proofs of important results that are in doubt or outright false. Proof assistants are there to help us out, but then we had better design them in a way that makes them fit successfully into core mathematical practice, and even makes it more effective. That's how it worked last time around.

Johan Commelin (Jul 18 2020 at 20:06):

Interesting thoughts! I would love to discuss this with you in more detail. But I should maybe first try to get my hands on that book by Quinn (-;

Jalex Stark (Jul 18 2020 at 20:20):

Quinn's (draft) book is freely available from his website; Thomas linked it earlier.

Johan Commelin (Jul 18 2020 at 20:22):

Thanks, must have missed that other link

Heather Macbeth (Jul 19 2020 at 00:01):

In the last wave of formalization, more precise definitions led to famous counterexamples, which helped to convince mathematicians that these precise definitions were meaningful or necessary. Quinn mentions the nowhere-differentiable function and the space-filling curve (in his AMS article), which are well-known. In his book (section 9.2.1) he mentions a third example, about a definition I had not known was controversial at the time:

We now know that analytic techniques require a ‘smooth structure’ and not all manifolds have them. However early proposals to work with smooth structures were rejected on philosophical grounds: manifolds should be objects, not structures on an object. Furthermore, smooth structures seemed to amount to assuming what you wanted to prove, and seemed intellectually dishonest if not logically circular.

Apparently the modern definition of manifold was not fully accepted until Milnor's nonstandard smooth structures on the 7-sphere proved its necessity (in 1956!!).
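(For anyone who wants the terms unpacked - this is the standard definition, not a quote from Quinn: a smooth structure on a topological manifold $M$ is a maximal atlas of charts $\varphi_i : U_i \to \mathbb{R}^n$ whose transition maps $\varphi_j \circ \varphi_i^{-1}$ are $C^\infty$. Milnor produced atlases on the topological 7-sphere satisfying this definition that are not diffeomorphic to the standard one, so the "structure on an object" really is extra data.)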

Most mathematicians believe that there are a certain number of wrong theorems out there, but probably most also think we have found the right definitions. I wonder whether this wave of formalization will bring any wild counterexamples to change that consensus.
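(Side note on the first of those classical examples, in case anyone wants the formula - the standard statement, not taken from Quinn: Weierstrass's function

$$ W(x) = \sum_{n=0}^{\infty} a^n \cos(b^n \pi x), \qquad 0 < a < 1, \quad b \text{ an odd integer}, \quad ab > 1 + \tfrac{3}{2}\pi, $$

is continuous on all of $\mathbb{R}$ but differentiable nowhere.)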

Kevin Buzzard (Jul 19 2020 at 09:58):

The Milnor fact is extraordinary. Grothendieck was defining schemes to be structures on topological spaces in the late 50s.

Thomas Eckl (Jul 26 2020 at 15:37):

Finding such counterexamples would be a strong case for formalization. However, the problem in Milnor's time was that it was not universally accepted to peel off layers of structure from well-known (geometric) objects. Now, where is the current conceptual framework whose formal specification would lead to new (counter)examples by analysing the finer details?

Thomas Eckl (Jul 26 2020 at 15:38):

My bet is on the new ways to "argue up to isomorphism" developed in homotopy type theory, but you may have different suggestions ...
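To make that slightly more concrete (a standard formulation of univalence, not a claim about any particular proof assistant): homotopy type theory postulates that for types $A, B$ in a universe $\mathcal{U}$, the canonical map from identifications to equivalences

$$ (A =_{\mathcal{U}} B) \longrightarrow (A \simeq B) $$

is itself an equivalence. This is what lets constructions and proofs be transported along isomorphisms for free, instead of by hand.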

