Zulip Chat Archive
Stream: condensed mathematics
Topic: profinitely filtered pseudo normed topological groups
Johan Commelin (Jan 29 2021 at 09:25):
Currently, there are a lot of statements about Mbar and Mbar_le that should be generalised to something like the category of "profinitely filtered pseudo normed topological groups". In other words:
- topological groups M with an increasing filtration M_c (for c ≥ 0) such that
- each M_c is a profinite set
- M is pseudo normed, so 0 ∈ M_c, -M_c = M_c, and M_{c₁} + M_{c₂} ⊆ M_{c₁+c₂}
Morphisms of these gadgets should be continuous and bounded homs.
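In Lean this could be sketched roughly as follows (an illustrative sketch only, with made-up field names, not the definition that ended up in the repository):

```lean
import data.real.nnreal

-- Rough sketch of the proposed objects.  `filtration c` plays the
-- role of M_c; each `filtration c` should additionally carry a
-- profinite (compact, Hausdorff, totally disconnected) topology,
-- which is omitted here.
structure pfpng :=
(M : Type*)
[grp : add_comm_group M]
(filtration : nnreal → set M)
(mono : ∀ ⦃c₁ c₂ : nnreal⦄, c₁ ≤ c₂ → filtration c₁ ⊆ filtration c₂)
(zero_mem : ∀ c, (0 : M) ∈ filtration c)
(neg_mem : ∀ ⦃c x⦄, x ∈ filtration c → -x ∈ filtration c)
(add_mem : ∀ ⦃c₁ c₂ x₁ x₂⦄, x₁ ∈ filtration c₁ → x₂ ∈ filtration c₂ →
  x₁ + x₂ ∈ filtration (c₁ + c₂))
```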
Johan Commelin (Jan 29 2021 at 09:25):
Are these things bornological? Does continuity follow from boundedness, or vice versa?
Johan Commelin (Jan 29 2021 at 09:26):
Reason for the generalisation:
- deduplicate statements of 9.4 and 9.5
- functoriality will give the implication 9.5 -> 9.4.
Adam Topaz (Jan 29 2021 at 14:22):
Perhaps it makes sense to generalize this concept to more general categories, so that the profinitely filtered pseudo-normed groups are the special case for the category ProFin, and pseudo-normed abelian groups are the special case for the category Type?
Johan Commelin (Jan 29 2021 at 14:23):
Well, I was thinking of using the predicate for compact subsets (and requiring M to be Hausdorff, totally disconnected).
Johan Commelin (Jan 29 2021 at 14:24):
So then it's not very modular. But maybe easier to use, because there will be only 1 carrier type involved.
Johan Commelin (Jan 29 2021 at 14:24):
But we should definitely experiment a bit with different approaches.
Adam Topaz (Jan 29 2021 at 14:24):
Oh, I see, so the issue is that the "total space" is not profinite? Then just work in locally profinite spaces, or something?
Adam Topaz (Jan 29 2021 at 14:25):
Okay, I guess one has to think about how to combine these two concepts in a workable way, but I feel like we could deduplicate even more if we generalize enough.
Johan Commelin (Jan 29 2021 at 14:26):
Well, we only need pseudo normed groups of the special form that I outlined above, as far as I can see.
Adam Topaz (Jan 29 2021 at 14:27):
Ah, I see.
Johan Commelin (Feb 01 2021 at 09:51):
Another question that I didn't see an immediate answer to: do these profinitely filtered pseudo normed groups have an internal hom?
We already know that the type of group homs between pseudo normed groups is naturally a pseudo normed group. But will its filtration sets be profinite, if this is true for source and target?
Peter Scholze (Feb 01 2021 at 09:58):
definitely not!
Peter Scholze (Feb 01 2021 at 09:58):
Profinite sets do not have internal Homs; the internal Hom is very far from compact
Johan Commelin (Feb 01 2021 at 09:59):
Oooh right.
Johan Commelin (Feb 01 2021 at 10:00):
So we'll need some special work to make Hom(Λ, M) into one of these objects.
But here it works, because Λ is a lattice, hence discrete, and all the filtration sets Λ_c are finite.
Peter Scholze (Feb 01 2021 at 10:01):
Well yes, but it's also not really an internal Hom (or is it?). The definition is a little hand-made....
Johan Commelin (Feb 01 2021 at 10:01):
Right
Johan Commelin (Feb 01 2021 at 10:01):
Grr, I don't know what the LaTeX parser is unhappy about. But I guess it's clear what I mean. Hooray, I found the bad }
Johan Commelin (Feb 01 2021 at 10:04):
I'm currently playing around with formal definitions of the objects. Lean automatically puts a (subspace) topology on the filtration sets M_c, if we put a topology on M. But if the topology on M is defined in terms of topologies on the M_c to start with, this means we now have two topologies on M_c, and Lean will get confused about which one we mean. Of course we mean "who cares, they are equal", but that's not something Lean will accept.
Johan Commelin (Feb 01 2021 at 10:05):
So my current feeling is that maybe we should avoid putting a topology on M.
Peter Scholze (Feb 01 2021 at 10:05):
Definitely!
Peter Scholze (Feb 01 2021 at 10:06):
If you ever want to take products, you will start worrying whether the product topology on M × N is the colimit topology
Peter Scholze (Feb 01 2021 at 10:06):
This is true, but it's a nasty and useless lemma
Johan Commelin (Feb 01 2021 at 10:07):
Ok, that's a good example. (We have this with metric topologies in mathlib.)
Peter Scholze (Feb 01 2021 at 10:08):
But it's not a metric topology!
Peter Scholze (Feb 01 2021 at 10:08):
It's not any kind of standard topology for which you will likely have any API
Johan Commelin (Feb 01 2021 at 10:15):
No, I just meant that we have a similar "problem" there. If X and Y are metric spaces, then you can put the metric topology on X × Y, or the product topology (of the metric topologies on X and Y). To make this usable, we need to use a "trick". We put a separate topology on X × Y, together with a proof that it is equal to the metric topology. And then on X × Y the proof that it's a metric space includes the lemma that the two aforementioned topologies agree.
Peter Scholze (Feb 01 2021 at 10:24):
Oh, I see. That's a neat trick! But I think it's best to avoid mention of the topology here.
Kevin Buzzard (Feb 01 2021 at 10:27):
Peter Scholze said:
That's a neat trick!
Ha! When I first saw it I was horrified! It is the canonical example of a "diamond" in type theory: the product topology equals the topology defined by the product metric, but this equality is not true by definition, so it breaks the system which is supposed to be keeping track of these things automatically, unless you make this hack in the definition of a metric space to also carry around a topology and a proof that the topology is induced by the metric. In fact I think nowadays we carry around the uniform space structure instead.
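A simplified sketch of the hack (mathlib's real `metric_space` is more elaborate, and as Kevin says nowadays goes through `uniform_space`; the structure name here is illustrative):

```lean
import topology.basic
import data.real.basic

-- Sketch: bundle a topology *field* together with a proof that it
-- equals the topology generated by metric balls.  For a product one
-- can then feed in the product topology definitionally, and prove
-- the compatibility lemma once.
structure metric_space' (α : Type*) :=
(dist : α → α → ℝ)
(to_topological_space : topological_space α)
(topology_eq : to_topological_space =
  topological_space.generate_from
    {s | ∃ (x : α) (ε : ℝ), 0 < ε ∧ s = {y | dist x y < ε}})
```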
Johan Commelin (Feb 01 2021 at 10:29):
@Peter Scholze What about the homs in this category. They should be bounded and "continuous" group homomorphisms, right?
Johan Commelin (Feb 01 2021 at 10:30):
But phrasing what "continuous" means will be somewhat roundabout.
Peter Scholze (Feb 01 2021 at 10:30):
I thought the homomorphisms should be strict, i.e. map M_c into N_c
Peter Scholze (Feb 01 2021 at 10:31):
Otherwise the formation of this complex wouldn't even be functorial
Peter Scholze (Feb 01 2021 at 10:31):
And strict morphisms are necessarily continuous. I assume you automatically ask continuity of each map M_c → N_c...
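A sketch of what such a strict morphism could look like in Lean (illustrative names; the project's actual definition may differ):

```lean
import data.real.nnreal

-- Sketch: a strict morphism is an additive map sending the
-- c-filtration of the source into the c-filtration of the target.
structure strict_hom {M N : Type*} [add_comm_group M] [add_comm_group N]
  (FM : nnreal → set M) (FN : nnreal → set N) :=
(to_fun : M → N)
(map_add : ∀ x y, to_fun (x + y) = to_fun x + to_fun y)
(strict : ∀ ⦃c x⦄, x ∈ FM c → to_fun x ∈ FN c)
-- a further field would require each induced map `FM c → FN c`
-- to be continuous for the profinite topologies
```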
Johan Commelin (Feb 01 2021 at 10:33):
Ok, that makes sense. I'm now happy with this approach.
Peter Scholze (Feb 01 2021 at 10:33):
@Kevin Buzzard Well, in higher category theory you learn that "the same" means "up to contractible choice". And you learn that it is often wise to enlarge a point to a larger contractible space to make enough space for certain arguments. So I don't find it bad to somehow enlarge "the topology on a metric space" into an interval.
Peter Scholze (Feb 01 2021 at 10:35):
What I mean to say is that I've seen similar hacks being used by pure mathematicians to justify certain arguments in higher category theory
Kevin Buzzard (Feb 01 2021 at 10:36):
Initially I was concerned that the system would not scale at all well (e.g. maybe we would have to put five auxiliary extra structures on every scheme in order to make things work) but this seems not to be the case.
Kevin Buzzard (Feb 01 2021 at 10:36):
Especially when I discovered that despite the fact that these systems had been around for decades, nobody had bothered to even try and test this by making schemes...
Kevin Buzzard (Feb 01 2021 at 10:38):
Part of the reason we formalised the definition of a perfectoid space was just to see whether this sort of issue actually turned the problem into a nightmare, or whether type theory really did work in practice as a foundation for mathematics beyond proving sqrt(2) is irrational and all finite groups of odd order were solvable. Turns out it did.
Peter Scholze (Feb 01 2021 at 10:48):
One thing I find mildly confusing about type theory is that its advocates claim that when you form, say, x/y, it automatically checks for you that y is not zero, and that this is a very good thing. But in practice, things seem to be the other way around: You do not want to be bothered having to check that, so instead you simply define it to be some junk value in case y is not as expected. I've seen many such hacks in the code now, and it makes me believe that this whole justification for type theory is not really sound
Johan Commelin (Feb 01 2021 at 10:50):
I agree. In practice it's an art to find the right balance.
Johan Commelin (Feb 01 2021 at 10:51):
I really view the type system as a tool. It can figure out a lot of things for me, because of the implicit information carried around by types. But sometimes it gets in your way, and then we resort to "junk-value hacks".
Johan Commelin (Feb 01 2021 at 10:55):
I think it is in some sense an "unsolved issue". Sometimes having "correctly typed" functions really makes life a lot harder. Sometimes it just makes the code very ugly to read. I think it's worth a lot that we can just write 1/x, instead of 1/<x, proof that x \ne 0>. If we could have solid automation deriving those proofs for us, then I would be a lot happier about using 1/x as notation for div 1 x (auto-inferred proof that x \ne 0).
Johan Commelin (Feb 01 2021 at 10:55):
But currently such automation is typically slow or brittle.
Johan Commelin (Feb 01 2021 at 10:56):
On the other hand, there are still lots of places where the type system prohibits us from writing nonsense.
Peter Scholze (Feb 01 2021 at 10:57):
Johan Commelin said:
On the other hand, there are still lots of places where the type system prohibits us from writing nonsense.
In the sense that this is actively helpful during the coding, catching stupid slips early on?
Johan Commelin (Feb 01 2021 at 10:59):
Yes, I definitely think so. Before you launch into the proof, you often already make a bunch of typos that you can immediately correct.
Johan Commelin (Feb 01 2021 at 10:59):
But also because it will just infer a lot of information.
Peter Scholze (Feb 01 2021 at 10:59):
OK, I see
Johan Commelin (Feb 01 2021 at 11:00):
For example, we can write x + y, which is in some sense quite a miracle. We don't have to explain to Lean which + we mean all the time.
Johan Commelin (Feb 01 2021 at 11:02):
Lean sees x + y, and thinks: "now I need to figure out what + this is. So let's look up the type of x. Aha, that is R. Now let's look for has_add R in my database. Ok, there is some _inst_3 : ring R, and I know how to turn a ring into a has_add through a series of steps."
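A toy illustration of this (the `_inst_3` above is just a name Lean generates for the instance):

```lean
import algebra.ring.basic

variables {R : Type*} [ring R] (x y : R)

-- Lean elaborates `x + y` by looking up the type of `x` (namely `R`),
-- searching for a `has_add R` instance, and finding one through the
-- chain ring R → add_comm_group R → ... → has_add R.
example : x + y + 0 = y + x := by rw [add_zero, add_comm]
```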
Peter Scholze (Feb 01 2021 at 11:02):
Aha, I see the point. That's great!
Johan Commelin (Feb 01 2021 at 11:03):
Similarly, if C : system_of_complexes and x is an element of one of its groups, we can just write C.d x, and Lean will figure out which differential we mean exactly.
Johan Commelin (Feb 01 2021 at 11:04):
It will be able to infer c and i from the type of x, namely C.X c i (if I got the order of c and i right...)
Johan Commelin (Feb 01 2021 at 11:05):
But that doesn't change the fact that you are completely right that 1/x is hacky.
Johan Commelin (Feb 01 2021 at 11:06):
On the other hand, when I work in set theory, I just ignore the issue. I never check (ok, maybe in 1% of the cases) that x ≠ 0. And I think I'm not the only one who "cheats" like this.
Kevin Buzzard (Feb 01 2021 at 11:07):
When I was doing undergraduate example sheets there were questions about square roots of real numbers, and Lean did not have square roots at that time, and so I defined sqrt(x, [proof that x>=0]) to be the usual sup, and the computer scientists laughed at me for writing a function which took two inputs and then never used one of them.
Kevin Buzzard (Feb 01 2021 at 11:10):
Turns out that if you feed in a set S which has no sup into the sup function, it just returns some junk value. At the time I was horrified by this, but it had got to the stage where I knew enough about what I was doing to make my square root function work anyway, and I was adamant that it was the "correct" thing to do. In the mean time, the computer scientists (who presumably could see what was about to happen) implemented a square root function which returned the correct value for x>=0 and a junk value for x<0. I then started on the example sheet questions and they drove me nuts! I was constantly having to insert proofs that e.g. 2>=0, and in proofs showing stuff like sqrt(2)+sqrt(3)<sqrt(10) it soon got tiring.
Peter Scholze (Feb 01 2021 at 11:22):
Johan Commelin said:
Similarly, if C : system_of_complexes and x is an element of one of its groups, we can just write C.d x, and Lean will figure out which differential we mean exactly.
Wow! Even in my very "sloppy" notes, I felt the need to put more quantifiers.
Johan Commelin (Feb 01 2021 at 11:25):
Yup, but in VScode, you can click on the C.d in the "Goal window", and it will tell you exactly which c and i it inferred.
Johan Commelin (Feb 01 2021 at 11:26):
I think this is the start of where interactive theorem provers become interactive. There is a whole lot more that can be done here... and people are working on it. I'm excited about interactive diagram chases, etc...
Patrick Massot (Feb 01 2021 at 12:13):
The question of 1/0 is all about "where do you want to put proofs?". We want to talk first and apologize later (and only if really needed), because it saves a lot of typing. So, given a field K, we define division as a function from K × K to K, without any precondition, and it satisfies 1/0 = 0. This is a definition, it can't be wrong in the sense that it cannot bring a contradiction by itself. Now of course Lean won't let you prove x * (1/x) = 1 without asking for a proof that x is non-zero. The lemma simplifying this expression asks for a proof, the definition of 1/x doesn't. Note that apologizing is not always needed. For instance (x + y)/z = x/z + y/z is true without any condition on z. That's because we've clearly chosen the "junk value" 1/0 = 0. This is simply convenient in practice.
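A hedged illustration in Lean (lemma names as in mathlib around that time, to the best of my knowledge):

```lean
import data.rat.basic

-- division is total, with the junk value x / 0 = 0
example (x : ℚ) : x / 0 = 0 := div_zero x

-- "no apology needed": true even when z = 0, thanks to the junk value
example (x y z : ℚ) : (x + y) / z = x / z + y / z := add_div x y z

-- here Lean demands the apology `hx : x ≠ 0`
example (x : ℚ) (hx : x ≠ 0) : x / x = 1 := div_self hx
```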
Johan Commelin (Feb 01 2021 at 12:16):
I like that slogan Patrick. "speak first, apologize later".
Patrick Massot (Feb 01 2021 at 12:25):
This has basically no mathematical content. The other trick that Johan mentioned (with metric spaces being packages containing a distance function, a uniform structure and a proof of the fact that the uniform structure is compatible with the distance) does have some clarifying content. We met this in a slightly more sophisticated context while working on basic infrastructure for perfectoid spaces. An abelian topological group G has a unique reasonable uniform structure. Now complete this uniform structure to get a new uniform space Ĝ and extend the group operations to Ĝ. The new group operations are continuous. So Ĝ with its group structure and the topology underlying its uniform structure is a topological group. And now we have two uniform structures on Ĝ: the one offered by the completion functor acting on all uniform spaces and the reasonable one on a topological group. These are the same uniform structure, but this is not obvious. Bourbaki proves they are the same, but this proof is pretty well hidden inside a long discussion. This subtlety is painful for Lean if not handled through the trick described by Johan. But I claim you also get added clarity by openly defining a compatibility condition between a group structure and a uniform structure and openly stating 1) given a topological abelian group there is a unique uniform structure which is compatible with both its group structure and its topology and 2) if a group structure and a uniform structure on G are compatible then the group structure and the uniform structure on Ĝ are compatible.
Peter Scholze (Feb 01 2021 at 12:50):
I can totally relate to that second trick, but remain minimally perplexed by this thing. (Another reason I remain perplexed by it is that another reason type theory is supposed to be great is that in suitable versions of type theory, you have to work constructively/continuously/internally in a topos/whatever, and x ↦ 1/x with 1/0 = 0 is a very discontinuous extension of x ↦ 1/x. I guess in such versions of type theory, this cheat wouldn't work?) But I'm derailing this conversation, and also should do other things...!
Mario Carneiro (Feb 01 2021 at 12:58):
The type theory purist would indeed argue that division should have the type div : ∀ (x y : R), y ≠ 0 → R. The concession to simple types (a total div with junk values) is however very useful in practice. Personally I think it's great that lean has the tools to express very dependent types even if they are used only sparingly. We know that it's possible to get by with essentially no types at all (ZFC), but you get less assistance with type errors and typing when you have to do everything by hand. Lean (and mathlib specifically) represents a compromise on that spectrum, where lean can support very precise types but it's also okay to be a little inelegant sometimes and totalize when type purism hurts more than it helps.
Mario Carneiro (Feb 01 2021 at 12:59):
(Actually, the real purist would replace y != 0
with y # 0
using the apartness relation on the reals, but let's not get into that.)
Johan Commelin (Feb 01 2021 at 14:05):
I pushed a definition of the category.
Johan Commelin (Feb 01 2021 at 14:24):
The downside of making everything functorial is that we need to give things names that would usually be covered by notation.
Johan Commelin (Feb 01 2021 at 14:27):
We used to have maps

(Mbar_le r' S c)^m -> (Mbar_le r' S c')^n

But now this has to become a natural transformation between functors that raise things to the power m and n. And so we can't use the ^m notation.
Johan Commelin (Feb 01 2021 at 14:28):
But I don't see how to avoid this. We'll have to think about how to make it look good later.
David Michael Roberts (Feb 01 2021 at 23:43):
Peter Scholze said:
(Another reason I remain perplexed by it is that another reason type theory is supposed to be great is that in suitable versions of type theory, you have to work constructively/continuously/internally in a topos/whatever, and x ↦ 1/x with 1/0 = 0 is a very discontinuous extension of x ↦ 1/x. I guess in such versions of type theory, this cheat wouldn't work?)
Yes.
Lean's logic is classical, so you are working in a Boolean topos, where the by-cases definition is fine. Alternatively, this 1/0 = 0 hack means implicitly you are working with a discrete field, not one of the more general versions (e.g. a Heyting field, which uses #). In other proof assistants that stick more closely to the constructive world this hack would not be ok.
Kevin has publicly said this is a big plus for Lean, and for all the "generic" mathematics he wants to see done, as it favours usability over purity.
Kevin Buzzard (Feb 02 2021 at 06:58):
The reason it doesn't work in constructive settings is that given a random term representing a real number, there's no algorithm which tells you if it's zero or not, so constructively you can't make a definition starting "if x = 0 then..."
Peter Scholze (Feb 02 2021 at 09:52):
@David Michael Roberts Thanks! I definitely side with Kevin on the question of constructivity, my love for (Grothendieck) topoi notwithstanding (sorry!...).
One reason I like sets is that they are an extremely primitive language: Extremely few basic sorts of objects and relations. But you can still base everything on them, and at some point you can put some higher level languages on top of them. Type theory is much less primitive, it seems to me -- I never quite understood what a type is, I think this also depends heavily on the precise setup, and what the primitive objects and relations are is a little in flux, I believe -- often you allow yourself to produce new "methods of forming types" at will(?). This seems more like the kind of thing you would like to do on top of other foundations, not in the foundations themselves. That types are closer to how you actually think is not an argument that they are better foundations, only that you want to have some interface that has some type-like behaviour. If I want to work in a general cartesian-closed topos or whatever, I should be free to say so, but I shouldn't be forced to from the start. (Back when I was coding in C++ in my early youth, I used to appreciate C++'s "pointers" with which you could access arbitrary points in memory. I think that's a little akin to asking whether is an element of . I assume use of pointers, if at all allowed in current languages, is considered very bad form? But somehow in the end whatever the computer is doing is saved somewhere on memory, and the question "what is the bit at this point of memory" is always a well-defined question, although often not one you should be asking as it misses context. I feel the same about the foundations of mathematics -- type theory seems to be like a way of denying that there's actually some physical memory with actual bits somewhere.)
Johan Commelin (Feb 02 2021 at 10:04):
Ha, you sound the same as @Mario Carneiro (-;
Johan Commelin (Feb 02 2021 at 10:04):
He frequently says "types are an editor feature".
Johan Commelin (Feb 02 2021 at 10:10):
I think I agree with what you say. But to carry on with the analogy with programming languages (which is a good one, I think): programming languages with "strong" foundations, like Rust or Lean 4 can make huge compiler optimizations because they have a very strong set of assumptions on what can and cannot be done with memory. There are benchmarks that show that Lean 4 beats the C++ stdlib on lots of common tasks.
Johan Commelin (Feb 02 2021 at 10:12):
Of course mathlib is only a single data point. But the speed of growth of mathlib is tremendous compared to other libraries that have been around for decades. But who knows what Mario's Metamath Zero will accomplish. It's similar to what you describe. A low level language, with a type theory bolted on top (MM1).
Peter Scholze (Feb 02 2021 at 10:12):
Aha, that's great!
Johan Commelin (Feb 02 2021 at 10:16):
About your point of "methods of forming types". I think the list is not so long, and quite natural. Basically, we have Π-types and Σ-types.
And what I really like about type theory is that it merges the "set-theoretic" part of the foundations and the "logic" part. Because "sets are types" and "propositions are types". A function X → Y between sets uses the same machinery as an implication P → Q of propositions.
Johan Commelin (Feb 02 2021 at 10:16):
So that seems like a simplification to me, compared to the usual foundations.
Peter Scholze (Feb 02 2021 at 10:22):
yes and no. There's this whole hierarchy of n-truncated homotopy types: For n = -2, just the one-point set, for n = -1, the empty set and the one-point set, for n = 0 all sets, and then you get into actual homotopy types. I can see the appeal of starting directly from all homotopy types and then specializing, but I have never seen a convincing proposal for how to do this -- I think this is largely what HoTT is trying to achieve (and I think it's a very interesting and important question). So I think it's fine to instead start building it up from below, and then going in one-by-one steps seems like the good thing, instead of jumping right to n = ∞ (why not n = 1 and allow groupoids as well? Why not n = 2? etc.)
Johan Commelin (Feb 02 2021 at 10:24):
I agree, and I'm certainly rooting for HoTT. Unfortunately there are still usability issues with HoTT in practice. Where "practice" means, in a formal theorem prover.
Johan Commelin (Feb 02 2021 at 16:30):
I pushed some wip on "continuous" maps between profinitely filtered pseudo-normed groups. There are some sorrys left. They are maths-trivial... but I need to have dinner first.
Johan Commelin (Feb 03 2021 at 07:17):
These sorrys are now done. I just realised that the objects in this category should also have an endomorphism T⁻¹.
Johan Commelin (Feb 03 2021 at 07:18):
So, I'll need to modify everything a bit. By now I no longer know how to call these objects...
Johan Commelin (Feb 03 2021 at 07:18):
profinitely_filtered_pseudo_normed_group_with_Tinv is a bit... long
Kevin Buzzard (Feb 03 2021 at 07:28):
So you went with adding the topology to the abstract algebra? You could call them profinitely filtered pseudo normed modules
Johan Commelin (Feb 03 2021 at 07:32):
But that doesn't mention the T
Johan Commelin (Feb 03 2021 at 07:33):
There is now no topology on M, only on the filtration sets M_c. But I introduce a new "continuity" predicate for functions M → N, and prove the minimal amount of stuff that we need about them.
Peter Scholze (Feb 03 2021 at 08:37):
Note that T⁻¹ does not send M_c into M_c. What I suggested once was to define a rescaling functor that takes some r and sends M to M(r) with filtration M(r)_c = M_{c/r}, and then T⁻¹ is a map from M to M(r')
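In Lean the rescaling could be sketched like this (a sketch under the assumption that the filtration index is divided by the factor; the convention actually used in the project may differ):

```lean
import data.real.nnreal

open_locale nnreal

-- same carrier, relabelled filtration (sketch; name illustrative)
def rescale (r : ℝ≥0) (M : Type*) : Type* := M

-- if M carries a filtration F : ℝ≥0 → set M, one would equip
-- `rescale r M` with the filtration c ↦ F (c / r); then a map T⁻¹
-- satisfying T⁻¹ (F c) ⊆ F (c / r') is *strict* when viewed as a
-- map M → rescale r' M.
```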
Johan Commelin (Feb 03 2021 at 08:39):
Yes, that might be the best approach
Johan Commelin (Feb 03 2021 at 08:39):
We need the fact that T⁻¹ is "additive", right?
Johan Commelin (Feb 03 2021 at 08:40):
It has to commute with addition, so that it commutes with the maps coming from BD.
Peter Scholze (Feb 03 2021 at 08:40):
Sure
Peter Scholze (Feb 03 2021 at 08:40):
But that ought to be automatic in what I said above
Johan Commelin (Feb 03 2021 at 08:41):
But this still means we need a two stage rocket. Because we can't define the rescaling functor before defining profinitely filtered pseudo normed groups
Peter Scholze (Feb 03 2021 at 08:42):
Yes, but that seems very reasonable to me. In fact better than cooking the T⁻¹ into the first stage
Johan Commelin (Feb 03 2021 at 08:42):
Another option would be to have non-strict maps in the first stage.
Peter Scholze (Feb 03 2021 at 08:43):
But that would destroy lots of functoriality, and control over constants
Johan Commelin (Feb 03 2021 at 08:43):
That might also help, because scalar multiplication with n would become a morphism
Johan Commelin (Feb 03 2021 at 08:43):
And so the universal maps coming from BD would also become morphisms.
Peter Scholze (Feb 03 2021 at 08:43):
Hmm, I disagree
Johan Commelin (Feb 03 2021 at 08:44):
We would regain the functoriality on the subcategory with strict homs and T⁻¹.
Peter Scholze (Feb 03 2021 at 08:44):
See the appendix to lecture 9: You carefully choose constants in the BD complex to make scalar multiplication by n well-defined
Peter Scholze (Feb 03 2021 at 08:45):
Well my experience from thinking about the pure math is that it's better to work with the strict notion, and work around the fact that certain maps are not strict by explicitly keeping track of required rescalings
Johan Commelin (Feb 03 2021 at 08:46):
So then multiplication by n would also use the rescaling functor, right?
Johan Commelin (Feb 03 2021 at 08:46):
I guess that would work
Peter Scholze (Feb 03 2021 at 08:46):
Yes
Johan Commelin (Feb 03 2021 at 08:47):
I'll start working on the second stage of the rocket (-;
Patrick Massot (Feb 03 2021 at 09:11):
Then we should stay abstract and not try to get down to earth because landing rockets is tricky.
Johan Commelin (Feb 03 2021 at 09:34):
I'm worried that the rescaling functor will lead to type theoretic motive headaches. (Yes, we wouldn't have those issues in set theory.)
Johan Commelin (Feb 03 2021 at 09:35):
We'll end up with types filtration M (r * (c1 + c2)) and filtration M (r * c1 + r * c2) that are not definitionally equal.
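A toy illustration of the problem (any `F : ℝ≥0 → Type*` will do):

```lean
import data.real.nnreal

open_locale nnreal

variables (F : ℝ≥0 → Type*) (r c₁ c₂ : ℝ≥0)

-- the two types are propositionally equal, via `mul_add`...
example : F (r * (c₁ + c₂)) = F (r * c₁ + r * c₂) := by rw mul_add

-- ...but not definitionally, so moving a term across the equality
-- needs an explicit rewrite/cast
example (x : F (r * (c₁ + c₂))) : F (r * c₁ + r * c₂) :=
by { rw ← mul_add, exact x }
```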
Mario Carneiro (Feb 03 2021 at 09:35):
= :rainbow:
Peter Scholze (Feb 03 2021 at 09:37):
What about the previous cheat of putting extra proofs that the rescaling factor is what you think it is?
Johan Commelin (Feb 03 2021 at 09:37):
That's another option
Johan Commelin (Feb 03 2021 at 09:38):
I'll just play around. My intuition is pushing me towards a predicate is_rescaling_hom r... but you warned against this, which is making me doubtful.
Peter Scholze (Feb 03 2021 at 09:39):
Hmm no that might be OK too
Peter Scholze (Feb 03 2021 at 09:39):
As long as the rescaling is mentioned explicitly
Peter Scholze (Feb 03 2021 at 09:40):
Well, whatever you think works to formalize cleanly the BD resolutions in the appendix to lecture 9
Johan Commelin (Feb 03 2021 at 09:40):
The suitable predicate worked quite well there.
Johan Commelin (Feb 03 2021 at 09:41):
I'll take another careful look at that appendix, to see what we need.
Johan Commelin (Feb 03 2021 at 09:42):
My guess is that we'll have another predicate very_suitable that will assert that a sequence of constants is not only suitable, but also works for the homotopies from that appendix.
Johan Commelin (Feb 03 2021 at 09:43):
But I agree that we should keep track of the constants. It just seems that it's better to pack them closer to the homs than to the objects
Peter Scholze (Feb 03 2021 at 09:51):
OK, sounds good. I think I misunderstood the proposal. I was thinking about the analogue of any bounded homomorphism of Banach spaces, where that notion does not keep track of the bound. Such a notion would definitely be too weak
Johan Commelin (Feb 03 2021 at 12:35):
Another design question: we want T⁻¹ to send M_c into M_{c/r}. This is a condition on the action of T⁻¹ that will be bundled into the objects. But do we want to vary r with the objects? Or is r fixed for the entire category?
Johan Commelin (Feb 03 2021 at 12:38):
Since homs need to commute with T⁻¹, I'm guessing that r is fixed for the entire category. On the other hand, there are natural inclusions (that don't commute with T⁻¹) between the categories for different r.
Peter Scholze (Feb 03 2021 at 13:27):
For now, you should think of r as fixed forever
Johan Commelin (Feb 03 2021 at 13:27):
Ok, thanks. That's what I was hoping :sweat_smile:
Peter Scholze (Feb 03 2021 at 13:28):
Although I guess, in the notation of the lecture notes, it is r' that is relevant here
Johan Commelin (Feb 03 2021 at 13:28):
But if (when?!) we get there, we can start worrying about how to move r...
Peter Scholze (Feb 03 2021 at 13:28):
But both are completely fixed throughout lecture 9
Peter Scholze (Feb 03 2021 at 13:29):
hmm, yes, but there you don't have to worry about T⁻¹ for example
Johan Commelin (Feb 03 2021 at 19:50):
@Peter Scholze For the time being, we don't need functoriality in the normed group V either, right?
Peter Scholze (Feb 03 2021 at 22:07):
No, we don't! V is fixed throughout lecture 9.
Johan Commelin (Feb 04 2021 at 21:05):
The second stage of the rocket is done. The third stage is coming along nicely.
Johan Commelin (Feb 04 2021 at 21:50):
If anybody is looking for something to work on, you can show that (weak) exactness is isomorphism invariant for isos of systems of complexes.
Johan Commelin (Feb 04 2021 at 21:51):
Another option is to turn Mbar into an object of ProFiltPseuNormGrpWithTinv, although that isn't as follow-your-nose as the other thing.
Riccardo Brasca (Feb 04 2021 at 21:52):
I can do something next week... too much teaching this week!
Kevin Buzzard (Feb 04 2021 at 21:53):
I will have several hours tomorrow. I was just tidying up the stuff I did last weekend.
Kevin Buzzard (Feb 04 2021 at 21:54):
I could try either thing -- do you have a preference?
Johan Commelin (Feb 04 2021 at 21:55):
I don't really care
Riccardo Brasca (Feb 04 2021 at 21:55):
The iso invariance should be easy... or there is something subtle I am missing?
Johan Commelin (Feb 04 2021 at 21:55):
No, it should be easy. But it has to be done.
Johan Commelin (Feb 04 2021 at 21:56):
I think the Mbar thing should be done by somebody who wants to learn all about the API for Mbar and ProFiltBazQuuxs
Johan Commelin (Feb 04 2021 at 21:56):
I'll probably do it tomorrow, if nobody beats me.
Johan Commelin (Feb 04 2021 at 21:57):
Once those things are done, and I finish the functoriality of the systems of complexes, then the only thing stopping us from stating 9.5 is the defn of polyhedral lattices, and turning Hom(Λ, Mbar) into a ProFiltBazQuux.
.
Johan Commelin (Feb 04 2021 at 21:58):
I haven't thought hard about that last thingy, but it shouldn't cause any surprises.
Kevin Buzzard (Feb 04 2021 at 22:01):
I was going to start on Laurent series until I saw this thread. I'm flexible.
Scott Morrison (Feb 04 2021 at 22:27):
"(weak) exactness is isomorphism invariant for isos of systems of complexes" is https://github.com/leanprover-community/lean-liquid/blob/master/src/system_of_complexes.lean#L197
Riccardo Brasca (Feb 04 2021 at 22:31):
We will probably also need the weak version, using is_weak_bdd_exact_for_bdd_degree_above_idx (with identical proof I think)
Scott Morrison (Feb 04 2021 at 22:40):
ah, okay
Scott Morrison (Feb 04 2021 at 22:40):
I'm having a go at this now
Scott Morrison (Feb 04 2021 at 23:01):
I've only just started reading these definitions, so I'm probably missing something, but I don't see why this iso-invariance lemma is true. I see that C₁.is_bdd_exact_for_bdd_degree_above_idx k m c₀ and f : C₁ ≅ C₂ imply C₂.is_bdd_exact_for_bdd_degree_above_idx k' m c₀ for some potentially _larger_ k', but not that you can use the same k. Perhaps I am just misunderstanding something basic about the definitions.
Scott Morrison (Feb 04 2021 at 23:02):
f doesn't preserve norms, does it?
Adam Topaz (Feb 04 2021 at 23:12):
@Scott Morrison That's right, a morphism of normed abelian groups is only assumed to be bounded (with respect to some constant C). Maybe k' = C * k?
Scott Morrison (Feb 04 2021 at 23:12):
Something like that. There will be appearances of f on both sides of the inequality, so potentially two appearances of that constant C.
Adam Topaz (Feb 04 2021 at 23:13):
But there are many morphisms involved in one of these systems of complexes, so it's unclear whether a single k' would work.
Adam Topaz (Feb 04 2021 at 23:13):
I've also been too busy to follow the discussion here the last few weeks, so maybe someone else can clarify.
Scott Morrison (Feb 04 2021 at 23:15):
Hmm, yes. The different components of f : C₁ ≅ C₂ could all have different bounds...
Riccardo Brasca (Feb 04 2021 at 23:17):
Ah, I thought f preserved the norm... if this is not the case I agree it's not true
Scott Morrison (Feb 04 2021 at 23:18):
maybe we've just set up the category structure on system_of_complexes incorrectly, and it is meant to only have norm-nonincreasing morphisms?
Scott Morrison (Feb 04 2021 at 23:18):
in which case an isomorphism would be norm-preserving?
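If the morphisms were all required to be norm-nonincreasing, an isomorphism would indeed be forced to be isometric. A minimal Lean 3 sketch of that fact, using bare functions and early-2021 mathlib names (`normed_group`) rather than the project's bundled homs:

```lean
import analysis.normed_space.basic

-- Sketch: if `f` and a two-sided inverse `g` are both norm-nonincreasing,
-- then `f` preserves norms.  (Bare functions, not `normed_group_hom`.)
example {V W : Type*} [normed_group V] [normed_group W]
  (f : V → W) (g : W → V) (hgf : ∀ x, g (f x) = x)
  (hf : ∀ x, ∥f x∥ ≤ ∥x∥) (hg : ∀ y, ∥g y∥ ≤ ∥y∥) (x : V) :
  ∥f x∥ = ∥x∥ :=
le_antisymm (hf x) (by simpa [hgf x] using hg (f x))
```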
Adam Topaz (Feb 04 2021 at 23:18):
If you think of just the case of normed abelian groups, and you have an isomorphism f with inverse g, then f has an associated constant C and same with g, say D, so you would end up with |x| ≤ C * D * |x| for all x. Of course, this doesn't mean that C = 1
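Written out, the chain of inequalities behind this (assuming ∥f(x)∥ ≤ C∥x∥ and ∥g(y)∥ ≤ D∥y∥) is:

```latex
\|x\| = \|g(f(x))\| \le D\,\|f(x)\| \le D C\,\|x\| \quad \text{for all } x,
```

so for x ≠ 0 one only learns DC ≥ 1, not C = 1 or D = 1.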
Scott Morrison (Feb 04 2021 at 23:20):
I haven't discovered yet where we want to use this isomorphism-invariance result, so I don't know what statement we really want.
Riccardo Brasca (Feb 04 2021 at 23:20):
Yes, we should wait for someone who has a better understanding of the big picture
Riccardo Brasca (Feb 04 2021 at 23:21):
There was a discussion about optimizing the constants, so the existence of some k is definitely weaker than what we want
Floris van Doorn (Feb 05 2021 at 00:16):
I'm interested in joining this project. I just finished a paper I was writing on the formalization of Haar measure, so I have time now.
I looked a bit at the library, and I tried this lemma, and am stuck at exactly the same place (also not having any high-level idea of what I'm doing :) )
Johan Commelin (Feb 05 2021 at 05:45):
@Scott Morrison A system is admissible if all the maps are norm-nonincreasing. So we can have two versions of "iso-preserving", I guess.
Johan Commelin (Feb 05 2021 at 05:46):
But in practice, we will apply this to admissible examples only, so it's fine to add that as an assumption.
Scott Morrison (Feb 05 2021 at 05:46):
but admissible is about the maps within a system, not the morphisms between them?
Johan Commelin (Feb 05 2021 at 05:46):
Oooh, you are right.
Johan Commelin (Feb 05 2021 at 05:47):
Sorry, let me think for a sec about how I want to apply the iso invariance
Johan Commelin (Feb 05 2021 at 05:48):
In NormedGroup we need norm-increasing morphisms, because we'll have automorphisms T that rescale the norm by some r.
Johan Commelin (Feb 05 2021 at 05:49):
But for the iso-invariance, we can assume that the maps between the systems are norm-noninc.
Scott Morrison (Feb 05 2021 at 05:50):
is it just that for system_of_complexes the morphisms should be norm-nonincreasing?
Scott Morrison (Feb 05 2021 at 05:50):
or do we even want norm-increasing morphisms there?
Johan Commelin (Feb 05 2021 at 05:51):
I don't know yet
Johan Commelin (Feb 05 2021 at 05:52):
For the implication 9.5 => 9.4 we only need norm-noninc. But I don't know about the proof of 9.5.
Johan Commelin (Feb 05 2021 at 05:53):
I think it should be fine to make everything norm-noninc. Because the main tool is the normed spectral sequence 9.6. And that one is assumed to be admissible.
Peter Scholze (Feb 05 2021 at 08:45):
Isomorphism invariance is definitely one of the things I might use without noticing. But I'm actually confused where we want to use it. Currently, I can only think of situations where the two are literally the same to my eyes, so in particular everything is compatible with norms.
Peter Scholze (Feb 05 2021 at 08:47):
I agree that for an isomorphism that has a weird behaviour with respect to norms, the translation may not be so obvious, but I don't think such a situation appears.
Johan Commelin (Feb 05 2021 at 09:00):
@Peter Scholze In the proof of 9.5 => 9.4 you use that Hom(Z, Mbar) is the same as Mbar.
Johan Commelin (Feb 05 2021 at 09:00):
But Lean doesn't agree that they are the same-same-same.
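Stripped of norms and filtrations, the identification in question is just evaluation at 1. A sketch in Lean 3 (lemma names as in late mathlib 3); the point is that this is an `equiv`, not a definitional equality:

```lean
import data.int.basic
import algebra.group_power.basic

-- Evaluation at 1 identifies additive homs ℤ →+ M with M.
-- This is an equiv, not `rfl` — which is why Lean "doesn't agree".
example {M : Type*} [add_comm_group M] : (ℤ →+ M) ≃ M :=
{ to_fun := λ f, f 1,
  inv_fun := λ m,
  { to_fun := λ n, n • m,
    map_zero' := zero_zsmul m,
    map_add' := λ a b, add_zsmul m a b },
  left_inv := λ f, add_monoid_hom.ext_int (one_zsmul (f 1)),
  right_inv := λ m, one_zsmul m }
```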
Johan Commelin (Feb 05 2021 at 09:05):
/-- The system of complexes
`V-hat(M_{≤c})^{T⁻¹} ⟶ V-hat(M_{≤c_1c}^2)^{T⁻¹} ⟶ ...`
as a functor in `M`.
See also `system`. -/
@[simps]
def System (r : ℝ≥0) (V : NormedGroup) [normed_with_aut r V] [fact (0 < r)]
(r' : ℝ≥0) [fact (0 < r')] [fact (r' ≤ 1)] :
(ProFiltPseuNormGrpWithTinv r')ᵒᵖ ⥤ system_of_complexes :=
{ obj := λ M, BD.system c' r V r' (unop M),
map := λ M₁ M₂ f, system.map BD c' r V r' f.unop,
map_id' := λ M, by apply system.map_id,
map_comp' := λ M₁ M₂ M₃ f g, by apply system.map_comp }
Johan Commelin (Feb 05 2021 at 09:06):
So now we need to turn Mbar into an object of the source category, and then we can restate 9.4.
Much ado about nothing...
Johan Commelin (Feb 05 2021 at 09:06):
But the upshot is that we are now also quite a bit closer to the statement of 9.5
Johan Commelin (Feb 05 2021 at 09:07):
Once Mbar is a ProFiltDingsBums we can also delete several hundred lines of dead code :sweat_smile:
Peter Scholze (Feb 05 2021 at 09:11):
OK, great! But then the norms are literally the same.
Johan Commelin (Feb 05 2021 at 09:12):
Sure, this is one of those hyper-canonical identifications that Lean will stumble over, but of course it's utterly trivial.
Peter Scholze (Feb 05 2021 at 09:13):
Sure, that's OK. I was just trying to answer the question that came up in the chat above, about which version of isomorphism invariance is required
Johan Commelin (Feb 05 2021 at 09:13):
Ok, yes. That's helpful. You're saying that in the proof of 9.5 we will not need a more advanced version. That's useful info.
Riccardo Brasca (Feb 05 2021 at 09:16):
So in practice we can add the assumption that the isomorphism preserves the norm?
Riccardo Brasca (Feb 05 2021 at 09:16):
If this is the case I can try to prove it now
Johan Commelin (Feb 05 2021 at 09:17):
Maybe we should do as Scott suggested, and bake this into the definition of morphism of system_of_complexes.
Riccardo Brasca (Feb 05 2021 at 09:19):
But this means changing the category, right? Morphisms of systems of complexes are never explicitly defined, they are just natural transformations. And if we do so we can never consider norm-increasing morphisms
Johan Commelin (Feb 05 2021 at 09:21):
True. So my inclination is to just add it as unbundled assumptions on top of morphisms-as-nat-trans when we need it.
Riccardo Brasca (Feb 05 2021 at 09:35):
So something like
lemma of_iso (h : C₁.is_bdd_exact_for_bdd_degree_above_idx k m c₀) (f : C₁ ≅ C₂)
(hnorm : ∀ c i (x : C₁.X c i), ∥(f.apply c i).1 x∥ ≤ ∥x∥ ) :
C₂.is_bdd_exact_for_bdd_degree_above_idx k m c₀ :=
Peter Scholze (Feb 05 2021 at 09:37):
I think I strive to only ever consider norm-nonincreasing maps. If you formalize 9.6, you will find that it's best to have this assumption baked into the foundations, probably...
Peter Scholze (Feb 05 2021 at 09:38):
On the other hand, in the proof of 9.5, there is a point where I have to artificially divide by some factorials to make this assumption true. But I think it's still best to do the work there. In general, it's best to have norm-nonincreasing as the standard and make additional arguments when that's violated, because bounds are so important in this argument
Scott Morrison (Feb 05 2021 at 09:40):
It shouldn't be hard to do this. We can't just use a @[derive] statement to summon the category structure of system_of_complexes, but it will only be a few lemmas to define the subcategory of norm-nonincreasing maps. (And I guess some straightforward helpful lemmas, e.g. that isomorphisms thus preserve norms.)
Scott Morrison (Feb 05 2021 at 09:41):
We already have a special apply function to apply a morphism to an element, so it will be no harder to use, whether the morphisms are collections of natural transformations or collections of natural transformations satisfying a condition.
Johan Commelin (Feb 05 2021 at 09:42):
@Peter Scholze T⁻¹ acts as automorphism on Mbar. But do you ever consider the action of T? Or should we turn it into a norm-noninc endomorphism?
Peter Scholze (Feb 05 2021 at 09:47):
I think I only ever use the action of T⁻¹ in Lecture 9. I'm sorry that this is bad notation... but T didn't feel much better to me.
Peter Scholze (Feb 05 2021 at 09:47):
Also note that T doesn't act on Mbar!
Johan Commelin (Feb 05 2021 at 09:48):
Right, only T⁻¹ does...
Riccardo Brasca (Feb 05 2021 at 09:50):
I have no idea how to modify the definition, so I will let someone else do it :stuck_out_tongue:
Johan Commelin (Feb 05 2021 at 11:07):
https://github.com/leanprover-community/lean-liquid/blob/master/src/Mbar/Mbar_le.lean#L583
two sorrys here that shouldn't be too hard...
Other than that, Mbar is now a profinitely_filtered_pseudo_normed_group_with_Tinv, and so we can use the general machinery to get a system of complexes (instead of the "ad hoc" approach that we first used to get the statement of 9.4).
Kevin Buzzard (Feb 05 2021 at 11:46):
Is someone working on these? I can try them now
Johan Commelin (Feb 05 2021 at 11:46):
I'm not
Johan Commelin (Feb 05 2021 at 11:47):
I need to have lunch, and after that there is a seminar talk in Freiburg by a certain Heather Macbeth :smiley: :laughing:
Kevin Buzzard (Feb 05 2021 at 12:07):
OK I'm unsorrying Mbar_le
Johan Commelin (Feb 05 2021 at 12:07):
Thanks!
Kevin Buzzard (Feb 05 2021 at 13:12):
I've done neg so all that's left is embedding_cast_le, although I'm wrestling with excessive memory consumption errors :-/
Riccardo Brasca (Feb 05 2021 at 13:17):
I had the same problem of excessive memory consumption. I ran leanproject build and now it seems to be gone
Kevin Buzzard (Feb 05 2021 at 13:17):
Yeah, although that's a bit weird (I did the same).
Riccardo Brasca (Feb 05 2021 at 13:17):
Yes, there is something wrong...
Kevin Buzzard (Feb 05 2021 at 13:19):
We might have to ask the computer scientists, the code is nowhere near big or complicated enough to be causing problems, there might be some loop somewhere.
Kevin Buzzard (Feb 05 2021 at 13:20):
Going back to mathematics, it seems that I'm supposed to be proving that if c_1 ≤ c_2 then the injection Mbar_le r S c_1 → Mbar_le r S c_2 is an embedding (i.e. the topology on the source is induced by the topology on the target under the injection). Is this done or am I expected to prove it?
Kevin Buzzard (Feb 05 2021 at 13:23):
(Mbar_le r' S c is the set of power series F_s = ∑ a_{n,s}T^n ∈ Tℤ[[T]] such that ∑_{n,s} |a_{n,s}|r'^n ≤ c, and S is finite)
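In display form (following the notation of Kevin's message, with the Analytic-style name for the set):

```latex
\overline{\mathcal{M}}_{r'}(S)_{\le c} \;=\;
\Big\{ (F_s)_{s \in S} \;:\;
  F_s = \sum_{n \ge 1} a_{n,s} T^n \in T\mathbb{Z}[[T]],\;
  \sum_{n,s} |a_{n,s}|\,(r')^n \le c \Big\}.
```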
Kevin Buzzard (Feb 05 2021 at 13:25):
OK I think I have a maths proof, I'll just type that in.
Peter Scholze (Feb 05 2021 at 13:48):
Any injective continuous map of compact Hausdorff spaces acquires the subspace topology
Peter Scholze (Feb 05 2021 at 13:48):
not sure whether you already have that in mathlib, but it might be useful
Kevin Buzzard (Feb 05 2021 at 13:49):
oh that's a better approach than mine.
Kevin Buzzard (Feb 05 2021 at 13:50):
@Riccardo Brasca I had a problem with a proof timing out in Mbar_le and I've written a much simpler one, so maybe if you pull that memory error will go away.
Kevin Buzzard (Feb 05 2021 at 13:51):
I'm kind of annoyed I didn't spot that proof, I thought I had some counterexample but now I realise the source wasn't Hausdorff.
Riccardo Brasca (Feb 05 2021 at 13:51):
I don't see any timeout anymore :grinning:
Sebastien Gouezel (Feb 05 2021 at 13:58):
We have docs#closed_embedding_of_continuous_injective_closed. Is this what you need here?
Peter Scholze (Feb 05 2021 at 14:09):
I think not: There's something more specific to compact Hausdorff spaces, where merely being injective on underlying sets already implies it's a closed immersion, with the subspace topology
Sebastien Gouezel (Feb 05 2021 at 14:19):
I think it's exactly this lemma, plus the trivial fact that a continuous map on a compact space is closed, no?
Peter Scholze (Feb 05 2021 at 14:22):
Well, I find the other half more trivial, I believe?
Peter Scholze (Feb 05 2021 at 14:23):
(After all, that half works for any topological space, so I regard it as "abstract nonsense")
Sebastien Gouezel (Feb 05 2021 at 14:24):
The other half is the fact that the image of a closed set is in fact the image of a compact set, hence it is compact, hence it is closed. We have all these facts in mathlib, but probably not glued together exactly in the form you need.
Kevin Buzzard (Feb 05 2021 at 14:26):
import topology.separation
lemma foo (X Y : Type*) [topological_space X] [topological_space Y]
[compact_space X] [t2_space X] [compact_space Y] [t2_space Y]
(f : X → Y) (hf1 : continuous f) (hf2 : function.injective f) :
embedding f := sorry
is what we seem to need.
Peter Scholze (Feb 05 2021 at 14:26):
If I remember right, you can omit t2_space X
Sebastien Gouezel (Feb 05 2021 at 14:27):
Let me do this now.
Kevin Buzzard (Feb 05 2021 at 14:27):
Many thanks -- it is the last sorry in the file I'm working on.
Kevin Buzzard (Feb 05 2021 at 14:29):
lemma embedding_of_injective {X Y : Type*} [topological_space X] [topological_space Y]
[compact_space X] [t2_space X] [compact_space Y] [t2_space Y] {f : X → Y}
(hf1 : continuous f) (hf2 : function.injective f) :
embedding f := sorry
is a tidier version
Sebastien Gouezel (Feb 05 2021 at 14:42):
Sebastien Gouezel (Feb 05 2021 at 14:50):
If you want to put it in your file, this is just
lemma continuous.is_closed_map [compact_space α] [t2_space β] {f : α → β} (h : continuous f) :
is_closed_map f :=
λ s hs, (hs.compact.image h).is_closed
lemma continuous.closed_embedding [compact_space α] [t2_space β] {f : α → β} (h : continuous f)
(hf : function.injective f) : closed_embedding f :=
closed_embedding_of_continuous_injective_closed h hf h.is_closed_map
(with topological space assumptions)
Adam Topaz (Feb 05 2021 at 14:54):
Did we decide what to do about (non)invariance under isomorphisms of systems of complexes?
Here's a possible suggestion, which I think we discussed at some point before (possibly in a different context):
- Redefine normed_group_hom to be norm-nonincreasing.
- Introduce an endofunctor on NormedGroup which on objects just scales the norm by a positive constant C.
Johan Commelin (Feb 05 2021 at 15:08):
I think such an endofunctor will lead to annoying DTT hell, unfortunately.
Johan Commelin (Feb 05 2021 at 15:08):
I would rather work with two separate categories.
Adam Topaz (Feb 05 2021 at 15:09):
Will it be more annoying than not having invariance under isomorphisms?
Johan Commelin (Feb 05 2021 at 15:10):
I think we don't need invariance under isos that much.
Adam Topaz (Feb 05 2021 at 15:11):
Oh ok.
Johan Commelin (Feb 05 2021 at 15:11):
But we could tweak the category structure on system_of_complexes instead, to ask that those maps are all norm-noninc
Johan Commelin (Feb 05 2021 at 15:11):
that seems like the best of both worlds
Johan Commelin (Feb 05 2021 at 17:17):
Should we explicitly make a norm_nonincreasing predicate? That might be more economical and ergonomic than explicitly writing out the condition every time.
Adam Topaz (Feb 05 2021 at 17:19):
Yes, I think making such a predicate would be a good option.
Adam Topaz (Feb 05 2021 at 17:19):
We can rewrite things like admissible using this
Adam Topaz (Feb 05 2021 at 17:19):
Should it be a class?
Adam Topaz (Feb 05 2021 at 17:20):
(or we can use facts again...)
Johan Commelin (Feb 05 2021 at 17:20):
No, I think just a regular predicate.
Adam Topaz (Feb 05 2021 at 17:21):
I'm thinking it might be nice to have it automatically for things like quotient maps
Johan Commelin (Feb 05 2021 at 17:21):
If we start using it a lot, we can upgrade.
Adam Topaz (Feb 05 2021 at 17:21):
Sounds good
Johan Commelin (Feb 05 2021 at 17:22):
And with regular predicates, we can have dot notation, which is also a powerful tool.
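As a sketch (names hypothetical; lean-liquid's actual version lives on its bundled normed_group_hom), such a predicate together with a composition lemma usable via dot notation could look like:

```lean
import analysis.normed_space.basic

variables {V W X : Type*} [normed_group V] [normed_group W] [normed_group X]

-- Hypothetical unbundled version of the predicate under discussion.
def norm_noninc (f : V → W) : Prop := ∀ v, ∥f v∥ ≤ ∥v∥

-- Dot notation in action: `hg.comp hf : norm_noninc (g ∘ f)`.
lemma norm_noninc.comp {g : W → X} {f : V → W}
  (hg : norm_noninc g) (hf : norm_noninc f) : norm_noninc (g ∘ f) :=
λ v, (hg (f v)).trans (hf v)
```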
Johan Commelin (Feb 05 2021 at 17:23):
@Adam Topaz do you want to do this (minor) refactor?
Adam Topaz (Feb 05 2021 at 17:23):
I can. But I won't be able to today. I'll have much more time starting next week
Johan Commelin (Feb 05 2021 at 17:24):
ok, then let's see if it's still waiting for you next week :wink:
Kevin Buzzard (Feb 05 2021 at 17:44):
@Johan Commelin OK Mbar_le is now sorry-free.
Johan Commelin (Feb 05 2021 at 17:44):
Great, thanks!
Johan Commelin (Feb 05 2021 at 19:26):
I did the refactor, and pushed a proof that strong exactness is isom invariant (assuming strictness).
Johan Commelin (Feb 06 2021 at 06:05):
the condition embedding_cast_le is now replaced by continuous_cast_le (and embedding_cast_le is a lemma).
Kevin Buzzard (Feb 06 2021 at 08:55):
Nice!
Last updated: Dec 20 2023 at 11:08 UTC