Zulip Chat Archive
Stream: triage
Topic: PR #2292: refactor(algebra/group): make `multiplicative` ...
Random Issue Bot (Dec 09 2020 at 14:24):
Today I chose PR 2292 for discussion!
refactor(algebra/group): make multiplicative and additive irreducible
Created by @Yury G. Kudryashov (@urkud) on 2020-03-30
Labels: WIP, merge-conflict
Is this PR still relevant? Any recent updates? Anyone making progress?
Kevin Buzzard (Dec 09 2020 at 15:20):
My gut feeling is that this is worth working on, like making with_bot and with_top irreducible, so I'd be interested in hearing Yury's opinion on this one.
Random Issue Bot (Mar 03 2021 at 14:21):
Today I chose PR 2292 for discussion!
refactor(algebra/group): make multiplicative and additive irreducible
Created by @Yury G. Kudryashov (@urkud) on 2020-03-30
Labels: WIP
Is this PR still relevant? Any recent updates? Anyone making progress?
Kevin Buzzard (Mar 03 2021 at 15:30):
My memory is that various things broke when we tried this and I didn't really understand why -- I could fix them but it involved making code uglier. I asked Yury for advice but I think he was very busy at that time.
Yury G. Kudryashov (Mar 04 2021 at 00:20):
I don't know what exactly changes about irreducible in Lean 4.
Yury G. Kudryashov (Mar 04 2021 at 00:21):
It makes no sense to refactor code into something that will need another refactor for the Lean 4 port.
Gabriel Ebner (Mar 04 2021 at 09:32):
I have experimented with making additive/multiplicative structures instead in #6045. It's a bit cumbersome (but will probably get easier with more automation around equivalences).
The one place where I got completely stuck was with monoid algebras.
Kevin Buzzard (Mar 04 2021 at 09:46):
But am I right in thinking that persevering with that experiment is a useful step forwards when it comes to mathport?
Eric Wieser (Mar 04 2021 at 09:51):
Gabriel Ebner said:
The one place where I got completely stuck was with monoid algebras.
I assume that was regarding docs#monoid_algebra.to_additive and docs#add_monoid_algebra.to_multiplicative?
Eric Wieser (Mar 04 2021 at 09:55):
I remember getting completely stuck in #4402 when I added those; Kevin and I argued a lot about whether abusing the transparency of additive/multiplicative mattered, and then Johan pushed a golf that made it look as though the abuse wasn't there at all.
Kevin Buzzard (Mar 04 2021 at 10:03):
so we're now covering up abuse?
Kevin Buzzard (Mar 04 2021 at 10:04):
My feelings about the evilness of abusing type equality might just date back to some traumatic childhood experience or something -- trying to get something like order_dual working in some case when I was still a beginner and just getting super-confused, with leaks meaning that after a while you had no idea whether a < b meant a < b or b < a.
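To make the kind of leak Kevin describes concrete, here is a minimal Lean 3 sketch (not from the thread, and assuming the mathlib of that era, where order_dual is a plain type alias whose < is the reversed order; the import path is from memory):
import order.basic

-- because `order_dual ℕ` is definitionally `ℕ`, a statement `a < b` in the dual
-- order silently means `b < a` in the usual order:
example (a b : ℕ) : @has_lt.lt (order_dual ℕ) _ a b ↔ b < a := iff.rfl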
Kevin Buzzard (Mar 04 2021 at 10:06):
Damiano might have had a similar experience recently. At the time I convinced myself that if you keep your head then of course you can write the code you want to write, but I remember thinking that if definitional equality could not be abused then we would never be in this mess in the first place.
Kevin Buzzard (Mar 04 2021 at 10:07):
And mathematically I've very much come to terms with the fact that the map from nat to int is not "the identity", and I like that idea, to the extent that I'm also happy that the map from (X,<) to (order_dual X,>) is not "the identity".
Eric Wieser (Mar 04 2021 at 10:12):
But are you happy with the idea that the map from (X,<) to (order_dual (order_dual X),<) is not "the identity"?
Kevin Buzzard (Mar 04 2021 at 10:13):
Yes, that is a corollary of my belief system.
Kevin Buzzard (Mar 04 2021 at 10:13):
It's a canonical isomorphism.
Gabriel Ebner (Mar 04 2021 at 10:14):
Eric Wieser said:
Gabriel Ebner said:
The one place where I got completely stuck was with monoid algebras.
I assume that was regarding docs#monoid_algebra.to_additive and docs#add_monoid_algebra.to_multiplicative?
Yes, they're a bit tricky but they can be solved with a bit of code duplication. The whole file is a piece of work since it very much relies on the fact that both additive and multiplicative monoid algebras reduce to the same finsupp:
We cannot define add_monoid_algebra k G := monoid_algebra k (multiplicative G), because polynomials are defined in terms of add_monoid_algebra R ℕ, and then p.support would be a finset (multiplicative ℕ).
I wonder if it would be easier to define add_monoid_algebra first, and then monoid_algebra k A := add_monoid_algebra k (additive A).
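A self-contained sketch of the issue Gabriel describes, using primed hypothetical names in place of the real monoid_algebra / add_monoid_algebra / polynomial definitions (the import and instance arguments are assumptions about the Lean 3 mathlib of the time):
import data.finsupp.basic

-- hypothetical stand-ins mirroring the mathlib definitions
def monoid_algebra' (k G : Type*) [semiring k] := G →₀ k
def add_monoid_algebra' (k G : Type*) [semiring k] := monoid_algebra' k (multiplicative G)
def polynomial' (R : Type*) [semiring R] := add_monoid_algebra' R ℕ

-- with this layering, the support of a "polynomial" is a finset (multiplicative ℕ),
-- not a finset ℕ:
example {R : Type*} [semiring R] (p : polynomial' R) : finset (multiplicative ℕ) :=
finsupp.support p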
Eric Wieser (Mar 04 2021 at 10:25):
I mean, isn't that really just saying "if we define things with multiplicative / additive, then one of the APIs will be awkward to use, and mathematicians will really hate us if we choose the awkward one to be add_monoid_algebra, as that's what polynomial uses"?
Eric Wieser (Mar 04 2021 at 10:25):
Your suggestion just makes the users of monoid_algebra hate us instead, which is at least fewer users.
Johan Commelin (Mar 04 2021 at 10:30):
All the representation theorists...
Julian Külshammer (Mar 04 2021 at 10:30):
But won't it be hidden from the user for polynomial, because that has its own API, whereas things like the group algebra are written exclusively multiplicatively and will hopefully be studied a lot in mathlib's future?
Gabriel Ebner (Mar 04 2021 at 10:32):
The problem is that polynomials don't have their own API. The add_monoid_algebra leaks through it like a sieve.
Scott Morrison (Mar 04 2021 at 10:36):
This would be reasonable to try to fix. :-)
Scott Morrison (Mar 04 2021 at 10:37):
Presumably it's one of those things you don't have to do all at once --- mark polynomial irreducible, fix a few breakages, then remove the irreducible label when you get bored/stuck, and PR your progress.
Kevin Buzzard (Mar 04 2021 at 10:45):
Scott's approach sounds like the right one to me. I agree with Gabriel that there are a ton of leaks here. My experience is that these things really confuse beginners.
Eric Wieser (Mar 04 2021 at 11:02):
Is polynomial really at all different from add_monoid_algebra, other than giving things different names?
Kevin Buzzard (Mar 04 2021 at 11:15):
This is exactly the sort of question which comes up. As a mathematician I would strongly argue for an API which respects what mathematicians expect to see when they come to our library. It is important to stress that there are plenty of mathematicians out there who know exactly what a polynomial and a coefficient of a polynomial is, but have no idea what a monoid is and even less idea what a monoid algebra is. I am well aware that from a CS point of view giving things more than one name is a pain, and indeed I remember the struggle I had in 2018 or so to persuade the developers of the polynomial API to even make a coeff function, because their argument was that it "was already there" -- it was called function evaluation. But this is absolutely not the way to go if we want to attract mathematician users.
Monoids are like LEM, they're taught in CS departments so people with a non-maths background just assume they're taught in maths departments too. This is not the case. Monoids are unimportant because there are no good classification theorems or profound questions about them, so we go straight to groups.
Eric Wieser (Mar 04 2021 at 11:17):
My point was not "why do we even have polynomial", but "how can we make this type of renaming easier".
Kevin Buzzard (Mar 04 2021 at 11:18):
I think Scott has the right answer to this -- make it irreducible and stop the leaks. In the file where the API is made it can be reducible, enabling slick one-line proofs which abuse defeq, and then once we have the API, lock it up.
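A toy illustration of the pattern Kevin describes; the names here are made up and nothing below is mathlib code:
-- a type alias that is transparent while its interface is being built
def wrapped_nat := ℕ

namespace wrapped_nat

-- inside this file, definitional equality with ℕ gives one-line definitions and proofs
def succ (n : wrapped_nat) : wrapped_nat := nat.succ n

lemma succ_def (n : wrapped_nat) : n.succ = nat.succ n := rfl

end wrapped_nat

-- once the API exists, lock the definition so downstream files must go through it
attribute [irreducible] wrapped_nat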
Kevin Buzzard (Mar 04 2021 at 11:23):
It might frustrate people who want to give slick proofs which abuse defeq, but surely it is a general principle of programming that you want your users to access your functions via the correct interface, and I have learnt that for mathematicians this works really well. Once they get the hang of the API they can really use it, the naming convention is brilliant and they don't ever have to worry about monoid algebras or universal properties, unless of course they want to, in which case they can just look and find out what is really going on.
Julian Külshammer (Mar 04 2021 at 11:23):
I agree that monoids are not as widely studied as groups, but that doesn't mean that there are no profound questions about them, see e.g. Benjamin Steinberg, Representation Theory of Finite Monoids, https://www.springer.com/gp/book/9783319439303
Kevin Buzzard (Mar 04 2021 at 11:29):
Oh nice, thanks :-) It's good to have a non-commutative person around to keep me honest :-) But presumably there's nothing like classification of finite simple monoids?
I don't really understand computation much, but my impression is that a disadvantage of making everything irreducible is that it makes stuff I never use, like #eval, stop working. If there are people who want to use Lean as a CAS and multiply polynomials together then making them irreducible might have consequences.
But I also have this vague idea that algorithms for multiplying things together which are good for proving are totally different to the algorithms for multiplying things together which are good for computing (e.g. there are efficient ways to multiply large matrices together which beat the naive method, but the naive method is essential if you want to prove theorems), so I kind of suspect that this CAS thing is somehow a different problem -- one would want to write other functions to do this sort of thing rather than making the provey stuff we already have do the computing.
Damiano Testa (Mar 04 2021 at 11:32):
While I do not feel that I can contribute much to the "Lean development" part of the conversation, I can confirm that
- had there not been a coeff function on polynomials, I am not sure that I would have played with them in the first place;
- I am worried about using order_dual, since I have not been able to prove that for a non-empty finset ℕ, the min' for the usual order on ℕ and the max' for the order_dual coincide;
- I do not think that I know what a monoid is, and whenever I have to choose a typeclass, I prove the lemma assuming integral_domain and then start making my way down until I realize that has_one was enough.
Damiano Testa (Mar 04 2021 at 11:34):
While we are at it, if it were possible to introduce a "mathematician switch" in Lean that would fictitiously introduce a goal of a ≠ 0 whenever you use 1 / a, I might even consider using it myself, feeding into Lean lots of unneeded proofs that some elements are non-zero! :smile:
Anne Baanen (Mar 04 2021 at 11:35):
Kevin Buzzard said:
I don't really understand computation much, but my impression is that a disadvantage of making everything irreducible is that it makes stuff I never use like #eval stop working.
I think you're confusing irreducible and noncomputable in this context:
@[irreducible, derive [has_zero, has_one, has_add, has_repr]]
def not_the_natural_numbers := ℕ
@[irreducible] def secret_number : not_the_natural_numbers := 37
#eval secret_number -- 37
But overall I agree, the definitions we should standardise on are the ones that prove well. Especially since there are many different data types that can represent polynomials, each with their own (computational) benefits.
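For contrast with Anne's example: it is noncomputable, not irreducible, that breaks #eval. A minimal standalone illustration (no mathlib needed):
noncomputable def some_number : ℕ := classical.choice ⟨37⟩

-- #eval some_number  -- error: code generation fails for noncomputable definitions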
Gabriel Ebner (Mar 04 2021 at 19:23):
Back to the original topic: #6045 builds now.
Gabriel Ebner (Mar 04 2021 at 19:25):
I've looked a bit into making polynomials work with the add_monoid_algebra k G := monoid_algebra k (multiplicative G) definition.
This actually works pretty nicely once you define polynomial.support and polynomial.sum, and use polynomial.monomial instead of finsupp.single.
Gabriel Ebner (Mar 04 2021 at 19:27):
Do additive monoid algebras actually have any applications (aside from polynomials)? If nobody cares about them, then we could define polynomial R := monoid_algebra R (multiplicative ℕ) directly and save hundreds of lines.
Gabriel Ebner (Mar 04 2021 at 19:30):
(The add_monoid_algebra API leaks multiplicative all over the place, even now. So it's not a great abstraction.)
Eric Wieser (Mar 04 2021 at 21:03):
700 lines of extra proofs is a little unfortunate
Scott Morrison (Mar 04 2021 at 21:33):
I think in the conventional world whenever someone wants to talk about an add_monoid_algebra they implicitly use multiplicative, although mathematicians would handle this by introducing a dummy symbol (often e or t) and writing e^x.
Scott Morrison (Mar 04 2021 at 21:35):
This is even what we do for polynomials, of course: the symbol X^ is a stand-in for multiplicative.
Scott Morrison (Mar 04 2021 at 21:36):
(Also, my apologies for some of the mess here. I made monoid_algebra because I needed it for group rings, and only did a half-hearted job of surgering polynomials onto it.)
Scott Morrison (Mar 04 2021 at 21:38):
Gabriel's suggestion to add polynomial.support|sum etc. sounds like the right approach. You need to duplicate lots of theorems, but of course not their proofs.
Yury G. Kudryashov (Mar 11 2021 at 03:25):
There are many polynomial-like objects (polynomials and mv_polynomials are in mathlib, quasi-polynomials are not there yet). While we can use monoid_algebra directly, it's convenient to have a place where all theorems are reformulated with + instead of * where it matters.
Yury G. Kudryashov (Mar 11 2021 at 03:27):
Possibly, the "correct" way is to define add_monoid_algebra := finsupp ..., then use an equivalence with monoid_algebra to transfer the properties. This will make proofs about add_monoid_algebra longer (though automation may help here), but the users of add_monoid_algebra will be saved from dealing with multiplicative.
Mario Carneiro (Mar 12 2021 at 17:58):
I've mentioned this privately to Johan, but I'm personally opposed to this change (#6045, that is); I think it's a regression. Even if the original additive and multiplicative are changed to structures, I want a version that is a type alias, in order to facilitate a particular proof method that was the raison d'être of the definitions in the first place.
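For reference, the distinction Mario is drawing, roughly: the alias below is essentially how multiplicative was defined in mathlib at the time, while the structure version is only a schematic of the #6045 approach, not its actual code:
-- the type alias: definitionally equal to α, so proofs and instances can be reused by unfolding
def multiplicative (α : Type*) := α

-- a wrapper structure: a genuinely new type, so nothing leaks, but nothing unfolds either
structure multiplicative' (α : Type*) := (to_add : α)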
Mario Carneiro (Mar 12 2021 at 17:59):
I think it would be more useful to keep additive and multiplicative as aliases and make wrapper structures around them for particular applications such as add_monoid_algebra.
Eric Wieser (Mar 12 2021 at 18:38):
The change makes me feel mildly uneasy too; it seemed useful to be able to say "reuse this entire proof / definition, but pass in (*) = (+)". Perhaps the way that's currently done isn't optimal, but the structure-approach of #6045 results in definitions which don't unfold to something as simple. I'm glad to see someone who knows more about this than me has similar concerns.
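A rough sketch of the kind of reuse Eric means, under the alias-based definitions (the import path, and whether the final definitional-equality check goes through without extra hints, are assumptions on my part):
import algebra.group.type_tags

-- the monoid lemma mul_one, instantiated at multiplicative M, is intended to serve
-- definitionally as a proof of the corresponding additive fact, since * on
-- multiplicative M is + on M and 1 is 0
example {M : Type*} [add_monoid M] (a : M) : a + 0 = a :=
@mul_one (multiplicative M) _ a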
Kevin Buzzard (Mar 12 2021 at 18:57):
There are problems with the "pass in (*) for (+)" approach, e.g. nsmul vs pow, and the example which Ashwin Iyengar pointed out today (we write but never , and but never ). But I thought that the point of this was that it has something to do with Lean 4 where... something related to this was not actually allowed?
Eric Wieser (Mar 12 2021 at 19:33):
Right, the nsmul / pow thing is n •ℕ x vs x ^ n, I assume -- although if the first translated to nsmul x n then the problem would go away, right? Maybe not something to divert this thread with. I too would like to understand how Lean 4 fits in.
Bryan Gin-ge Chen (Mar 13 2021 at 16:05):
In the meantime would it make sense to merge #6657?
Last updated: Dec 20 2023 at 11:08 UTC