Zulip Chat Archive

Stream: maths

Topic: Cayley-Hamilton


Aaron Anderson (Jun 09 2020 at 04:10):

I think the "determinantal" proof of Cayley Hamilton, as presented in this pdf (https://people.reed.edu/~jerry/332/28ch.pdf) would be pretty amenable to Lean, except for the fact that polynomials cannot be evaluated (with the standard tools at least) into non-commutative values. I understand why this could make life easier for multivariate polynomials, but adjoining just one element to a commutative ring, say, gives another commutative ring. Can anyone recommend any fixes?

Scott Morrison (Jun 09 2020 at 04:15):

It doesn't seem that polynomial.eval₂ assumes the target ring is commutative.

Aaron Anderson (Jun 09 2020 at 04:17):

Thanks! Unfortunately polynomial.eval₂_ring_hom does assume the target ring is commutative, which might still set me back a little.

Scott Morrison (Jun 09 2020 at 04:18):

You do need to know that your map f : R → S lands in the centre of S.

Scott Morrison (Jun 09 2020 at 04:19):

i.e. you could eval into any R-algebra, more generally than along a ring_hom to a commutative ring.
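
(For concreteness, a minimal sketch of what such an evaluation could look like, in Lean 3 syntax; aeval' is a hypothetical name and this is not actual mathlib code, just eval₂ written out along algebra_map R A:)

import data.polynomial ring_theory.algebra

-- evaluate p at x in an arbitrary (possibly noncommutative) R-algebra A
def aeval' {R A : Type*} [comm_semiring R] [semiring A] [algebra R A]
  (x : A) (p : polynomial R) : A :=
p.sum (λ n a, algebra_map R A a * x ^ n)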

Scott Morrison (Jun 09 2020 at 04:19):

This should be fixed!

Scott Morrison (Jun 09 2020 at 04:22):

It would be a pretty major overhaul of polynomial.lean, however.

Scott Morrison (Jun 09 2020 at 04:22):

I know @Yury G. Kudryashov has been planning various refactors there, but I'm not sure where he's up to.

Aaron Anderson (Jun 09 2020 at 04:26):

Well, now that I'm deconfused about polynomial.eval₂, I should be able to prove that the image of my polynomial evaluation is a commutative subring of matrices, which should be enough to use determinants.

Yury G. Kudryashov (Jun 09 2020 at 05:34):

Sorry, I'll return to the polynomials refactor tomorrow.

Yury G. Kudryashov (Jun 09 2020 at 06:10):

After my refactor aeval will work for matrices.

Yury G. Kudryashov (Jun 09 2020 at 06:10):

(or any other algebra)

Yury G. Kudryashov (Jun 09 2020 at 06:10):

Currently it assumes that the target ring is commutative.

Yury G. Kudryashov (Jun 09 2020 at 07:40):

[1/n]: #3003

Johan Commelin (Jun 09 2020 at 07:52):

@Yury G. Kudryashov Thank you so much for doing this!

Aaron Anderson (Jun 12 2020 at 05:23):

I have written what I'm pretty sure is most of a proof of Cayley-Hamilton.

Aaron Anderson (Jun 12 2020 at 05:23):

It's in this repository, with another project. https://github.com/jalex-stark/friendship-theorem/tree/master/src

Aaron Anderson (Jun 12 2020 at 05:25):

The only sorrys left have to do with polynomial.eval2, and should be fixed by either hard work, better understanding of how to infer instances, or waiting for the refactor to make polynomial evaluation less commutativity-dependent.

Yury G. Kudryashov (Jun 12 2020 at 05:37):

I'm sorry for slowing you down. I have several refactors in the queue. I'll try to switch back to polynomials in about an hour. UPD: unless I go to bed earlier.

Scott Morrison (Jun 12 2020 at 05:46):

What is going on with

def choose_index : n := arbitrary n

def vec_to_one_row (vec : n → R) : matrix n n R :=
λ (i : n), ite (i = choose_index) vec 0

This stuff seems weird.

Aaron Anderson (Jun 12 2020 at 05:50):

I wanted to apply a matrix of matrices to a vector of vectors, but matrix.mul_vec and such seem to require that the entries of the matrix and vector belong to the same ring

Aaron Anderson (Jun 12 2020 at 05:51):

So I came up with a way to replace the vector of vectors with a matrix of matrices

Aaron Anderson (Jun 12 2020 at 05:53):

Kind of weird, I agree, but easier in the short-term than developing the “right” generalization of matrix multiplication to ring actions or something

Scott Morrison (Jun 12 2020 at 06:28):

I think it's going to require a bit of restructuring before others could use this proof.

Scott Morrison (Jun 12 2020 at 06:36):

@Yury G. Kudryashov, I just experimented a bit with generalising parts of polynomial.lean to semiring, rather than comm_semiring.

Scott Morrison (Jun 12 2020 at 06:36):

How much would it get in the way of your refactoring plans if I PR'd that?

Scott Morrison (Jun 12 2020 at 06:36):

I can try again later if it's going to be annoying.

Yury G. Kudryashov (Jun 12 2020 at 06:37):

What parts?

Scott Morrison (Jun 12 2020 at 06:38):

https://github.com/leanprover-community/mathlib/pull/3043

Scott Morrison (Jun 12 2020 at 06:38):

So everything before eval2 first appears, except that everything about C still assumes a comm_semiring.

Mario Carneiro (Jun 12 2020 at 06:39):

Hm, that's an interesting point @Aaron Anderson . Matrices are supposed to be the finite based version of linear maps between vector spaces, so there should be a way to turn a matrix into a linear map

Yury G. Kudryashov (Jun 12 2020 at 06:40):

My next refactor will deal with eval₂, eval, and aeval.

Scott Morrison (Jun 12 2020 at 06:40):

So what I was just doing is perhaps orthogonal?

Yury G. Kudryashov (Jun 12 2020 at 06:40):

Yes.

Scott Morrison (Jun 12 2020 at 06:40):

(It wasn't intended as a real PR; just the quickest way to show you now.)

Scott Morrison (Jun 12 2020 at 06:41):

Ok. I will not do anything that touches eval*!

Mario Carneiro (Jun 12 2020 at 06:41):

Given a basis on V and a vector in V, you can produce a list of coefficients in R, which you can use mul_vec on to get another list of coefficients in R. Given a basis on W you can evaluate the list to get a vector in W

Mario Carneiro (Jun 12 2020 at 06:42):

So it seems like nothing new is needed in matrices, mul_vec is sufficient. It just needs to be coupled with constructions for converting from R^n to and from a vector space
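
(Spelled out: if $$(e_i)$$ is a basis of $$V$$ and $$(f_j)$$ a basis of $$W$$, then a matrix $$M$$ sends $$v = \sum_i c_i e_i$$ to

$$\sum_j \Big(\sum_i M_{j i}\, c_i\Big) f_j,$$

i.e. take coordinates, apply mul_vec, and reassemble in the target basis.)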

Aaron Anderson (Jun 12 2020 at 06:58):

I’ll try to clean it up soon. Ultimately these vectors of vectors and matrices of matrices really come from tensor products, so maybe the best way to express this proof is by fleshing out the theory of tensors-as-matrices a bit

Scott Morrison (Jun 12 2020 at 08:04):

Another fun thing to prove might be

matrix n n (polynomial R) ≃+* polynomial (matrix n n R)

(this requires #3043 before it even makes sense)

Scott Morrison (Jun 12 2020 at 08:05):

if you have that, there is what looks like a lovely proof of Cayley-Hamilton at http://drorbn.net/AcademicPensieve/2015-12/CayleyHamilton.pdf

Kenny Lau (Jun 12 2020 at 08:19):

there must be a way to generalize this statement

Johan Commelin (Jun 12 2020 at 08:23):

It's an R-alg_hom

Kenny Lau (Jun 12 2020 at 08:25):

this is saying that two functors commute

Kenny Lau (Jun 12 2020 at 08:25):

looks like a homomorphism condition to me

Kevin Buzzard (Jun 12 2020 at 08:33):

They only commute up to natural isomorphism

Scott Morrison (Jun 12 2020 at 09:26):

what is the universal property for matrix n n? Can we prove this (natural) isomorphism by comparing the universal properties of the LHS and RHS?

Jalex Stark (Jun 12 2020 at 09:29):

universal property in the category of R-algebras? I want to say something like "it admits an embedding of every n-dimensional R-algebra" but that's not quite it

Scott Morrison (Jun 12 2020 at 09:34):

like polynomial R is the initial object in the category of "rings S equipped with a map from R and a chosen element"
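
(Concretely: for every commutative ring S with a ring hom $$f : R \to S$$ and a chosen $$s \in S$$ (or, more generally, any ring S in which $$s$$ commutes with the image of $$f$$), there is a unique ring hom $$\varphi : R[X] \to S$$ with $$\varphi \circ C = f$$ and $$\varphi(X) = s$$, namely $$\varphi = \operatorname{eval}_2 f\ s$$.)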

Scott Morrison (Jun 12 2020 at 09:34):

what is matrix n n R?

Kevin Buzzard (Jun 12 2020 at 09:34):

End_R(R^n)

Kevin Buzzard (Jun 12 2020 at 09:34):

up to natural isomorphism

Scott Morrison (Jun 12 2020 at 09:35):

but just saying that doesn't give you matrix n n (polynomial R) ≃+* polynomial (matrix n n R) for free

Scott Morrison (Jun 12 2020 at 09:39):

https://mathoverflow.net/a/142445

Scott Morrison (Jun 12 2020 at 09:41):

doesn't actually seem very useful

Johan Commelin (Jun 12 2020 at 09:42):

Yeah... it's a universal coproperty :rofl:

Jalex Stark (Jun 12 2020 at 09:45):

does anyone have a math proof of this isomorphism? it's not clear to me
edit: ah, the isomorphism concretely is "collect like powers of the indeterminate"

Patrick Massot (Jun 12 2020 at 09:45):

It's a bit sad to struggle with Cayley-Hamilton for matrices when this theorem has not much to do with matrices, and we'll need to do multilinear algebra properly at some point. But I guess it's still progress to work towards having it anyway (and I certainly don't have time for multilinear algebra).

Scott Morrison (Jun 12 2020 at 09:58):

The proof, at least as a ≃+, is just a bunch of currying, swapping, uncurrying, and using src#finsupp.equiv_fun_on_fintype
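
(Roughly, writing polynomials as finitely supported functions, the underlying additive equivalence is

$$M_n(R[X]) \;\cong\; (n \times n) \to (\mathbb{N} \to_0 R) \;\cong\; \mathbb{N} \to_0 \big((n \times n) \to R\big) \;\cong\; (M_n(R))[X],$$

where $$\to_0$$ means finitely supported; the middle swap is where finsupp.equiv_fun_on_fintype comes in, converting between $$\to_0$$ and $$\to$$ over the finite index type $$n \times n$$.)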

David Wärn (Jun 12 2020 at 10:23):

polynomial and matrix n n are both 'semigroup algebras', over the monoid of natural numbers and the semigroup fin n × fin n, viewed as indexing the standard basis of matrix n n, respectively

David Wärn (Jun 12 2020 at 10:29):

I guess the latter is not quite a semigroup actually...

Kevin Buzzard (Jun 12 2020 at 10:29):

$$\begin{pmatrix} a_0+a_1X+a_2X^2+\cdots & b_0+b_1X+b_2X^2+\cdots \\ c_0+c_1X+c_2X^2+\cdots & d_0+d_1X+d_2X^2+\cdots \end{pmatrix}$$

"equals"

$$\begin{pmatrix} a_0 & b_0 \\ c_0 & d_0 \end{pmatrix} + \begin{pmatrix} a_1 & b_1 \\ c_1 & d_1 \end{pmatrix} X + \begin{pmatrix} a_2 & b_2 \\ c_2 & d_2 \end{pmatrix} X^2 + \cdots$$

Kevin Buzzard (Jun 12 2020 at 10:33):

A mathematican can say they're equal because they're equal in some fluid over-set

Kevin Buzzard (Jun 12 2020 at 10:34):

Might well work for multivariate polynomials too. Might have to be careful with infinite matrices

David Wärn (Jun 12 2020 at 10:37):

Maybe the correct thing to say is that polynomial A for an R-algebra A is $$A \otimes \mathbb{Z}[X]$$, and matrix n n A is $$A \otimes M_n(\mathbb{Z})$$? Then it's associativity and commutativity of tensor product?
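
(In that language the isomorphism would be the chain

$$M_n(A[X]) \;\cong\; M_n(\mathbb{Z}) \otimes (A \otimes \mathbb{Z}[X]) \;\cong\; (A \otimes M_n(\mathbb{Z})) \otimes \mathbb{Z}[X] \;\cong\; (M_n(A))[X],$$

with all tensor products over $$\mathbb{Z}$$, using $$M_n(B) \cong M_n(\mathbb{Z}) \otimes B$$ and $$B[X] \cong B \otimes \mathbb{Z}[X]$$ at either end.)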

Kevin Buzzard (Jun 12 2020 at 10:37):

$$R = \mathbb{Z}$$

Kevin Buzzard (Jun 12 2020 at 10:38):

and this proof generalises to mv_polynomial

David Wärn (Jun 12 2020 at 10:39):

Are you saying we may assume R is $$\mathbb{Z}$$ or that this only makes sense for $$\mathbb{Z}$$?

Kevin Buzzard (Jun 12 2020 at 10:40):

I was just remarking that you said R-algebra but R is mentioned precisely once in your comment; you either tensor with R[X] or change R to Z and you're fine

David Wärn (Jun 12 2020 at 10:46):

Right, I think I meant "is" as in "isomorphic as R-algebras", but that also works

Scott Morrison (Jun 12 2020 at 11:15):

We don't have tensor products of R-algebras yet?

Johan Commelin (Jun 12 2020 at 11:23):

nope, we don't afaik

Scott Morrison (Jun 12 2020 at 14:49):

#3050

Scott Morrison (Jun 27 2020 at 07:30):

I have a very sorryful proof of Cayley-Hamilton, following the argument at http://drorbn.net/AcademicPensieve/2015-12/CayleyHamilton.pdf.

Scott Morrison (Jun 27 2020 at 07:30):

It is still a mess and needs some refactoring.

Scott Morrison (Jun 27 2020 at 07:31):

It also needs some sorrys filled in. I am confident all the remaining ones are "easy" (no design or API changes needed, just bashing around with polynomials and/or matrices).

Scott Morrison (Jun 27 2020 at 07:31):

If anyone wants to jump in and hack on it, go for it: git checkout origin/cayley_hamilton

Scott Morrison (Jun 27 2020 at 07:31):

I will start cleaning up the structure later

Patrick Massot (Jun 27 2020 at 07:38):

Scott Morrison said:

If anyone wants to jump in and hack on it, go for it: git checkout origin/cayley_hamilton

Scott means leanproject get mathlib:cayley_hamilton so that you get oleans too.

Johan Commelin (Jun 27 2020 at 07:48):

Unfortunately Azure isn't done yet

Johan Commelin (Jun 27 2020 at 08:03):

@Scott Morrison I'm seeing def baz, shouldn't that be def bar or def dror? :stuck_out_tongue_wink: [/jk]

Johan Commelin (Jun 27 2020 at 08:05):

Also, I would prefer M or A for matrices, over m.

Scott Morrison (Jun 27 2020 at 08:09):

All the names are up for grabs. :-)

Scott Morrison (Jun 27 2020 at 08:12):

Mostly the proof is/will be only the usual de Bruijn factor relative to the informal proof, but the statement "polynomials of matrices are matrices of polynomials" takes a ridiculously long time to set up properly (because we need both a good characterisation of the function, and that it's an algebra equivalence). Having finished with the route I took (discussed earlier in this thread), I'm not sure it was the easiest one. :-)

Johan Commelin (Jun 27 2020 at 08:22):

But the resulting proof is quite nice. And the statement baz is very useful anyways. So we will profit in the future, I'm quite sure.

Patrick Massot (Jun 27 2020 at 08:23):

Johan Commelin said:

Unfortunately Azure isn't done yet

It's done now (unless someone pushed a new commit in the mean time).

Johan Commelin (Jun 27 2020 at 08:30):

I just did (-;

Johan Commelin (Jun 27 2020 at 08:30):

I pushed a proof of q

Johan Commelin (Jun 27 2020 at 08:32):

@Scott Morrison how did I let you get away with defining scalar in mathlib, but not scalar_apply?

Johan Commelin (Jun 27 2020 at 08:46):

@Jalex Stark I pushed another bit (just a heads up)

Jalex Stark (Jun 27 2020 at 08:50):

thanks; i merged and pushed another bit

Johan Commelin (Jun 27 2020 at 08:51):

@Jalex Stark Did you know about git pull --rebase?
Not very important, but it avoids the merges (unless there are conflicts) by replaying your work on top of whatever you pull.

Jalex Stark (Jun 27 2020 at 08:52):

(in this case there were conflicts; two lemmas that we both proved, yours seemed better so I accepted them)

Jalex Stark (Jun 27 2020 at 08:52):

i'm not sure i understand what you mean by avoiding merging. if there aren't conflicts, git pull will handle the merge by itself?

Patrick Massot (Jun 27 2020 at 08:53):

This is orthogonal.

Patrick Massot (Jun 27 2020 at 08:53):

git pull --rebase will also handle conflicts, with your help if needed.

Johan Commelin (Jun 27 2020 at 08:53):

Yes, I didn't phrase it correctly

Johan Commelin (Jun 27 2020 at 08:54):

It will replay automatically (unless there are conflicts) is what I should have written.

Johan Commelin (Jun 27 2020 at 08:55):

git pull will merge, but you can "rewrite history" by taking your local changes, storing them in the file cabinet (so to speak), doing a git pull and then taking your local changes and replaying them on top of whatever was pulled.

Johan Commelin (Jun 27 2020 at 08:55):

As I said, it's not very important.

Johan Commelin (Jun 27 2020 at 08:55):

Because once it is merged into mathlib the entire git history is squashed into 1 commit anyways

Jalex Stark (Jun 27 2020 at 08:55):

ah i see, git pull --rebase will pretend that I pulled before making my changes

Jalex Stark (Jun 27 2020 at 08:56):

it kind of performs a reidemeister move on the commit tree

Johan Commelin (Jun 27 2020 at 09:07):

Once your git history is a non-trivial knot... didn't you do something wrong?? :grinning_face_with_smiling_eyes:

Jalex Stark (Jun 27 2020 at 09:12):

working with double sums feels too hard

Johan Commelin (Jun 27 2020 at 09:13):

I pushed a proof of sum_over_range'

Scott Morrison (Jun 27 2020 at 09:21):

thanks for all the work while I was eating dinner :-)

Johan Commelin (Jun 27 2020 at 09:47):

I pushed some more stuff

Scott Morrison (Jun 27 2020 at 09:59):

Turns out coeff_mul_zero has a rather shorter proof. :-)

Jalex Stark (Jun 27 2020 at 10:06):

simp [coeff_mul]?

Johan Commelin (Jun 27 2020 at 10:30):

After a digression involving simp and lunch, I'm back (-;

Jalex Stark (Jun 27 2020 at 10:30):

do we already have an induction principle like this?

import tactic

noncomputable theory
open_locale big_operators
open_locale classical


lemma sum_add_induction {α M : Type} [add_comm_monoid M] {p : M → Prop}
  (f : α → M) (s : finset α) (p_add : ∀ a b, p a → p b → p (a + b)) (p_zero : p 0) (p_s : ∀ x ∈ s, p $ f x) :
p $ ∑ x in s, f x :=
begin
  induction s using finset.induction with x hx s hs, simpa,
  rw finset.sum_insert, swap, assumption,
  apply p_add, apply p_s, simp,
  apply hs, intros a ha, apply p_s, simp [ha],
end

Johan Commelin (Jun 27 2020 at 10:34):

I don't recognise it

Johan Commelin (Jun 27 2020 at 10:42):

Should we have a definition for matrices like
(λ i' j', if i' = i ∧ j' = j then x else 0)?

Jalex Stark (Jun 27 2020 at 10:45):

in maths this is sometimes called an "elementary matrix"?

Jalex Stark (Jun 27 2020 at 10:46):

i think we could use some simp lemmas for these in the current work

Scott Morrison (Jun 27 2020 at 10:47):

Yes.

Johan Commelin (Jun 27 2020 at 10:55):

Wiki says that elementary matrices are those that differ from the identity matrix by a single row/col operation

Jalex Stark (Jun 27 2020 at 10:58):

in quantum computing we just call them \ketbra i j

Jalex Stark (Jun 27 2020 at 10:59):

is that notion from wikipedia a useful one?

Johan Commelin (Jun 27 2020 at 11:00):

Maybe we should just have \delta

Johan Commelin (Jun 27 2020 at 11:01):

I think it's fine to just claim the name elementary_matrix for this concept. we can cook up a new name for the wiki thing, if we need it.

Jalex Stark (Jun 27 2020 at 11:04):

should this go in data.matrix.basic?

Johan Commelin (Jun 27 2020 at 11:04):

I think so

Johan Commelin (Jun 27 2020 at 11:05):

Of course it needs 25 compatibility lemmas with 1 and 0, and diagonal and scalar, etc...

Scott Morrison (Jun 27 2020 at 11:05):

Working with coefficients and degrees of polynomials is so painful!

Johan Commelin (Jun 27 2020 at 11:05):

And trace elementary and det (elementary i j) etc

Johan Commelin (Jun 27 2020 at 11:09):

@Scott Morrison On which sorrys are you currently working?

Scott Morrison (Jun 27 2020 at 11:09):

I pushed a few, but I think I'm going to stop for a while

Scott Morrison (Jun 27 2020 at 11:09):

So they're all yours :-)

Scott Morrison (Jun 27 2020 at 11:09):

we're pretty close

Jalex Stark (Jun 27 2020 at 11:13):

I have this so far, I think I should put it in a separate PR?

variables {α β : Type*} [semiring α] [semiring β]
variables {m n : Type u} [fintype m] [fintype n] [decidable_eq m] [decidable_eq n]

@[reducible]
def elementary_matrix (i : n) (j : m) : matrix n m α :=
(λ i' j', if i' = i ∧ j' = j then 1 else 0)

lemma matrix_eq_sum_elementary (x : matrix n m α) :
x = ∑ (i : n) (j : m), (x i j) • elementary_matrix i j :=
begin
  ext,
    rw finset.sum_apply,
    rw finset.sum_apply,
    rw ← finset.sum_subset, swap 4, exact {i},
    { norm_num, },
    { simp },
    intros, norm_num at a, norm_num,
    convert finset.sum_const_zero,
    ext, norm_num, rw if_neg, tauto
end

Scott Morrison (Jun 27 2020 at 11:18):

I'd advise against using @[reducible] unless you have a good reason.

Scott Morrison (Jun 27 2020 at 11:18):

appropriate @[simp] lemmas are nearly always better
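
For instance (a sketch of the kind of @[simp] lemmas meant here, reusing the variables and definition from the block above; the names are illustrative, not existing mathlib declarations):

@[simp] lemma elementary_matrix_apply_same (i : n) (j : m) :
  elementary_matrix i j i j = (1 : α) :=
by simp [elementary_matrix]

@[simp] lemma elementary_matrix_apply_of_ne (i i' : n) (j j' : m)
  (h : ¬ (i' = i ∧ j' = j)) :
  elementary_matrix i j i' j' = (0 : α) :=
by simp [elementary_matrix, h]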

Scott Morrison (Jun 27 2020 at 11:19):

My suggestion would be to put it into the cayley_hamilton branch, and check that it is usable there

Scott Morrison (Jun 27 2020 at 11:19):

That branch is going to turn into many PRs, not just one.

Scott Morrison (Jun 27 2020 at 11:19):

so we can slice this off again once we see it's working

Johan Commelin (Jun 27 2020 at 12:27):

@Scott Morrison I think that algebras like algebra R (polynomial A) are currently not really "supported" by mathlib...

Scott Morrison (Jun 27 2020 at 12:28):

What do you mean?

Johan Commelin (Jun 27 2020 at 12:28):

Well, it's part of a diagram...

Johan Commelin (Jun 27 2020 at 12:29):

R → A → polynomial A, and this thing is (provably) the composition. (Maybe even definitionally.)

Johan Commelin (Jun 27 2020 at 12:29):

But we'll need a bunch of lemmas for that.

Jalex Stark (Jun 27 2020 at 12:29):

but those lemmas are the lemmas about tensor products, no?

Johan Commelin (Jun 27 2020 at 12:29):

Of course we can add the lemmas in the special cases. But I'm still trying to figure out if we can get a general machine to make this as easy as it should be.

Scott Morrison (Jun 27 2020 at 12:31):

I guess I'm not sure still what you're concerned about

Scott Morrison (Jun 27 2020 at 12:31):

I mean, it's just some R-algebra, that we happen to have built out of a different R-algebra.

Johan Commelin (Jun 27 2020 at 12:32):

Ooh, wait.. this is for noncommutative A, so we don't even have algebra A (polynomial A).
Let me shut up.

Johan Commelin (Jun 27 2020 at 12:33):

In fact, this works:

lemma turkle_map_apply (r : R) :
  algebra_map R (polynomial A) r = C (algebra_map R A r) :=
rfl

Scott Morrison (Jun 27 2020 at 12:34):

(I really should rename all my metasyntactic variables before making branches public...)

Scott Morrison (Jun 27 2020 at 12:41):

I've no idea where I picked up turkle from. It seems to be entirely unattested amongst lists of metasyntactic variables I can find. Nevertheless my chickens are called foobar, quux, and turkle.

Scott Morrison (Jun 27 2020 at 12:42):

(On even-numbered days they are all called bruce.)

Johan Commelin (Jun 27 2020 at 12:51):

I have fixed another sorry

Johan Commelin (Jun 27 2020 at 12:55):

Scott Morrison said:

(I really should rename all my metasyntactic variables before making branches public...)

What we really need is a linter that automatically flags if a name is wrong, and suggests to replace it with a proper one...
I fear that we need AGI before that happens.

Rob Lewis (Jun 27 2020 at 12:57):

What we really need is a linter that automatically flags if a name is wrong, and suggests to replace it with a proper one...

https://arxiv.org/abs/2004.07761

Scott Morrison (Jun 27 2020 at 13:00):

I wonder if we should change the definition of to_fun in polynomial_algebra.lean, to use eval2.

Scott Morrison (Jun 27 2020 at 13:01):

At the time I first wrote it, eval2 didn't work in this generality. I think it should now, and perhaps there's better API support?

Scott Morrison (Jun 27 2020 at 13:01):

Not sure.

Johan Commelin (Jun 27 2020 at 13:04):

@Scott Morrison Not sure if it matters...

Johan Commelin (Jun 27 2020 at 13:04):

We probably need equation lemmas either way

Johan Commelin (Jun 27 2020 at 13:04):

I've pushed another couple sorry-fixes

Kevin Buzzard (Jun 28 2020 at 01:16):

Hey it's really important that the constructor for the elementary matrix takes the value of the nonzero matrix entry as an input. That way you can reduce a lot of problems about R-linearity to questions about the underlying abelian group. This was an important trick I learnt when working with @Shing Tak Lam on derivations

Scott Morrison (Jun 28 2020 at 04:47):

(This is the same argument that we should talk much more about monomial n a in polynomial.lean, and less about C a * X^n.)
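
A sketch of what that value-taking constructor could look like, in the variable context of the earlier elementary_matrix block (illustrative only; this is roughly the shape of what gets renamed std_basis_matrix later in the thread):

def elementary_matrix' (i : n) (j : m) (a : α) : matrix n m α :=
λ i' j', if i' = i ∧ j' = j then a else 0

-- the point: for fixed i and j, the map a ↦ elementary_matrix' i j a is
-- additive and sends 0 to 0, so it can be bundled as an add_monoid_hom,
-- and R-linearity questions about matrices reduce to questions about the
-- underlying abelian group, one entry at a time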

Scott Morrison (Jun 28 2020 at 04:48):

@Jalex Stark, I'll leave it to you to make this change?

Scott Morrison (Jun 28 2020 at 05:20):

@Jalex Stark, one of your commits has the line rw ← sum_product',, but there's no declaration with that name that I can find.

Scott Morrison (Jun 28 2020 at 05:20):

Did you forget to add a file?

Jalex Stark (Jun 28 2020 at 05:22):

oops sorry, I just got online and have a few things to address but should get to it soon. That's supposed to be a new decl in big_operators. (which i probably need to rewrite because i did a lot of stashing and switching between branches in one local copy of mathlib yesterday)

Jalex Stark (Jun 28 2020 at 07:38):

oh, it is in fact there in the current branch. the decl is made by to_additive on prod_product' in big_operators (okay, I changed sum_product' to finset.sum_product' and it works now)

Jalex Stark (Jun 28 2020 at 07:39):

i'm working on changing elementary_matrix now

Scott Morrison (Jun 28 2020 at 07:45):

I think things are going to be a lot easier if I refactor eval2 so it takes a bundled ring_hom rather than needing a is_semiring_hom typeclass
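
Roughly the signature change in question (a sketch with a placeholder name, not the actual mathlib source):

import data.polynomial

-- instead of an unbundled f : R → S together with is_semiring_hom f
-- assumptions, take a bundled ring homomorphism
def eval₂' {R S : Type*} [semiring R] [semiring S]
  (f : R →+* S) (x : S) (p : polynomial R) : S :=
p.sum (λ n a, f a * x ^ n)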

Scott Morrison (Jun 28 2020 at 07:45):

however I'm not super confident that will go well... not sure how many other places will need changes

Scott Morrison (Jun 28 2020 at 07:46):

I may experiment with it on a different branch for a bit

Jalex Stark (Jun 28 2020 at 07:54):

I think at some point I wrote an instance is_semiring_hom f for f being the to_fun of a bundled algebra_hom, and this felt dirty

Scott Morrison (Jun 28 2020 at 08:31):

yes, it's past time we killed is_semiring_hom.

Scott Morrison (Jun 28 2020 at 10:07):

@Jalex Stark, your work on elementary_matrix has broken the proof of matrix_polynomial_equiv_polynomial_matrix_coeff_apply.

Scott Morrison (Jun 28 2020 at 10:07):

Do you know what's going on, or should I start on that?

Scott Morrison (Jun 28 2020 at 10:07):

(Otherwise, we're down to a single sorry on cayley_hamilton, which looks like easy algebra.)

Jalex Stark (Jun 28 2020 at 10:09):

yeah, I don't know why i pushed before i reached an error-free state, I am still working locally to propagate elementary_matrix throughout

Jalex Stark (Jun 28 2020 at 10:09):

it does seem to be simplifying stuff

Scott Morrison (Jun 28 2020 at 10:12):

okay! I will leave that bit of the file then, and try and do my algebra homework

Jalex Stark (Jun 28 2020 at 10:13):

I have something like

M x.fst x.snd * (algebra_map R A) (elementary_matrix x.fst x.snd 1 i j))

but want something like

M x.fst x.snd • (elementary_matrix x.fst x.snd 1) $ i j

I think I am missing something simple

Johan Commelin (Jun 28 2020 at 10:15):

algebra.smul_def?

Jalex Stark (Jun 28 2020 at 10:16):

thanks!

Scott Morrison (Jun 28 2020 at 10:20):

Ok, cayley_hamilton now builds except for this thing with elementary_matrix!

Scott Morrison (Jun 28 2020 at 10:20):

there is a _lot_ of cleanup still to do

Jalex Stark (Jun 28 2020 at 10:23):

for the elementary_matrix stuff i just have two more proofs to fix in polynomial_algebra

Scott Morrison (Jun 28 2020 at 10:25):

awesome! and I found a quick work-around so

Scott Morrison (Jun 28 2020 at 10:26):

We have cayley_hamilton! (No sorries.)

Jalex Stark (Jun 28 2020 at 10:27):

maybe your work-around is best for what's remaining, the "right" way would be to show that elementary_matrix i j is an add_monoid_hom

Scott Morrison (Jun 28 2020 at 10:34):

oh ...

Scott Morrison (Jun 28 2020 at 10:34):

except, @Jalex Stark, I didn't notice you adding the [inhabited n] instances, which I don't think we want

Jalex Stark (Jun 28 2020 at 10:35):

hm.

Scott Morrison (Jun 28 2020 at 10:35):

and which changes the statement of Cayley-Hamilton subtly...

Scott Morrison (Jun 28 2020 at 10:35):

we have to prove it for empty matrices, too :-)

Scott Morrison (Jun 28 2020 at 10:35):

actually... err... is it even true?

Jalex Stark (Jun 28 2020 at 10:35):

matrix unit unit is unit. So is the empty matrix 0 or 1?

Scott Morrison (Jun 28 2020 at 10:36):

what is the characteristic polynomial of the empty matrix? surely 1?

Scott Morrison (Jun 28 2020 at 10:39):

if you want it to be the product of the eigenvalues, it is 1

Jalex Stark (Jun 28 2020 at 10:39):

the characteristic matrix is the empty matrix

Scott Morrison (Jun 28 2020 at 10:39):

but the "usual formula" gives zero

Scott Morrison (Jun 28 2020 at 10:39):

huh :-)

Jalex Stark (Jun 28 2020 at 10:39):

right, and this is fine because the empty matrix brings the zero ring into the picture

Jalex Stark (Jun 28 2020 at 10:40):

so we're complaining that we can't factor the identity map on our ring through the zero ring

Scott Morrison (Jun 28 2020 at 10:42):

phew:

#eval matrix.det (λ (x y : punit), (0 : ℤ))

says 0 :-)

Scott Morrison (Jun 28 2020 at 10:42):

Lean has spoken.

Scott Morrison (Jun 28 2020 at 10:42):

So indeed, we need to get rid of that inhabited hypothesis.

Scott Morrison (Jun 28 2020 at 10:43):

sadly matrix.induction_on needs a h_zero argument, just to cover this case

Scott Morrison (Jun 28 2020 at 10:44):

maybe I'll split it into two versions, one with an inhabited argument using your trick

Scott Morrison (Jun 28 2020 at 10:44):

but we won't use that one here

Scott Morrison (Jun 28 2020 at 10:48):

Okay, cayley_hamilton works again, this time including the trivial case

Jalex Stark (Jun 28 2020 at 11:00):

maybe I'm just too tired to be doing this right now, but I want to rewrite

matrix_polynomial_equiv_polynomial_matrix (elementary_matrix i j p + elementary_matrix i j q)
= matrix_polynomial_equiv_polynomial_matrix (elementary_matrix i j p) + matrix_polynomial_equiv_polynomial_matrix (elementary_matrix i j q)

I tried rw ring_hom.map_add and that didn't work (and then lean crashed and I'm waiting to rebuild mathlib, since i don't have oleans yet)

Scott Morrison (Jun 28 2020 at 11:01):

try alg_hom.map_add

Jalex Stark (Jun 28 2020 at 11:02):

:thumbs_up: I will try that in a few minutes

Mario Carneiro (Jun 28 2020 at 11:03):

Jalex Stark said:

matrix unit unit is unit. So is the empty matrix 0 or 1?

Isn't it both?

Jalex Stark (Jun 28 2020 at 11:06):

yes, the question was what should happen to it when we bring it into polynomial land

Jalex Stark (Jun 28 2020 at 11:07):

so by "is" I didn't mean =, I meant equal after applying some map out of matrix unit unit

Jalex Stark (Jun 28 2020 at 11:08):

if the map wanted to be a ring hom then it would have to land at 0, but thankfully that's not the case and scott figured out that it wants to land at 1
(edit: I guess I misunderstood something along the way)

Scott Morrison (Jun 28 2020 at 11:08):

@Mario Carneiro the question was really what the determinant of the empty matrix is

Scott Morrison (Jun 28 2020 at 11:09):

and it is indeed meant to be zero, which is good for Cayley-Hamilton, and bad for "determinant is the product of the eigenvalues"

Jalex Stark (Jun 28 2020 at 11:09):

Scott Morrison said:

try alg_hom.map_add

didn't work. I'll try have-ing the statement I want instead of rw-ing to it

Jalex Stark (Jun 28 2020 at 11:16):

ah i'm spewing nonsense

Mario Carneiro (Jun 28 2020 at 11:26):

I think the determinant of the empty matrix should not be defined

Mario Carneiro (Jun 28 2020 at 11:26):

which means that it should be 0 in lean

Kenny Lau (Jun 28 2020 at 11:30):

the determinant of the empty matrix is 1

Kenny Lau (Jun 28 2020 at 11:31):

S_0 has exactly one element, and the empty product is 1
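
(That is, from the Leibniz formula

$$\det A = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^{n} A_{i\,\sigma(i)},$$

for $$n = 0$$ the sum has exactly one term, the identity permutation of the empty set, whose sign is 1 and whose product is the empty product, so the determinant of the $$0 \times 0$$ matrix is $$1$$.)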

Mario Carneiro (Jun 28 2020 at 11:31):

Ah, I messed up the math on one of the definitions. https://math.stackexchange.com/questions/2130175/does-the-unique-map-on-the-zero-space-have-determinant-1 gives a number of independent reasons for the determinant to be defined and equal to 1

Mario Carneiro (Jun 28 2020 at 11:32):

Oh I misread Scott's #eval too, he measured the determinant of the 1x1 zero matrix

Scott Morrison (Jun 28 2020 at 11:47):

oops!

Scott Morrison (Jun 28 2020 at 12:27):

#3214 #3215 #3216 and #3217 are all mini-PRs coming out of cayley_hamilton.

Scott Morrison (Jun 28 2020 at 12:29):

The rest may have to wait until #3193 and #3213 have merged.

Scott Morrison (Jun 28 2020 at 12:29):

Although @Jalex Stark, do you want to make the elementary_matrix stuff a separate PR? It could get started now.

Scott Morrison (Jun 28 2020 at 12:29):

I think you want @[simp] on scalar_apply_eq and scalar_apply_ne, and corresponding @[simp] lemmas for elementary_matrix.

Reid Barton (Jun 28 2020 at 12:56):

I haven't read this carefully, but I just want to say (a) the determinant of the empty matrix is 1, (b) elementary matrix doesn't mean a matrix with a single nonzero entry.

Scott Morrison (Jun 28 2020 at 13:37):

Yes! (a) was eventually agreed, after I caused some confusion.

Scott Morrison (Jun 28 2020 at 13:39):

I also agree with (b), and I think we haven't renamed yet for lack of the right name.

Scott Morrison (Jun 28 2020 at 13:39):

What should these matrices be called?

Scott Morrison (Jun 28 2020 at 13:39):

maybe just basis_matrix?

Reid Barton (Jun 28 2020 at 13:45):

I was going to suggest something like std_basis_matrix, yeah

Reid Barton (Jun 28 2020 at 13:46):

partly in reference to the confusingly named std_basis (does it still exist?)

Kevin Buzzard (Jun 28 2020 at 16:28):

#eval matrix.det (λ (x y : punit), (0 : ℤ))

Wait, that's the det of the 1x1 matrix (0). I think CH is false for the 0 by 0 matrix

Kevin Buzzard (Jun 28 2020 at 16:29):

Sorry, late to the party. So we are agreed that CH fails for 0x0 matrices?

Jalex Stark (Jun 28 2020 at 18:54):

any polynomial evaluated at the empty matrix is the empty matrix, which is zero, so CH is true for 0x0 matrices

Jalex Stark (Jun 28 2020 at 18:56):

i like std_basis_matrix

Johan Commelin (Jun 29 2020 at 06:52):

@Scott Morrison I'm happy with the polynomial refactor. I left one tiny comment. If you agree, I'll hit it with the merge hammer.

Scott Morrison (Jun 29 2020 at 07:12):

I generalised that lemma, and put the PR on the queue.

Johan Commelin (Jun 29 2020 at 07:16):

Great!

Scott Morrison (Jun 29 2020 at 07:20):

It occurs to me that we are still missing map_smul there. I tried proving that, but it will be easier post #3213.

Scott Morrison (Jun 30 2020 at 10:14):

@Jalex Stark, I'm just slicing things up into PRs now. Do you want to do the honours on https://github.com/leanprover-community/mathlib/pull/new/std_basis_matrix

Jalex Stark (Jun 30 2020 at 10:16):

Go ahead without me, sorry. Working on mathlib PRs is quite addictive and I need to leave space in my head for my day job right now


Last updated: Dec 20 2023 at 11:08 UTC