Zulip Chat Archive

Stream: Is there code for X?

Topic: charpoly and eigenvalues


Johan Commelin (Feb 09 2024 at 19:58):

It seems like there is no line in mathlib that mentions charpoly and eigen in one breath.
Do we have the tools to prove things like

import Mathlib

namespace LinearMap

variable {R M : Type*}
variable [Field R] [AddCommGroup M] [Module R M] -- maybe `IsDomain R` is enough?
variable [Module.Finite R M] [Module.Free R M]
variable (φ : M →ₗ[R] M)

open FiniteDimensional Polynomial

lemma charpoly_eq_X_pow_iff :
    φ.charpoly = X ^ finrank R M ↔ ∀ m : M, ∃ (n : ℕ), (φ ^ n) m = 0 := by
  constructor
  · intro h m
    use finrank R M
    suffices φ ^ finrank R M = 0 by simp only [this, LinearMap.zero_apply]
    simpa only [h, map_pow, aeval_X] using φ.aeval_self_charpoly
  · sorry

lemma charpoly_constantCoeff_eq_zero_iff :
    constantCoeff φ.charpoly = 0 ↔ ∃ (m : M), m ≠ 0 ∧ φ m = 0 := by
  sorry

end LinearMap

Kevin Buzzard (Feb 09 2024 at 20:09):

Are these true? Multiplication by 2 is nilpotent on $\mathbf{Z}/4\mathbf{Z}$ but has char poly $X - 2$.
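
Spelling out the example: multiplication by 2 on $\mathbf{Z}/4\mathbf{Z}$ is the $1 \times 1$ matrix $(2)$, so

$$\chi(X) = \det(X - 2) = X - 2, \qquad \text{while } (2\cdot)^2 = 4\cdot{} = 0 \text{ on } \mathbf{Z}/4\mathbf{Z},$$

so the map is nilpotent even though its charpoly is not a power of $X$.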

Johan Commelin (Feb 09 2024 at 20:19):

Ooh, please feel free to assume that R is a field, or something like that.

Johan Commelin (Feb 09 2024 at 20:20):

I suppose that domain should be enough

Johan Commelin (Feb 09 2024 at 20:33):

Mathlib/LinearAlgebra/Matrix/Charpoly/Eigs.lean has the header

/-!
# Eigenvalues are characteristic polynomial roots.

but that's promising more than it delivers :sad:

Junyan Xu (Feb 09 2024 at 20:40):

There is docs#Matrix.isNilpotent_charpoly_sub_pow_of_isNilpotent (so it seems IsReduced is enough) but it's stated for a Matrix rather than a LinearMap. (But LinearMap.charpoly is defined using Matrix.charpoly anyway ...)
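
To make the "IsReduced is enough" remark concrete, here is a sketch of the intended consequence for matrices, stated with sorry; it is not claimed to be an existing mathlib lemma:

import Mathlib

open Polynomial

-- Sketch only: over a reduced commutative ring, a nilpotent matrix should have
-- characteristic polynomial `X ^ n`, since `charpoly - X ^ n` is nilpotent in `R[X]`
-- and a reduced ring has no nonzero nilpotents.
example {n : Type*} [DecidableEq n] [Fintype n] {R : Type*} [CommRing R] [IsReduced R]
    {M : Matrix n n R} (hM : IsNilpotent M) :
    M.charpoly = X ^ Fintype.card n := by
  sorry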

Riccardo Brasca (Feb 09 2024 at 21:05):

I had a student who started the proof that the eigenvalues are the roots of the characteristic polynomial (over a field), but he never finished. Anyway, I had the impression that all the ingredients were already there a year ago.

Riccardo Brasca (Feb 09 2024 at 21:06):

There is a discussion somewhere about this (on mobile right now), but it wasn't clear in which generality we want this in mathlib (the analysis people want continuous spectrum and things like that).

Anatole Dedecker (Feb 09 2024 at 22:46):

I don't think there is any hope of completely unifying the algebraic and analytic theories though? Especially in cases like this, which work over any field/good enough ring.

Antoine Chambert-Loir (Feb 10 2024 at 11:49):

It's really unlikely that a full unification is possible. You can define (maybe mathlib already defines) the spectrum of an element $a$ of an $R$-algebra $A$ as the set of $r \in R$ such that $a - r$ is not invertible. But if $R = \mathbf{R}$ is the field of real numbers, you may wish to upgrade this to a subset of the field of complex numbers $\mathbf{C}$; or, if you're scheme-minded, you may wish to have the spectrum as a functor: for any $R$-algebra $S$, the spectrum of $a \otimes 1 \in A \otimes_R S$, etc. On the other hand, the initial definition (spectrum of an element of an algebra) allows one to define all the others, so one can start with it. And then, in various contexts, try to understand it.

Riccardo Brasca (Feb 10 2024 at 12:07):

This was the conversation, but it seems I misremembered: everybody agreed to do the finite dimensional case.

Anatole Dedecker (Feb 10 2024 at 12:56):

Antoine Chambert-Loir said:

It's really unlikely that a full unification is possible. You can define (maybe mathlib already defines) the spectrum of an element $a$ of an $R$-algebra $A$ as the set of $r \in R$ such that $a - r$ is not invertible. But if $R = \mathbf{R}$ is the field of real numbers, you may wish to upgrade this to a subset of the field of complex numbers $\mathbf{C}$; or, if you're scheme-minded, you may wish to have the spectrum as a functor: for any $R$-algebra $S$, the spectrum of $a \otimes 1 \in A \otimes_R S$, etc. On the other hand, the initial definition (spectrum of an element of an algebra) allows one to define all the others, so one can start with it. And then, in various contexts, try to understand it.

I think we already have this level of unification: docs#spectrum is the first definition you give. And it seems to be used in the algebraic setting as well, although maybe not as much as it could be.
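
Concretely, a small check of that claim (assuming docs#spectrum.mem_iff is stated as below):

import Mathlib

-- `spectrum R a` is the set Antoine describes: the scalars `r` such that
-- `algebraMap R A r - a` is not invertible in `A`.
example {R A : Type*} [CommSemiring R] [Ring A] [Algebra R A] (a : A) (r : R) :
    r ∈ spectrum R a ↔ ¬ IsUnit (algebraMap R A r - a) :=
  spectrum.mem_iff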

Johan Commelin (Feb 10 2024 at 13:14):

Btw, besides the Lean statements I posted above, I really need to know that the rank of the generalized eigenspace of 0 is the minimal index of a nonzero coefficient in the charpoly.

Antoine Chambert-Loir (Feb 10 2024 at 15:15):

Does mathlib know about multiplicativity of characteristic polynomials in extensions? (aka charpoly of upper triangular block matrices)
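
The fact being asked for, written out (standard linear algebra; no mathlib name is claimed here):

$$\chi_{\begin{pmatrix} A & B \\ 0 & D \end{pmatrix}}(X) = \det\begin{pmatrix} X I - A & -B \\ 0 & X I - D \end{pmatrix} = \chi_A(X)\,\chi_D(X),$$

equivalently, for a $T$-invariant subspace $W \subseteq V$ one has $\chi_T = \chi_{T|_W} \cdot \chi_{\overline{T}}$, where $\overline{T}$ is the induced endomorphism of $V/W$.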

Junyan Xu (Feb 10 2024 at 15:28):

Coordinate-free proof: let $T$ be an endomorphism of a finite dimensional vector space $V$, and let $n$ be the index where $\ker T^n$ stabilizes (docs#monotone_stabilizes_iff_noetherian). Then we have $\ker T^n \cap \operatorname{im} T^n = 0$ because $\ker T^{2n} = \ker T^n$, and therefore $V$ decomposes into a direct sum of $T$-invariant subspaces $\ker T^n \oplus \operatorname{im} T^n$ because the dimensions agree. $T$ acts nilpotently on the first space and invertibly on the second space (because $\operatorname{im} T^{n+1} = \operatorname{im} T^n$; not sure what's the best way to show this: you could use that $\operatorname{im} T^{n+1}$ is a complement of $\ker T^{n+1}$, or use docs#monotone_stabilizes_iff_artinian to pick $n$ such that $\operatorname{im}$ also stabilizes). The charpoly on the first space is $X^{\dim \ker T^n}$ (the generalized eigenspace), while on the second space the charpoly has nonzero constant term (determinant), but we seem to be missing the lemma saying the charpoly on the whole space is the product of the two charpolys (also can't find the corresponding statement for block matrices).

I recall seeing such an argument in commutative algebra but forgot where.
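
In display form, the decomposition in the argument above (restating the message, and granting the missing multiplicativity lemma):

$$V = \ker T^n \oplus \operatorname{im} T^n, \qquad \chi_T = \chi_{T|_{\ker T^n}} \cdot \chi_{T|_{\operatorname{im} T^n}} = X^{\dim \ker T^n} \cdot q(X) \quad \text{with } q(0) \neq 0.$$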

Antoine Chambert-Loir (Feb 10 2024 at 17:18):

That's the proof I wanted to give. And it also shows that the charpoly and the minpoly have the same irreducible factors.

Antoine Chambert-Loir (Feb 10 2024 at 17:19):

There is docs#Matrix.det_of_upperTriangular, docs#Matrix.det_of_lowerTriangular, and docs#Matrix.det_blockDiagonal.

Antoine Chambert-Loir (Feb 10 2024 at 17:26):

(A general fact: $\ker(T^n)$ stabilizes exactly when $\operatorname{im}(T^n)$ stabilizes, exactly when their intersection is 0.)

Antoine Chambert-Loir (Feb 10 2024 at 17:30):

Using the direct sum decomposition is nice. What I wanted to do is slightly different. Given a decomposition of the minimal polynomial as a product $P_1 \dots P_r$ of irreducible polynomials, I consider the filtration of the space by $0 \subseteq \ker(P_1) \subseteq \ker(P_1 P_2) \subseteq \dots \subseteq \ker(P_1 \cdots P_r)$. By definition, each successive quotient is nontrivial, and its minimal polynomial is $P_i$, so that its characteristic polynomial is a power of $P_i$. (This fact is trivial if $P_i$ has degree $1$, a bit less so in general.)
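
In symbols, with $T_i$ the endomorphism induced on the $i$-th successive quotient and $e_i$ some positive exponents (notation introduced here, and granting the multiplicativity of the charpoly along the filtration):

$$0 \subseteq \ker P_1(T) \subseteq \ker (P_1P_2)(T) \subseteq \dots \subseteq \ker (P_1\cdots P_r)(T) = V, \qquad \chi_T = \prod_{i=1}^{r} \chi_{T_i} = \prod_{i=1}^{r} P_i^{\,e_i}.$$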

Junyan Xu (Feb 10 2024 at 18:07):

If you want the fact that the characteristic polynomial divides a power of the minimal polynomial, it's probably easiest to decompose the $k[X]$-module $V$ as a direct sum of cyclic modules using docs#Module.equiv_directSum_of_isTorsion (the structure theorem for modules over a PID). The charpoly and minpoly agree on each cyclic module (no need for an explicit determinant computation; note that the minpoly divides the charpoly and has degree equal to the dimension of the module), but on a direct sum the charpoly is the product while the minpoly is the lcm.
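
Spelled out, with $q_1, \dots, q_s$ the (monic, nonconstant) generators of the cyclic summands and $\mu_T$ the minimal polynomial (notation introduced here):

$$V \cong \bigoplus_{i=1}^{s} k[X]/(q_i) \quad \text{as } k[X]\text{-modules}, \qquad \chi_T = \prod_{i} q_i, \qquad \mu_T = \operatorname{lcm}_i\, q_i,$$

and since each $q_i$ divides $\mu_T$, the charpoly $\chi_T$ divides $\mu_T^{\,s}$.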

Antoine Chambert-Loir (Feb 10 2024 at 18:36):

(It depends on what you mean by “is easier”.) The decomposition of a $k[X]$-module $V$ as a direct sum of cyclic modules gives the Frobenius decomposition, which mathlib doesn't seem to know, and if you have this, then you know everything about the characteristic polynomial and the minimal polynomial. (For a cyclic endomorphism, both are equal; the rest is counting.)

Antoine Chambert-Loir (Feb 10 2024 at 18:38):

In the approach by filtrations as above, you are freed from the task of finding stable complementary subspaces. So technically, this is much easier. This technique also generalizes to the study of modules over one-dimensional integral domains (when they are not integrally closed, which is often the case).

Johan Commelin (Feb 12 2024 at 17:38):

Kevin pointed out that my original question wasn't true for arbitrary comm rings.
Does the following claim have any chance of being true?

import Mathlib

namespace LinearMap

variable {R M : Type*}
variable [CommRing R] [AddCommGroup M] [Module R M]
variable [Module.Finite R M] [Module.Free R M]
variable (φ : Module.End R M)

open FiniteDimensional Polynomial

lemma natTrailingDegree_le_finrank_maximalGeneralizedEigenspace :
    natTrailingDegree (φ.charpoly) ≤ finrank R (φ.maximalGeneralizedEigenspace 0) := by
  sorry

end LinearMap

Johan Commelin (Feb 12 2024 at 17:49):

And what are reasonable conditions on R for equality to hold? (Note that M has to be finite free over R, otherwise charpoly isn't defined.)
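
For instance, revisiting Kevin's example above: for multiplication by 2 on $R = M = \mathbf{Z}/4\mathbf{Z}$, the charpoly is $X - 2$, so

$$\operatorname{natTrailingDegree}(X - 2) = 0, \qquad \operatorname{finrank}_R (\text{maximal generalized eigenspace of } 0) = 1,$$

since multiplication by 2 is nilpotent and the generalized eigenspace is all of $M$; so equality already fails over $\mathbf{Z}/4\mathbf{Z}$.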

Damiano Testa (Feb 12 2024 at 17:52):

I have not thought about this yet, but natTrailingDegree (charpoly) makes me think of revcharpoly docs#Matrix.charpolyRev instead...

Damiano Testa (Feb 12 2024 at 18:04):

What is the characteristic polynomial of multiplication by 2 on the direct sum of 2 copies of Z/4Z?

Damiano Testa (Feb 12 2024 at 18:06):

Ah, I was confused. I do not have a counterexample.
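
Presumably because, over $\mathbf{Z}/4\mathbf{Z}$,

$$(X - 2)^2 = X^2 - 4X + 4 = X^2,$$

so on two copies of $\mathbf{Z}/4\mathbf{Z}$ the charpoly of multiplication by 2 is just $X^2 = X^{\operatorname{finrank}}$, and its trailing degree matches the rank of the (full) generalized eigenspace of 0.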

Damiano Testa (Feb 12 2024 at 18:27):

Maybe a 2x2 diagonal matrix with entries 2 and 3 in Z/6? :shrug:


Last updated: May 02 2025 at 03:31 UTC