Zulip Chat Archive
Stream: graph theory
Topic: Spectral Graph Theory
Rida Hamadani (Aug 01 2024 at 18:09):
I would like to get your input on how to state theorems regarding eigenvalues/eigenvectors of graphs. What version of eigenvalues do you think works best? Many proofs in graph theory depend on the spectral theorem, so it would be nice to do it in a way that allows us to use docs#Matrix.IsHermitian.spectral_theorem.
For instance, how would you rewrite the following theorem (from mathlib) using eigenvectors?
theorem adjMatrix_mulVec_const_apply_of_regular [NonAssocSemiring α] {d : ℕ} {a : α}
    (hd : G.IsRegularOfDegree d) {v : V} : (G.adjMatrix α *ᵥ Function.const _ a) v = d * a := by
  simp [hd v]
(If G is regular, then the all-ones vector is an eigenvector of the adjacency matrix).
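For concreteness, one possible spelling through Mathlib's eigenvector API, only a sketch: it specialises to ℝ, views the matrix as a linear map via Matrix.mulVecLin, uses an anonymous example, and leaves the proof as a sorry.
import Mathlib

variable {V : Type*} [Fintype V] (G : SimpleGraph V) [DecidableRel G.Adj]

-- Sketch: if G is d-regular, the all-ones vector is an eigenvector of the adjacency matrix
-- (viewed as an endomorphism of V → ℝ) with eigenvalue d.  `Nonempty V` is needed so that
-- the all-ones vector is nonzero.  Proof omitted.
example [Nonempty V] {d : ℕ} (hd : G.IsRegularOfDegree d) :
    Module.End.HasEigenvector ((G.adjMatrix ℝ).mulVecLin) (d : ℝ) (Function.const V 1) := by
  sorry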
Rida Hamadani (Aug 02 2024 at 21:38):
Does the following approach seem best, or should I use a more general definition for eigenvalues?
import Mathlib.Combinatorics.SimpleGraph.AdjMatrix
import Mathlib.LinearAlgebra.Matrix.Spectrum
open Finset Matrix SimpleGraph
namespace SimpleGraph
variable {V : Type*} (α : Type*)
variable (G : SimpleGraph V) [DecidableRel G.Adj]
theorem isHermitian_adjMatrix [RCLike α] [DecidableEq V] : (G.adjMatrix α).IsHermitian := by
  ext
  rw [conjTranspose, RCLike.star_def, transpose_adjMatrix, map_apply, adjMatrix_apply,
    RingHom.map_ite_one_zero]
theorem regular_iff_degree_eigenvalue [RCLike α] [Fintype V] [DecidableEq V] {d : ℕ} :
    G.IsRegularOfDegree d ↔ ∃ v : V, (G.isHermitian_adjMatrix α).eigenvalues v = d := by
  sorry

end SimpleGraph
What do you think about having a SimpleGraph/Spectrum.lean file that would depend on AdjMatrix.lean, while files like StronglyRegular.lean would depend on it? It would contain a new definition of eigenvalues (because graph theorists usually say "eigenvalue of a graph" rather than "eigenvalue of an adjacency matrix of a graph"), together with an API for them.
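For concreteness, a sketch of what such a file could start with. The names adjEigenvalues and isHermitian_adjMatrix' are hypothetical, the Hermitian lemma is only restated (specialised to ℝ, with a sorry) to keep the snippet self-contained, and none of this is meant as a final design.
import Mathlib

namespace SimpleGraph

variable {V : Type*} [Fintype V] [DecidableEq V] (G : SimpleGraph V) [DecidableRel G.Adj]

-- The Hermitian fact proved earlier in this thread, specialised to ℝ and left as a sorry here.
theorem isHermitian_adjMatrix' : (G.adjMatrix ℝ).IsHermitian := by
  sorry

/-- Hypothetical definition: the "eigenvalues of `G`" as the spectrum of its real adjacency
matrix, reusing `Matrix.IsHermitian.eigenvalues`. -/
noncomputable def adjEigenvalues : V → ℝ :=
  (G.isHermitian_adjMatrix').eigenvalues

-- The definition merely unfolds to the matrix-level one, so the existing matrix API stays
-- available; graph-specific lemmas would then be layered on top.
theorem adjEigenvalues_eq :
    G.adjEigenvalues = (G.isHermitian_adjMatrix').eigenvalues :=
  rfl

end SimpleGraph
A bridge like adjEigenvalues_eq (or a simp attribute on the definition) would be what lets the old matrix API be reused by unfolding, which seems to be the design question here.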
Kim Morrison (Aug 02 2024 at 22:38):
I don't think that people saying "eigenvalue of a graph" vs "eigenvalue of an adjacency matrix of a graph" is in itself a sufficient reason to do the same in Lean. There's a cost to duplicating API, and the cost of adding .adjMatrix is not that high, and these costs should be weighed in terms of the Lean idiom that results, not the informal idiom.
Shreyas Srinivas (Aug 02 2024 at 23:33):
Rida Hamadani said:
I would like to get your input on how to state theorems regarding eigenvalues/eigenvectors of graphs. What version of eigenvalues do you think works best? Many proofs in graph theory depend on the spectral theorem, so it would be nice to do it in a way that allows us to use docs#Matrix.IsHermitian.spectral_theorem.
For instance, how would you rewrite the following theorem (from mathlib) using eigenvectors?
theorem adjMatrix_mulVec_const_apply_of_regular [NonAssocSemiring α] {d : ℕ} {a : α}
    (hd : G.IsRegularOfDegree d) {v : V} : (G.adjMatrix α *ᵥ Function.const _ a) v = d * a := by
  simp [hd v]
(If G is regular, then the all-ones vector is an eigenvector of the adjacency matrix).
Highly depends on what you want. For example, the eigenvalues of the normalised Laplacian matrix are super useful in some places.
Shreyas Srinivas (Aug 02 2024 at 23:34):
The Cheeger inequality is stated w.r.t. the eigenvalues of the Laplacian matrix.
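For reference, with $\lambda_2$ the second-smallest eigenvalue of the normalised Laplacian and $h(G)$ the Cheeger constant (conductance) of $G$, the inequality usually meant here is, roughly,

$$\frac{\lambda_2}{2} \;\le\; h(G) \;\le\; \sqrt{2\lambda_2}.$$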
Shreyas Srinivas (Aug 02 2024 at 23:40):
so what does "eigenvalues of a graph" refer to?
Rida Hamadani (Aug 03 2024 at 01:48):
I see, those are good points. Thank you.
Shreyas Srinivas said:
so what does "eigenvalues of a graph" refer to?
From what I've seen, it refers to the adjacency matrix spectrum. However, I wouldn't be surprised if it is used for the Laplacian in some of the literature.
Shreyas Srinivas said:
Highly depends on what you want.
As a first step, I would like to prove a few theorems about strongly regular graphs that (like most theorems about SRGs) require the use of the eigenvalues of the adjacency matrix.
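For context, the standard fact these arguments start from is the identity A^2 = k*I + ℓ*A + μ*(J - I - A) for an (n, k, ℓ, μ) strongly regular graph with adjacency matrix A and all-ones matrix J. One possible Lean spelling, only a sketch: stated over ℝ as an anonymous example, with the proof left as a sorry.
import Mathlib

variable {V : Type*} [Fintype V] [DecidableEq V] (G : SimpleGraph V) [DecidableRel G.Adj]

-- Sketch: for G strongly regular with parameters (n, k, ℓ, μ), the adjacency matrix satisfies
-- A² + (μ - ℓ)·A + (μ - k)·I = μ·J, where J is the all-ones matrix.  Proof omitted.
example {n k ℓ μ : ℕ} (h : G.IsSRGWith n k ℓ μ) :
    (G.adjMatrix ℝ) ^ 2 + ((μ : ℝ) - ℓ) • G.adjMatrix ℝ + ((μ : ℝ) - k) • (1 : Matrix V V ℝ) =
      (μ : ℝ) • Matrix.of (fun _ _ => (1 : ℝ)) := by
  sorry
The usual eigenvalue computations for SRGs then follow from this identity together with the spectral theorem.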
Rida Hamadani (Aug 03 2024 at 02:12):
Kim Morrison said:
There's a cost to duplicating API.
The eigenvalues of a graph would be defined in terms of the existing definition of eigenvalues, so that the old API can be used by unfolding. What I meant is not duplicating the existing API, but creating one with new lemmas that are specific to graphs.
This way, theorems that use adjacency matrices without needing the spectrum will not need to import the extra files that eigenvalues need.
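As one concrete example of such a graph-specific lemma (a hypothetical statement, taking the Hermitian fact as a hypothesis to keep the snippet self-contained, with the proof left as a sorry): the adjacency matrix has zero diagonal, so its eigenvalues sum to zero.
import Mathlib

variable {V : Type*} [Fintype V] [DecidableEq V] (G : SimpleGraph V) [DecidableRel G.Adj]

-- Sketch: the trace of the adjacency matrix is 0 (no loops), so its eigenvalues sum to 0.
example (hG : (G.adjMatrix ℝ).IsHermitian) :
    Finset.univ.sum hG.eigenvalues = 0 := by
  sorry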
Patrick Massot (Aug 03 2024 at 09:54):
In case you didn’t see it, there is a sample use of spectral methods in https://leanprover-community.github.io/mathlib4_docs/Archive/Sensitivity.html
Shreyas Srinivas (Aug 03 2024 at 13:51):
Rida Hamadani said:
I see, those are good points. Thank you.
Shreyas Srinivas said:
so what does "eigenvalues of a graph" refer to?
From what I've seen, it refers to the adjacency matrix spectrum. However, I wouldn't be surprised if it is used for the Laplacian in some of the literature.
Shreyas Srinivas said:
Highly depends on what you want.
As a first step, I would like to prove a few theorems about strongly regular graphs that (like most theorems about SRGs) require the use of the eigenvalues of the adjacency matrix.
My point is that there is no single notion of "the eigenvalues of a graph". There are eigenvalues of the adjacency matrix, useful for studying random walks; eigenvalues of the normalised Laplacian matrix, for studying cut structure; and so on.
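For reference, writing $A$ for the adjacency matrix and $D$ for the diagonal degree matrix (with $D$ invertible, i.e. no isolated vertices), the spectra most commonly meant are those of

$$A, \qquad L = D - A, \qquad \mathcal{L} = I - D^{-1/2} A D^{-1/2}, \qquad W = D^{-1} A,$$

i.e. the adjacency matrix, the combinatorial Laplacian, the normalised Laplacian, and the random-walk matrix, which coincide up to affine rescaling when the graph is regular.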
Shreyas Srinivas (Aug 03 2024 at 13:56):
This is not some case of an isolated part of the literature doing something niche. This is all in mainstream spectral graph theory.
Rida Hamadani (Aug 03 2024 at 14:05):
What do you think is the best way to deal with this? Maybe not defining "eigenvalues of a graph" at all is a sufficient solution, and we should instead use the eigenvalues of whichever matrix is needed?
Yaël Dillies (Aug 03 2024 at 14:06):
Yes, I believe that's precisely the point that Shreyas was making. There is no such thing as "eigenvalues of a graph" in the literature, only "eigenvalues of such-and-such a matrix associated to a graph", so the Lean code should mirror that.
Shreyas Srinivas (Aug 03 2024 at 14:35):
A couple of good sources for this subject:
Daniel Spielman's book: http://cs-www.cs.yale.edu/homes/spielman/sagt/sagt.pdf
Shreyas Srinivas (Aug 03 2024 at 14:36):
and the first few lectures of Lap Chi Lau's course notes: https://cs.uwaterloo.ca/~lapchi/cs860/notes.html
Peter Nelson (Aug 07 2024 at 03:05):
Also Godsil and Royle.
Peter Nelson (Aug 07 2024 at 23:41):
Yaël Dillies said:
Yes, I believe that's precisely the point that Shreyas was making. There is no such thing as "eigenvalues of a graph" in the literature, only "eigenvalues of such-and-such a matrix associated to a graph", so the Lean code should mirror that.
The phrase 'eigenvalues of G' does have a very standard default meaning: the eigenvalues of the adjacency matrix of G. Usually the variants are disambiguated, e.g. 'Laplacian eigenvalues'.
Given the concerns discussed here, I agree that it does make sense to just phrase things in terms of the matrix in mathlib, but I don’t think that ‘this is how it’s done in the literature’ is an argument in favour of this, because it’s not true.
Shreyas Srinivas (Aug 08 2024 at 03:20):
Not true. The phrase "eigenvalues of a graph" is context-dependent, and the context depends on whom you ask or what resource you read.
Rida Hamadani (Aug 08 2024 at 05:52):
In my experience, when a lecturer/author says "eigenvalue of a graph", everybody assumes "of an adjacency matrix"; if they want to use that phrase to mean something else, they have to specify that first.
Rida Hamadani (Aug 08 2024 at 05:53):
But I agree that it is a good idea to always specify the matrix for our purposes.
Peter Nelson (Aug 09 2024 at 11:22):
It is context-dependent for sure. I’m just saying that if you look at standard sources, there is a default context.