Zulip Chat Archive

Stream: general

Topic: Definitional commutativity


Daniel Weber (Feb 15 2025 at 07:37):

Is it possible to define boolean operators such that commutativity is definitional? Concretely, I'm looking for something which could be used for Or' here:

def Or' (p q : Prop) : Prop := sorry

theorem Or'_eq_Or : Or' = Or := sorry

variable (p q : Prop)

#guard_expr Or' p q =~ Or' q p

Kevin Buzzard (Feb 15 2025 at 08:14):

Why?

A few years ago when we were trying to figure out how to do homological algebra in lean a related question came up: could we redefine addition on the naturals and integers such that associativity was rfl? I think the conclusion was that even if it were possible (and I don't think anyone ever managed to do it) the question was moot anyway because core would not be accepting such a PR.

Luigi Massacci (Feb 15 2025 at 09:01):

Now I’m genuinely curious as to how that came up while doing homological algebra

Markus Himmel (Feb 15 2025 at 09:20):

You can try to write down the definition of a chain complex and see for yourself where it goes wrong :) There have been many threads on Zulip about this over the years. The first I could find is #general > morphisms between objects that are "equal" @ 💬 . Mathlib has a rather elegant solution for this; see the module doc for file#Algebra/Homology/ComplexShape.
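
The gist of that solution, paraphrased from memory (see the linked file for the real code): differentials are indexed by a relation rather than by i ↦ i + 1, so no arithmetic on indices ever needs to reduce definitionally.

-- Rough shape of Mathlib's `ComplexShape` (details may differ from the actual file):
structure ComplexShape (ι : Type) where
  Rel : ι → ι → Prop                                 -- "there is a differential from i to j"
  next_eq : ∀ {i j j'}, Rel i j → Rel i j' → j = j'  -- each i relates to at most one j
  prev_eq : ∀ {i i' j}, Rel i j → Rel i' j → i = i'  -- each j is related from at most one i

-- A homological complex over such a shape `c` then carries a map `d i j : X i ⟶ X j`
-- for every pair of indices, required to be zero whenever `¬ c.Rel i j`, so compositions
-- like `d i j ≫ d j k` always typecheck and no `G (i - 1 + 1)` ever appears.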

Niels Voss (Feb 15 2025 at 09:22):

This is super informal, so take it with a grain of salt, but if such an Or' could exist, wouldn't that mean that both Or' p q and Or' q p reduce to the same expression E[p,q] depending on p and q? This would mean that E[p,q] and E[q,p] are definitionally the same, but since they are both irreducible, if they are syntactically different this would mean that Or' p q has essentially two normal forms. So either E[p,q] is syntactically symmetric with respect to p and q, or Or' p q has two normal forms. Either of these options seems really strange to me, so I'd be surprised if you could find a way to make this work.

Daniel Weber (Feb 15 2025 at 09:29):

Kevin Buzzard said:

Why?

A few years ago when we were trying to figure out how to do homological algebra in lean a related question came up: could we redefine addition on the naturals and integers such that associativity was rfl? I think the conclusion was that even if it were possible (and I don't think anyone ever managed to do it) the question was moot anyway because core would not be accepting such a PR.

I'm just curious, not for any real usage

Riccardo Brasca (Feb 15 2025 at 10:30):

Luigi Massacci said:

Now I’m genuinely curious as to how that came up while doing homological algebra

Long story short: if you define a complex of groups (which mathematically is just a group G_i for every integer i and maps d_i : G_i → G_(i+1) such that the composition of two consecutive maps is zero) just as a function ℤ → GroupCat with the maps, then Lean complains that the composition of d_(i-1) with d_i is meaningless, because it doesn't realize that G_(i-1+1) and G_i are the same.
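
A toy version of the mismatch (ordinary type families instead of GroupCat, and made-up names):

-- `d i` raises the degree by one:
variable (G : Int → Type) (d : ∀ i, G i → G (i + 1))

-- `d (i - 1) x` lives in `G (i - 1 + 1)`, but `d i` expects an argument in `G i`,
-- and `i - 1 + 1` is not definitionally equal to `i`, so this is rejected:
-- example (i : Int) (x : G (i - 1)) : G (i + 1) := d i (d (i - 1) x)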

Riccardo Brasca (Feb 15 2025 at 10:31):

The same happens with various operations on complexes, and definitional equality would help.

Trebor Huang (Feb 15 2025 at 11:42):

Commutativity is impossible unless the type theory directly supports it (for example universe levels are definitionally commutative, but obviously that can't be leveraged here). Associativity can be done via the D-List technique. We define AssocNat as a function Nat -> Nat that commutes with successor. Then addition is just composition which is definitionally associative (there is that extra component, but Prop-valued types are definitional subsingletons so it doesn't matter). This should be generalizable to make definitionally associative categories via representable functors, but I haven't thought about it.
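
A minimal sketch of that difference-list construction (reconstructed here, so the names and details may differ from the technique described above):

-- An `AssocNat` is a `Nat → Nat` function that commutes with successor;
-- addition is function composition.
structure AssocNat where
  app : Nat → Nat
  comm_succ : ∀ n, app (n + 1) = app n + 1

def AssocNat.ofNat (n : Nat) : AssocNat :=
  ⟨fun m => n + m, fun _ => rfl⟩

def AssocNat.add (a b : AssocNat) : AssocNat :=
  ⟨fun n => a.app (b.app n), fun n => by
    show a.app (b.app (n + 1)) = a.app (b.app n) + 1
    rw [b.comm_succ, a.comm_succ]⟩

-- Associativity is `rfl`: composition of functions is definitionally associative,
-- and the `comm_succ` components agree by proof irrelevance (they live in `Prop`).
theorem AssocNat.add_assoc (a b c : AssocNat) :
    (a.add b).add c = a.add (b.add c) := rfl

-- Commutativity, by contrast, is not `rfl`.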

Kevin Buzzard (Feb 16 2025 at 15:43):

I think the OG thread where associativity was first mentioned was this one. If you have a graded algebra graded by pieces A(n) and multiplications A(i) × A(j) → A(i+j), then stating associativity in the naive way results in a type mismatch, as A((i+j)+k) and A(i+(j+k)) aren't defeq.
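
The same point in code form (made-up names; the commented statement is rejected because its two sides have different types):

variable (A : Nat → Type) (mul : ∀ i j, A i → A j → A (i + j))

-- The left-hand side lives in `A ((i + j) + k)` and the right-hand side in
-- `A (i + (j + k))`, which are not defeq, so this doesn't even typecheck:
-- example (i j k : Nat) (a : A i) (b : A j) (c : A k) :
--     mul (i + j) k (mul i j a b) c = mul i (j + k) a (mul j k b c) := sorry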

Alex Meiburg (Sep 05 2025 at 17:24):

I came across this thread and this implementation. They also define multiplication, and you get a definitionally true a * 1 = a and a * 0 = 0 (much like on Nat), but the other sides don't work.

But I realized you can do better: by keeping around a higher-order multiplication function, you can get a definitionally associative multiplication too, _and_ you get that 1 * a = a * 1 = a definitionally, _and_ get definitional distributivity! At least, left distributivity (one side only).

Demo

I think there's a pattern here: if you keep functions bundled with the type (and then use an appropriate property to enforce their compatibility), you can get more definitional equality. E.g. Nat is just the raw data, the AssocNat above bundles the addition function, and then here you bundle a multiplication function too.

But I have to wonder, if there's an even better way to do it, that gets you zero_mul and right_distrib too? I don't see any way to get commutativity, that feels out of scope.
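
A simplified sketch of that bundling pattern, using a Church-style encoding (illustrative only: this is not the linked Demo, the compatibility proofs relating things back to Nat are omitted, and the names are made up):

-- A Church-style numeral: the "higher-order function" is the data itself.
def ChurchNat := ∀ {α : Type}, (α → α) → α → α

def ChurchNat.ofNat : Nat → ChurchNat
  | 0 => fun _ x => x
  | n + 1 => fun f x => f (ChurchNat.ofNat n f x)

def ChurchNat.one : ChurchNat := fun f x => f x
def ChurchNat.add (m n : ChurchNat) : ChurchNat := fun f x => m f (n f x)
-- Multiplication is composition at the higher-order level:
def ChurchNat.mul (m n : ChurchNat) : ChurchNat := fun f => m (n f)

-- Associativity, a two-sided identity, and distributivity on one side are all `rfl`:
example (a b c : ChurchNat) : (a.mul b).mul c = a.mul (b.mul c) := rfl
example (a : ChurchNat) : ChurchNat.one.mul a = a := rfl
example (a : ChurchNat) : a.mul ChurchNat.one = a := rfl
example (a b c : ChurchNat) : (a.add b).mul c = (a.mul c).add (b.mul c) := rfl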

Aaron Liu (Sep 05 2025 at 17:36):

interesting

Aaron Liu (Sep 05 2025 at 17:54):

I think getting definitional a * 0 = 0 and 0 * a = 0 simultaneously would let you define a symmetric AND operation with false AND x = false and x AND false = false both holding definitionally, which I'm pretty sure is impossible

Robin Arnez (Sep 05 2025 at 19:07):

Yeah I think you can't have a definitionally left and right absorbing element (for a nontrivial operator) but you can get a Bool that has many useful properties:

structure WeirdBool where
  choose : Bool → Bool → Bool
  forall_choose_eq_or : (∀ a b, choose a b = a) ∨ (∀ a b, choose a b = b)

protected def WeirdBool.false : WeirdBool := ⟨fun a _ => a, .inl fun _ _ => rfl⟩
protected def WeirdBool.true : WeirdBool := ⟨fun _ b => b, .inr fun _ _ => rfl⟩
protected def WeirdBool.and (x y : WeirdBool) : WeirdBool :=
  ⟨fun a b => x.1 a (y.1 a b), by have := x.2; have := y.2; grind⟩
protected def WeirdBool.or (x y : WeirdBool) : WeirdBool :=
  ⟨fun a b => x.1 (y.1 a b) b, by have := x.2; have := y.2; grind⟩
protected def WeirdBool.xor (x y : WeirdBool) : WeirdBool :=
  ⟨fun a b => x.1 (y.1 a b) (y.1 b a), by have := x.2; have := y.2; grind⟩
protected def WeirdBool.not (x : WeirdBool) : WeirdBool :=
  ⟨fun a b => x.1 b a, by have := x.2; grind⟩
protected def WeirdBool.toBool (x : WeirdBool) : Bool := x.choose false true

protected theorem WeirdBool.false_and (x : WeirdBool) : WeirdBool.false.and x = .false := rfl
protected theorem WeirdBool.true_and (x : WeirdBool) : WeirdBool.true.and x = x := rfl
protected theorem WeirdBool.and_true (x : WeirdBool) : x.and .true = x := rfl
protected theorem WeirdBool.and_assoc (x y z : WeirdBool) : (x.and y).and z = x.and (y.and z) := rfl

protected theorem WeirdBool.true_or (x : WeirdBool) : WeirdBool.true.or x = .true := rfl
protected theorem WeirdBool.false_or (x : WeirdBool) : WeirdBool.false.or x = x := rfl
protected theorem WeirdBool.or_false (x : WeirdBool) : x.or .false = x := rfl
protected theorem WeirdBool.or_assoc (x y z : WeirdBool) : (x.or y).or z = x.or (y.or z) := rfl

protected theorem WeirdBool.xor_true (x : WeirdBool) : x.xor .true = x.not := rfl
protected theorem WeirdBool.xor_false (x : WeirdBool) : x.xor .false = x := rfl
protected theorem WeirdBool.true_xor (x : WeirdBool) : WeirdBool.true.xor x = x.not := rfl
protected theorem WeirdBool.false_xor (x : WeirdBool) : WeirdBool.false.xor x = x := rfl
protected theorem WeirdBool.xor_assoc (x y z : WeirdBool) : (x.xor y).xor z = x.xor (y.xor z) := rfl

protected theorem WeirdBool.not_false : WeirdBool.false.not = .true := rfl
protected theorem WeirdBool.not_true : WeirdBool.true.not = .false := rfl
protected theorem WeirdBool.not_or (x y : WeirdBool) : (x.or y).not = x.not.and y.not := rfl
protected theorem WeirdBool.not_and (x y : WeirdBool) : (x.and y).not = x.not.or y.not := rfl
protected theorem WeirdBool.not_not (x : WeirdBool) : x.not.not = x := rfl
protected theorem WeirdBool.not_xor (x y : WeirdBool) : x.not.xor y = (x.xor y).not := rfl
protected theorem WeirdBool.xor_not (x y : WeirdBool) : x.xor y.not = (x.xor y).not := rfl
protected theorem WeirdBool.not_xor_not (x y : WeirdBool) : x.not.xor y.not = x.xor y := rfl

protected theorem WeirdBool.toBool_false : WeirdBool.false.toBool = false := rfl
protected theorem WeirdBool.toBool_true : WeirdBool.true.toBool = true := rfl

Sadly, no distributivity laws (and forget about commutativity and idempotence).
Actually, can there be a nontrivial idempotent operator?

Robin Arnez (Sep 05 2025 at 19:12):

I guess yeah that isn't very difficult:

def IdemTest := Bool × Bool deriving DecidableEq
def IdemTest.combine (x y : IdemTest) : IdemTest :=
  ⟨x.1, y.2⟩

theorem IdemTest.combine_self (x : IdemTest) : x.combine x = x := rfl
theorem IdemTest.combine_assoc (x y z : IdemTest) : (x.combine y).combine z = x.combine (y.combine z) := rfl
theorem IdemTest.combine_nontrivial₁ : ∃ x y : IdemTest, x.combine y ≠ x :=
  ⟨⟨false, false⟩, ⟨true, true⟩, by decide⟩
theorem IdemTest.combine_nontrivial₂ : ∃ x y : IdemTest, x.combine y ≠ y :=
  ⟨⟨false, false⟩, ⟨true, true⟩, by decide⟩

Alex Meiburg (Sep 05 2025 at 20:55):

Can one of you explain why having a two-sided absorbing element is impossible, but having two-sided identities is fine? I don't see that part.

Robin Arnez (Sep 05 2025 at 21:16):

I don't really have a formal argument but it feels like if you have 0 * a ≡ 0 and a * 0 ≡ 0 then you get this intersection point 0 * 0 ≡ 0 where the result stays the same definitionally if you change either the lhs or rhs so it feels like it should also stay the same definitionally if both the lhs and rhs change (which can't be the case). I'd be happy to be proven wrong though!

Aaron Liu (Sep 05 2025 at 22:33):

Alex Meiburg said:

Can one of you explain why having a two-sided absorbing element is impossible, but having two-sided identities is fine? I don't see that part.

I saw a proof once that this was impossible in Haskell but I can't find it anymore

Alex Meiburg (Sep 06 2025 at 00:43):

Hmm. If your type system was such that reducing a _multi_-variable match eagerly was a definitional equality, then you could do

def Nat.MyMul : Nat → Nat → Nat := fun x y =>
  match x, y with
  | succ a, succ b => 1 + a + b + a * b
  | _, _ => 0

So if you have existing addition and multiplication functions with slightly worse definitional equalities, then you could "wrap" them with this MyMul and call MyMul your multiplication; and then you would have the definitional equalities a * 0 = 0 * a = 0.

But Lean doesn't permit that: it won't reduce the match until both x and y have their heads identified. (Reasonable! But one could imagine a "more eager" match reduction.)
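
Concretely, with the MyMul above, the fully concrete cases reduce by rfl, while a variable first argument leaves the match stuck:

example : Nat.MyMul 3 0 = 0 := rfl
example : Nat.MyMul 0 3 = 0 := rfl
-- example (x : Nat) : Nat.MyMul x 0 = 0 := rfl  -- fails: the match is stuck on `x`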

Alex Meiburg (Sep 06 2025 at 00:44):

I recognize that match is ultimately just built on Nat.rec of course, so it's not obvious what it would look like at the basic level of the type theory to allow "eager multivariable matching", whatever that would mean

Alex Meiburg (Sep 06 2025 at 00:45):

(Okay, actually I guess I could imagine how that would work, mechanically. I think I could write down a type theory where this would go through)

Trebor Huang (Sep 15 2025 at 05:56):

Robin Arnez said:

I don't really have a formal argument but it feels like if you have 0 * a ≡ 0 and a * 0 ≡ 0 then you get this intersection point 0 * 0 ≡ 0 where the result stays the same definitionally if you change either the lhs or rhs so it feels like it should also stay the same definitionally if both the lhs and rhs change (which can't be the case). I'd be happy to be proven wrong though!

I think this is related to the Perpendicular Lines Lemma in untyped lambda calculus: if (M x N1) and (M N2 x) are both constant in x, then M is actually constant. This generalizes to more variables too. I don't know whether it could be adapted to Lean though.


Last updated: Dec 20 2025 at 21:32 UTC