linear_algebra.tensor_algebra.to_tensor_power → Mathlib.LinearAlgebra.TensorAlgebra.ToTensorPower

This file has been ported!
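
For context, the main result of the ported file (quoted in the diffs below) is an R-algebra equivalence between the tensor algebra and the direct sum of tensor powers. The following is only a usage sketch, written against the current mathlib4 spellings that appear in the diffs below:

import Mathlib.LinearAlgebra.TensorAlgebra.ToTensorPower

open scoped DirectSum TensorProduct

variable {R M : Type*} [CommSemiring R] [AddCommMonoid M] [Module R M]

-- The file's main result: the tensor algebra decomposes as a direct sum of
-- tensor powers, with `toDirectSum` and `ofDirectSum` as the two directions.
noncomputable example : TensorAlgebra R M ≃ₐ[R] ⨁ n, ⨂[R]^n M :=
  TensorAlgebra.equivDirectSum

-- The generator `ι R m` is sent to the degree-one component.
example (m : M) :
    TensorAlgebra.toDirectSum (TensorAlgebra.ι R m) =
      DirectSum.of (fun n => ⨂[R]^n M) _ (PiTensorProduct.tprod R fun _ : Fin 1 => m) :=
  TensorAlgebra.toDirectSum_ι m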

Changes since the initial port

The following sections list changes to this file in mathlib3 and mathlib4 that occurred after the initial port, with the most recent changes shown first.

Changes in mathlib3

(no changes)

(last sync)

Changes in mathlib3port

Diff
@@ -3,8 +3,8 @@ Copyright (c) 2021 Eric Wieser. All rights reserved.
 Released under Apache 2.0 license as described in the file LICENSE.
 Authors: Eric Wieser
 -/
-import Mathbin.LinearAlgebra.TensorAlgebra.Basic
-import Mathbin.LinearAlgebra.TensorPower
+import LinearAlgebra.TensorAlgebra.Basic
+import LinearAlgebra.TensorPower
 
 #align_import linear_algebra.tensor_algebra.to_tensor_power from "leanprover-community/mathlib"@"1a51edf13debfcbe223fa06b1cb353b9ed9751cc"
 
Diff
@@ -176,7 +176,7 @@ theorem TensorPower.list_prod_gradedMonoid_mk_single (n : ℕ) (x : Fin n → M)
     simp_rw [Fin.append_left_eq_cons, Function.comp]
     congr 1 with i
     congr 1
-    rw [Fin.castIso_trans, Fin.castIso_refl, OrderIso.refl_apply]
+    rw [Fin.cast_trans, Fin.castIso_refl, OrderIso.refl_apply]
 #align tensor_power.list_prod_graded_monoid_mk_single TensorPower.list_prod_gradedMonoid_mk_single
 -/
 
Diff
@@ -2,15 +2,12 @@
 Copyright (c) 2021 Eric Wieser. All rights reserved.
 Released under Apache 2.0 license as described in the file LICENSE.
 Authors: Eric Wieser
-
-! This file was ported from Lean 3 source module linear_algebra.tensor_algebra.to_tensor_power
-! leanprover-community/mathlib commit 1a51edf13debfcbe223fa06b1cb353b9ed9751cc
-! Please do not edit these lines, except to modify the commit id
-! if you have ported upstream changes.
 -/
 import Mathbin.LinearAlgebra.TensorAlgebra.Basic
 import Mathbin.LinearAlgebra.TensorPower
 
+#align_import linear_algebra.tensor_algebra.to_tensor_power from "leanprover-community/mathlib"@"1a51edf13debfcbe223fa06b1cb353b9ed9751cc"
+
 /-!
 # Tensor algebras as direct sums of tensor powers
 
Diff
@@ -4,7 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
 Authors: Eric Wieser
 
 ! This file was ported from Lean 3 source module linear_algebra.tensor_algebra.to_tensor_power
-! leanprover-community/mathlib commit d97a0c9f7a7efe6d76d652c5a6b7c9c634b70e0a
+! leanprover-community/mathlib commit 1a51edf13debfcbe223fa06b1cb353b9ed9751cc
 ! Please do not edit these lines, except to modify the commit id
 ! if you have ported upstream changes.
 -/
@@ -14,6 +14,9 @@ import Mathbin.LinearAlgebra.TensorPower
 /-!
 # Tensor algebras as direct sums of tensor powers
 
+> THIS FILE IS SYNCHRONIZED WITH MATHLIB4.
+> Any changes to this file require a corresponding PR to mathlib4.
+
 In this file we show that `tensor_algebra R M` is isomorphic to a direct sum of tensor powers, as
 `tensor_algebra.equiv_direct_sum`.
 -/
Diff
@@ -25,23 +25,30 @@ variable {R M : Type _} [CommSemiring R] [AddCommMonoid M] [Module R M]
 
 namespace TensorPower
 
+#print TensorPower.toTensorAlgebra /-
 /-- The canonical embedding from a tensor power to the tensor algebra -/
 def toTensorAlgebra {n} : (⨂[R]^n) M →ₗ[R] TensorAlgebra R M :=
   PiTensorProduct.lift (TensorAlgebra.tprod R M n)
 #align tensor_power.to_tensor_algebra TensorPower.toTensorAlgebra
+-/
 
+#print TensorPower.toTensorAlgebra_tprod /-
 @[simp]
 theorem toTensorAlgebra_tprod {n} (x : Fin n → M) :
     TensorPower.toTensorAlgebra (PiTensorProduct.tprod R x) = TensorAlgebra.tprod R M n x :=
   PiTensorProduct.lift.tprod _
 #align tensor_power.to_tensor_algebra_tprod TensorPower.toTensorAlgebra_tprod
+-/
 
+#print TensorPower.toTensorAlgebra_gOne /-
 @[simp]
 theorem toTensorAlgebra_gOne :
     (@GradedMonoid.GOne.one _ (fun n => (⨂[R]^n) M) _ _).toTensorAlgebra = 1 :=
   TensorPower.toTensorAlgebra_tprod _
 #align tensor_power.to_tensor_algebra_ghas_one TensorPower.toTensorAlgebra_gOne
+-/
 
+#print TensorPower.toTensorAlgebra_gMul /-
 @[simp]
 theorem toTensorAlgebra_gMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
     (@GradedMonoid.GMul.mul _ (fun n => (⨂[R]^n) M) _ _ _ _ a b).toTensorAlgebra =
@@ -62,7 +69,9 @@ theorem toTensorAlgebra_gMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
   rw [← List.map_ofFn _ (TensorAlgebra.ι R), ← List.map_ofFn _ (TensorAlgebra.ι R), ←
     List.map_ofFn _ (TensorAlgebra.ι R), ← List.map_append, List.ofFn_fin_append]
 #align tensor_power.to_tensor_algebra_ghas_mul TensorPower.toTensorAlgebra_gMul
+-/
 
+#print TensorPower.toTensorAlgebra_galgebra_toFun /-
 @[simp]
 theorem toTensorAlgebra_galgebra_toFun (r : R) :
     (@DirectSum.GAlgebra.toFun _ R (fun n => (⨂[R]^n) M) _ _ _ _ _ _ _ r).toTensorAlgebra =
@@ -71,63 +80,81 @@ theorem toTensorAlgebra_galgebra_toFun (r : R) :
   rw [TensorPower.galgebra_toFun_def, TensorPower.algebraMap₀_eq_smul_one, LinearMap.map_smul,
     TensorPower.toTensorAlgebra_gOne, Algebra.algebraMap_eq_smul_one]
 #align tensor_power.to_tensor_algebra_galgebra_to_fun TensorPower.toTensorAlgebra_galgebra_toFun
+-/
 
 end TensorPower
 
 namespace TensorAlgebra
 
+#print TensorAlgebra.ofDirectSum /-
 /-- The canonical map from a direct sum of tensor powers to the tensor algebra. -/
 def ofDirectSum : (⨁ n, (⨂[R]^n) M) →ₐ[R] TensorAlgebra R M :=
   DirectSum.toAlgebra _ _ (fun n => TensorPower.toTensorAlgebra) TensorPower.toTensorAlgebra_gOne
     (fun i j => TensorPower.toTensorAlgebra_gMul) TensorPower.toTensorAlgebra_galgebra_toFun
 #align tensor_algebra.of_direct_sum TensorAlgebra.ofDirectSum
+-/
 
+#print TensorAlgebra.ofDirectSum_of_tprod /-
 @[simp]
 theorem ofDirectSum_of_tprod {n} (x : Fin n → M) :
     ofDirectSum (DirectSum.of _ n (PiTensorProduct.tprod R x)) = tprod R M n x :=
   (DirectSum.toAddMonoid_of _ _ _).trans (TensorPower.toTensorAlgebra_tprod _)
 #align tensor_algebra.of_direct_sum_of_tprod TensorAlgebra.ofDirectSum_of_tprod
+-/
 
+#print TensorAlgebra.toDirectSum /-
 /-- The canonical map from the tensor algebra to a direct sum of tensor powers. -/
 def toDirectSum : TensorAlgebra R M →ₐ[R] ⨁ n, (⨂[R]^n) M :=
   TensorAlgebra.lift R <|
     DirectSum.lof R ℕ (fun n => (⨂[R]^n) M) _ ∘ₗ
       (LinearEquiv.symm <| PiTensorProduct.subsingletonEquiv (0 : Fin 1) : M ≃ₗ[R] _).toLinearMap
 #align tensor_algebra.to_direct_sum TensorAlgebra.toDirectSum
+-/
 
+#print TensorAlgebra.toDirectSum_ι /-
 @[simp]
 theorem toDirectSum_ι (x : M) :
     toDirectSum (ι R x) =
       DirectSum.of (fun n => (⨂[R]^n) M) _ (PiTensorProduct.tprod R fun _ : Fin 1 => x) :=
   TensorAlgebra.lift_ι_apply _ _
 #align tensor_algebra.to_direct_sum_ι TensorAlgebra.toDirectSum_ι
+-/
 
+#print TensorAlgebra.ofDirectSum_comp_toDirectSum /-
 theorem ofDirectSum_comp_toDirectSum :
     ofDirectSum.comp toDirectSum = AlgHom.id R (TensorAlgebra R M) :=
   by
   ext
   simp [DirectSum.lof_eq_of, tprod_apply]
 #align tensor_algebra.of_direct_sum_comp_to_direct_sum TensorAlgebra.ofDirectSum_comp_toDirectSum
+-/
 
+#print TensorAlgebra.ofDirectSum_toDirectSum /-
 @[simp]
 theorem ofDirectSum_toDirectSum (x : TensorAlgebra R M) : ofDirectSum x.toDirectSum = x :=
   AlgHom.congr_fun ofDirectSum_comp_toDirectSum x
 #align tensor_algebra.of_direct_sum_to_direct_sum TensorAlgebra.ofDirectSum_toDirectSum
+-/
 
+#print TensorAlgebra.mk_reindex_cast /-
 @[simp]
 theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
     GradedMonoid.mk m (PiTensorProduct.reindex R M (Equiv.cast <| congr_arg Fin h) x) =
       GradedMonoid.mk n x :=
   Eq.symm (PiTensorProduct.gradedMonoid_eq_of_reindex_cast h rfl)
 #align tensor_algebra.mk_reindex_cast TensorAlgebra.mk_reindex_cast
+-/
 
+#print TensorAlgebra.mk_reindex_fin_cast /-
 @[simp]
-theorem mk_reindex_fin_castIso {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
+theorem mk_reindex_fin_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
     GradedMonoid.mk m (PiTensorProduct.reindex R M (Fin.castIso h).toEquiv x) =
       GradedMonoid.mk n x :=
   by rw [Fin.castIso_to_equiv, mk_reindex_cast h]
-#align tensor_algebra.mk_reindex_fin_cast TensorAlgebra.mk_reindex_fin_castIso
+#align tensor_algebra.mk_reindex_fin_cast TensorAlgebra.mk_reindex_fin_cast
+-/
 
+#print TensorPower.list_prod_gradedMonoid_mk_single /-
 /-- The product of tensor products made of a single vector is the same as a single product of
 all the vectors. -/
 theorem TensorPower.list_prod_gradedMonoid_mk_single (n : ℕ) (x : Fin n → M) :
@@ -151,7 +178,9 @@ theorem TensorPower.list_prod_gradedMonoid_mk_single (n : ℕ) (x : Fin n → M)
     congr 1
     rw [Fin.castIso_trans, Fin.castIso_refl, OrderIso.refl_apply]
 #align tensor_power.list_prod_graded_monoid_mk_single TensorPower.list_prod_gradedMonoid_mk_single
+-/
 
+#print TensorAlgebra.toDirectSum_tensorPower_tprod /-
 theorem toDirectSum_tensorPower_tprod {n} (x : Fin n → M) :
     toDirectSum (tprod R M n x) = DirectSum.of _ n (PiTensorProduct.tprod R x) :=
   by
@@ -163,25 +192,32 @@ theorem toDirectSum_tensorPower_tprod {n} (x : Fin n → M) :
   rw [GradedMonoid.mk_list_dProd]
   rw [TensorPower.list_prod_gradedMonoid_mk_single]
 #align tensor_algebra.to_direct_sum_tensor_power_tprod TensorAlgebra.toDirectSum_tensorPower_tprod
+-/
 
+#print TensorAlgebra.toDirectSum_comp_ofDirectSum /-
 theorem toDirectSum_comp_ofDirectSum :
     toDirectSum.comp ofDirectSum = AlgHom.id R (⨁ n, (⨂[R]^n) M) :=
   by
   ext
   simp [DirectSum.lof_eq_of, -tprod_apply, to_direct_sum_tensor_power_tprod]
 #align tensor_algebra.to_direct_sum_comp_of_direct_sum TensorAlgebra.toDirectSum_comp_ofDirectSum
+-/
 
+#print TensorAlgebra.toDirectSum_ofDirectSum /-
 @[simp]
 theorem toDirectSum_ofDirectSum (x : ⨁ n, (⨂[R]^n) M) : (ofDirectSum x).toDirectSum = x :=
   AlgHom.congr_fun toDirectSum_comp_ofDirectSum x
 #align tensor_algebra.to_direct_sum_of_direct_sum TensorAlgebra.toDirectSum_ofDirectSum
+-/
 
+#print TensorAlgebra.equivDirectSum /-
 /-- The tensor algebra is isomorphic to a direct sum of tensor powers. -/
 @[simps]
 def equivDirectSum : TensorAlgebra R M ≃ₐ[R] ⨁ n, (⨂[R]^n) M :=
   AlgEquiv.ofAlgHom toDirectSum ofDirectSum toDirectSum_comp_ofDirectSum
     ofDirectSum_comp_toDirectSum
 #align tensor_algebra.equiv_direct_sum TensorAlgebra.equivDirectSum
+-/
 
 end TensorAlgebra
 
Diff
@@ -122,10 +122,11 @@ theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
 #align tensor_algebra.mk_reindex_cast TensorAlgebra.mk_reindex_cast
 
 @[simp]
-theorem mk_reindex_fin_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
-    GradedMonoid.mk m (PiTensorProduct.reindex R M (Fin.cast h).toEquiv x) = GradedMonoid.mk n x :=
-  by rw [Fin.cast_to_equiv, mk_reindex_cast h]
-#align tensor_algebra.mk_reindex_fin_cast TensorAlgebra.mk_reindex_fin_cast
+theorem mk_reindex_fin_castIso {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
+    GradedMonoid.mk m (PiTensorProduct.reindex R M (Fin.castIso h).toEquiv x) =
+      GradedMonoid.mk n x :=
+  by rw [Fin.castIso_to_equiv, mk_reindex_cast h]
+#align tensor_algebra.mk_reindex_fin_cast TensorAlgebra.mk_reindex_fin_castIso
 
 /-- The product of tensor products made of a single vector is the same as a single product of
 all the vectors. -/
@@ -148,7 +149,7 @@ theorem TensorPower.list_prod_gradedMonoid_mk_single (n : ℕ) (x : Fin n → M)
     simp_rw [Fin.append_left_eq_cons, Function.comp]
     congr 1 with i
     congr 1
-    rw [Fin.cast_trans, Fin.cast_refl, OrderIso.refl_apply]
+    rw [Fin.castIso_trans, Fin.castIso_refl, OrderIso.refl_apply]
 #align tensor_power.list_prod_graded_monoid_mk_single TensorPower.list_prod_gradedMonoid_mk_single
 
 theorem toDirectSum_tensorPower_tprod {n} (x : Fin n → M) :
Diff
@@ -52,7 +52,7 @@ theorem toTensorAlgebra_gMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
     LinearMap.compl₂_apply, ← LinearMap.comp_apply]
   refine' LinearMap.congr_fun (LinearMap.congr_fun _ a) b
   clear a b
-  ext (a b)
+  ext a b
   simp only [LinearMap.compr₂_apply, LinearMap.mul_apply', LinearMap.compl₂_apply,
     LinearMap.comp_apply, LinearMap.compMultilinearMap_apply, PiTensorProduct.lift.tprod,
     TensorPower.tprod_mul_tprod, TensorPower.toTensorAlgebra_tprod, TensorAlgebra.tprod_apply, ←
Diff
@@ -37,18 +37,18 @@ theorem toTensorAlgebra_tprod {n} (x : Fin n → M) :
 #align tensor_power.to_tensor_algebra_tprod TensorPower.toTensorAlgebra_tprod
 
 @[simp]
-theorem toTensorAlgebra_ghasOne :
+theorem toTensorAlgebra_gOne :
     (@GradedMonoid.GOne.one _ (fun n => (⨂[R]^n) M) _ _).toTensorAlgebra = 1 :=
   TensorPower.toTensorAlgebra_tprod _
-#align tensor_power.to_tensor_algebra_ghas_one TensorPower.toTensorAlgebra_ghasOne
+#align tensor_power.to_tensor_algebra_ghas_one TensorPower.toTensorAlgebra_gOne
 
 @[simp]
-theorem toTensorAlgebra_ghasMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
+theorem toTensorAlgebra_gMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
     (@GradedMonoid.GMul.mul _ (fun n => (⨂[R]^n) M) _ _ _ _ a b).toTensorAlgebra =
       a.toTensorAlgebra * b.toTensorAlgebra :=
   by
   -- change `a` and `b` to `tprod R a` and `tprod R b`
-  rw [TensorPower.ghasMul_eq_coe_linearMap, ← LinearMap.compr₂_apply, ← @LinearMap.mul_apply' R, ←
+  rw [TensorPower.gMul_eq_coe_linearMap, ← LinearMap.compr₂_apply, ← @LinearMap.mul_apply' R, ←
     LinearMap.compl₂_apply, ← LinearMap.comp_apply]
   refine' LinearMap.congr_fun (LinearMap.congr_fun _ a) b
   clear a b
@@ -61,7 +61,7 @@ theorem toTensorAlgebra_ghasMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
   congr
   rw [← List.map_ofFn _ (TensorAlgebra.ι R), ← List.map_ofFn _ (TensorAlgebra.ι R), ←
     List.map_ofFn _ (TensorAlgebra.ι R), ← List.map_append, List.ofFn_fin_append]
-#align tensor_power.to_tensor_algebra_ghas_mul TensorPower.toTensorAlgebra_ghasMul
+#align tensor_power.to_tensor_algebra_ghas_mul TensorPower.toTensorAlgebra_gMul
 
 @[simp]
 theorem toTensorAlgebra_galgebra_toFun (r : R) :
@@ -69,7 +69,7 @@ theorem toTensorAlgebra_galgebra_toFun (r : R) :
       algebraMap _ _ r :=
   by
   rw [TensorPower.galgebra_toFun_def, TensorPower.algebraMap₀_eq_smul_one, LinearMap.map_smul,
-    TensorPower.toTensorAlgebra_ghasOne, Algebra.algebraMap_eq_smul_one]
+    TensorPower.toTensorAlgebra_gOne, Algebra.algebraMap_eq_smul_one]
 #align tensor_power.to_tensor_algebra_galgebra_to_fun TensorPower.toTensorAlgebra_galgebra_toFun
 
 end TensorPower
@@ -78,8 +78,8 @@ namespace TensorAlgebra
 
 /-- The canonical map from a direct sum of tensor powers to the tensor algebra. -/
 def ofDirectSum : (⨁ n, (⨂[R]^n) M) →ₐ[R] TensorAlgebra R M :=
-  DirectSum.toAlgebra _ _ (fun n => TensorPower.toTensorAlgebra) TensorPower.toTensorAlgebra_ghasOne
-    (fun i j => TensorPower.toTensorAlgebra_ghasMul) TensorPower.toTensorAlgebra_galgebra_toFun
+  DirectSum.toAlgebra _ _ (fun n => TensorPower.toTensorAlgebra) TensorPower.toTensorAlgebra_gOne
+    (fun i j => TensorPower.toTensorAlgebra_gMul) TensorPower.toTensorAlgebra_galgebra_toFun
 #align tensor_algebra.of_direct_sum TensorAlgebra.ofDirectSum
 
 @[simp]
Diff
@@ -19,7 +19,7 @@ In this file we show that `tensor_algebra R M` is isomorphic to a direct sum of
 -/
 
 
-open DirectSum TensorProduct
+open scoped DirectSum TensorProduct
 
 variable {R M : Type _} [CommSemiring R] [AddCommMonoid M] [Module R M]
 
Diff
@@ -65,7 +65,7 @@ theorem toTensorAlgebra_ghasMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
 
 @[simp]
 theorem toTensorAlgebra_galgebra_toFun (r : R) :
-    (@DirectSum.Galgebra.toFun _ R (fun n => (⨂[R]^n) M) _ _ _ _ _ _ _ r).toTensorAlgebra =
+    (@DirectSum.GAlgebra.toFun _ R (fun n => (⨂[R]^n) M) _ _ _ _ _ _ _ r).toTensorAlgebra =
       algebraMap _ _ r :=
   by
   rw [TensorPower.galgebra_toFun_def, TensorPower.algebraMap₀_eq_smul_one, LinearMap.map_smul,

Changes in mathlib4

chore: remove a few miscellaneous now-resolved porting notes (#12127)
Diff
@@ -58,8 +58,7 @@ theorem toTensorAlgebra_gMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
     TensorPower.tprod_mul_tprod, TensorPower.toTensorAlgebra_tprod, TensorAlgebra.tprod_apply, ←
     gMul_eq_coe_linearMap]
   refine' Eq.trans _ List.prod_append
-  -- Porting note: was `congr`
-  apply congr_arg
+  congr
   -- Porting note: `erw` for `Function.comp`
   erw [← List.map_ofFn _ (TensorAlgebra.ι R), ← List.map_ofFn _ (TensorAlgebra.ι R), ←
     List.map_ofFn _ (TensorAlgebra.ι R), ← List.map_append, List.ofFn_fin_append]
fix(LinearAlgebra/TensorPower): correct notation precedence (#11062)

This removes some ugly parens that were introduced during porting
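
As a small illustration (a sketch, not part of the commit): with the corrected precedence, ⨂[R]^n M now parses the way (⨂[R]^n) M used to, so the parentheses on the "before" side of the diff are redundant. The example mirrors the updated definition from this file:

import Mathlib.LinearAlgebra.TensorAlgebra.ToTensorPower

open scoped TensorProduct

variable {R M : Type*} [CommSemiring R] [AddCommMonoid M] [Module R M]

-- Post-fix spelling: no parentheses needed around the tensor-power notation.
noncomputable example {n : ℕ} : ⨂[R]^n M →ₗ[R] TensorAlgebra R M :=
  TensorPower.toTensorAlgebra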

Diff
@@ -24,7 +24,7 @@ variable {R M : Type*} [CommSemiring R] [AddCommMonoid M] [Module R M]
 namespace TensorPower
 
 /-- The canonical embedding from a tensor power to the tensor algebra -/
-def toTensorAlgebra {n} : (⨂[R]^n) M →ₗ[R] TensorAlgebra R M :=
+def toTensorAlgebra {n} : ⨂[R]^n M →ₗ[R] TensorAlgebra R M :=
   PiTensorProduct.lift (TensorAlgebra.tprod R M n)
 #align tensor_power.to_tensor_algebra TensorPower.toTensorAlgebra
 
@@ -36,13 +36,13 @@ theorem toTensorAlgebra_tprod {n} (x : Fin n → M) :
 
 @[simp]
 theorem toTensorAlgebra_gOne :
-    TensorPower.toTensorAlgebra (@GradedMonoid.GOne.one _ (fun n => (⨂[R]^n) M) _ _) = 1 :=
+    TensorPower.toTensorAlgebra (@GradedMonoid.GOne.one _ (fun n => ⨂[R]^n M) _ _) = 1 :=
   TensorPower.toTensorAlgebra_tprod _
 #align tensor_power.to_tensor_algebra_ghas_one TensorPower.toTensorAlgebra_gOne
 
 @[simp]
 theorem toTensorAlgebra_gMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
-    TensorPower.toTensorAlgebra (@GradedMonoid.GMul.mul _ (fun n => (⨂[R]^n) M) _ _ _ _ a b) =
+    TensorPower.toTensorAlgebra (@GradedMonoid.GMul.mul _ (fun n => ⨂[R]^n M) _ _ _ _ a b) =
       TensorPower.toTensorAlgebra a * TensorPower.toTensorAlgebra b := by
   -- change `a` and `b` to `tprod R a` and `tprod R b`
   rw [TensorPower.gMul_eq_coe_linearMap, ← LinearMap.compr₂_apply, ← @LinearMap.mul_apply' R, ←
@@ -67,7 +67,7 @@ theorem toTensorAlgebra_gMul {i j} (a : (⨂[R]^i) M) (b : (⨂[R]^j) M) :
 
 @[simp]
 theorem toTensorAlgebra_galgebra_toFun (r : R) :
-    TensorPower.toTensorAlgebra (DirectSum.GAlgebra.toFun (R := R) (A := fun n => (⨂[R]^n) M) r) =
+    TensorPower.toTensorAlgebra (DirectSum.GAlgebra.toFun (R := R) (A := fun n => ⨂[R]^n M) r) =
       algebraMap _ _ r := by
   rw [TensorPower.galgebra_toFun_def, TensorPower.algebraMap₀_eq_smul_one, LinearMap.map_smul,
     TensorPower.toTensorAlgebra_gOne, Algebra.algebraMap_eq_smul_one]
@@ -78,7 +78,7 @@ end TensorPower
 namespace TensorAlgebra
 
 /-- The canonical map from a direct sum of tensor powers to the tensor algebra. -/
-def ofDirectSum : (⨁ n, (⨂[R]^n) M) →ₐ[R] TensorAlgebra R M :=
+def ofDirectSum : (⨁ n, ⨂[R]^n M) →ₐ[R] TensorAlgebra R M :=
   DirectSum.toAlgebra _ _ (fun _ => TensorPower.toTensorAlgebra) TensorPower.toTensorAlgebra_gOne
     (fun {_ _} => TensorPower.toTensorAlgebra_gMul)
 #align tensor_algebra.of_direct_sum TensorAlgebra.ofDirectSum
@@ -92,16 +92,16 @@ theorem ofDirectSum_of_tprod {n} (x : Fin n → M) :
 #align tensor_algebra.of_direct_sum_of_tprod TensorAlgebra.ofDirectSum_of_tprod
 
 /-- The canonical map from the tensor algebra to a direct sum of tensor powers. -/
-def toDirectSum : TensorAlgebra R M →ₐ[R] ⨁ n, (⨂[R]^n) M :=
+def toDirectSum : TensorAlgebra R M →ₐ[R] ⨁ n, ⨂[R]^n M :=
   TensorAlgebra.lift R <|
-    DirectSum.lof R ℕ (fun n => (⨂[R]^n) M) _ ∘ₗ
+    DirectSum.lof R ℕ (fun n => ⨂[R]^n M) _ ∘ₗ
       (LinearEquiv.symm <| PiTensorProduct.subsingletonEquiv (0 : Fin 1) : M ≃ₗ[R] _).toLinearMap
 #align tensor_algebra.to_direct_sum TensorAlgebra.toDirectSum
 
 @[simp]
 theorem toDirectSum_ι (x : M) :
     toDirectSum (ι R x) =
-      DirectSum.of (fun n => (⨂[R]^n) M) _ (PiTensorProduct.tprod R fun _ : Fin 1 => x) :=
+      DirectSum.of (fun n => ⨂[R]^n M) _ (PiTensorProduct.tprod R fun _ : Fin 1 => x) :=
   TensorAlgebra.lift_ι_apply _ _
 #align tensor_algebra.to_direct_sum_ι TensorAlgebra.toDirectSum_ι
 
@@ -118,7 +118,7 @@ theorem ofDirectSum_toDirectSum (x : TensorAlgebra R M) :
 #align tensor_algebra.of_direct_sum_to_direct_sum TensorAlgebra.ofDirectSum_toDirectSum
 
 @[simp, nolint simpNF] -- see std4#365 for the simpNF issue
-theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
+theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : ⨂[R]^n M) :
     GradedMonoid.mk (A := fun i => (⨂[R]^i) M) m
     (PiTensorProduct.reindex R (fun _ ↦ M) (Equiv.cast <| congr_arg Fin h) x) =
     GradedMonoid.mk n x :=
@@ -126,7 +126,7 @@ theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
 #align tensor_algebra.mk_reindex_cast TensorAlgebra.mk_reindex_cast
 
 @[simp]
-theorem mk_reindex_fin_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
+theorem mk_reindex_fin_cast {n m : ℕ} (h : n = m) (x : ⨂[R]^n M) :
     GradedMonoid.mk (A := fun i => (⨂[R]^i) M) m
     (PiTensorProduct.reindex R (fun _ ↦ M) (Fin.castIso h).toEquiv x) = GradedMonoid.mk n x := by
   rw [Fin.castIso_to_equiv, mk_reindex_cast h]
@@ -137,7 +137,7 @@ all the vectors. -/
 theorem _root_.TensorPower.list_prod_gradedMonoid_mk_single (n : ℕ) (x : Fin n → M) :
     ((List.finRange n).map fun a =>
           (GradedMonoid.mk _ (PiTensorProduct.tprod R fun _ : Fin 1 => x a) :
-            GradedMonoid fun n => (⨂[R]^n) M)).prod =
+            GradedMonoid fun n => ⨂[R]^n M)).prod =
       GradedMonoid.mk n (PiTensorProduct.tprod R x) := by
   refine' Fin.consInduction _ _ x <;> clear x
   · rw [List.finRange_zero, List.map_nil, List.prod_nil]
@@ -164,20 +164,20 @@ theorem toDirectSum_tensorPower_tprod {n} (x : Fin n → M) :
 #align tensor_algebra.to_direct_sum_tensor_power_tprod TensorAlgebra.toDirectSum_tensorPower_tprod
 
 theorem toDirectSum_comp_ofDirectSum :
-    toDirectSum.comp ofDirectSum = AlgHom.id R (⨁ n, (⨂[R]^n) M) := by
+    toDirectSum.comp ofDirectSum = AlgHom.id R (⨁ n, ⨂[R]^n M) := by
   ext
   simp [DirectSum.lof_eq_of, -tprod_apply, toDirectSum_tensorPower_tprod]
 #align tensor_algebra.to_direct_sum_comp_of_direct_sum TensorAlgebra.toDirectSum_comp_ofDirectSum
 
 @[simp]
-theorem toDirectSum_ofDirectSum (x : ⨁ n, (⨂[R]^n) M) :
+theorem toDirectSum_ofDirectSum (x : ⨁ n, ⨂[R]^n M) :
     TensorAlgebra.toDirectSum (ofDirectSum x) = x :=
   AlgHom.congr_fun toDirectSum_comp_ofDirectSum x
 #align tensor_algebra.to_direct_sum_of_direct_sum TensorAlgebra.toDirectSum_ofDirectSum
 
 /-- The tensor algebra is isomorphic to a direct sum of tensor powers. -/
 @[simps!]
-def equivDirectSum : TensorAlgebra R M ≃ₐ[R] ⨁ n, (⨂[R]^n) M :=
+def equivDirectSum : TensorAlgebra R M ≃ₐ[R] ⨁ n, ⨂[R]^n M :=
   AlgEquiv.ofAlgHom toDirectSum ofDirectSum toDirectSum_comp_ofDirectSum
     ofDirectSum_comp_toDirectSum
 #align tensor_algebra.equiv_direct_sum TensorAlgebra.equivDirectSum
feat(LinearAlgebra/PiTensorProduct): make reindex dependently typed (#9445)

It used to be (⨂[R] _ : ι, M) ≃ₗ[R] ⨂[R] _ : ι₂, M; now M can vary according to the indexing set.

Co-authored-by: Eric Wieser <wieser.eric@gmail.com>
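
A sketch of how the new signature is consumed (an illustration, not a lemma from the file): the family of modules is now a function of the index, so the constant family has to be spelled fun _ ↦ M, as in the updated call sites in the diff below.

import Mathlib.LinearAlgebra.TensorAlgebra.ToTensorPower  -- pulls in `PiTensorProduct.reindex`

open scoped TensorProduct

variable {R M : Type*} [CommSemiring R] [AddCommMonoid M] [Module R M]

-- Reindexing a constant family along the cast `Fin n ≃ Fin m`, matching the
-- way this file uses `reindex` in `mk_reindex_cast` and `mk_reindex_fin_cast`.
noncomputable example {n m : ℕ} (h : n = m) :
    (⨂[R] _ : Fin n, M) ≃ₗ[R] ⨂[R] _ : Fin m, M :=
  PiTensorProduct.reindex R (fun _ ↦ M) (Equiv.cast <| congr_arg Fin h)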

Diff
@@ -120,7 +120,7 @@ theorem ofDirectSum_toDirectSum (x : TensorAlgebra R M) :
 @[simp, nolint simpNF] -- see std4#365 for the simpNF issue
 theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
     GradedMonoid.mk (A := fun i => (⨂[R]^i) M) m
-    (PiTensorProduct.reindex R M (Equiv.cast <| congr_arg Fin h) x) =
+    (PiTensorProduct.reindex R (fun _ ↦ M) (Equiv.cast <| congr_arg Fin h) x) =
     GradedMonoid.mk n x :=
   Eq.symm (PiTensorProduct.gradedMonoid_eq_of_reindex_cast h rfl)
 #align tensor_algebra.mk_reindex_cast TensorAlgebra.mk_reindex_cast
@@ -128,7 +128,7 @@ theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
 @[simp]
 theorem mk_reindex_fin_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
     GradedMonoid.mk (A := fun i => (⨂[R]^i) M) m
-    (PiTensorProduct.reindex R M (Fin.castIso h).toEquiv x) = GradedMonoid.mk n x := by
+    (PiTensorProduct.reindex R (fun _ ↦ M) (Fin.castIso h).toEquiv x) = GradedMonoid.mk n x := by
   rw [Fin.castIso_to_equiv, mk_reindex_cast h]
 #align tensor_algebra.mk_reindex_fin_cast TensorAlgebra.mk_reindex_fin_cast
 
perf(FunLike.Basic): beta reduce CoeFun.coe (#7905)

This eliminates (fun a ↦ β) α in the type when applying a FunLike.

Co-authored-by: Matthew Ballard <matt@mrb.email> Co-authored-by: Eric Wieser <wieser.eric@gmail.com>

Diff
@@ -117,7 +117,7 @@ theorem ofDirectSum_toDirectSum (x : TensorAlgebra R M) :
   AlgHom.congr_fun ofDirectSum_comp_toDirectSum x
 #align tensor_algebra.of_direct_sum_to_direct_sum TensorAlgebra.ofDirectSum_toDirectSum
 
-@[simp]
+@[simp, nolint simpNF] -- see std4#365 for the simpNF issue
 theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
     GradedMonoid.mk (A := fun i => (⨂[R]^i) M) m
     (PiTensorProduct.reindex R M (Equiv.cast <| congr_arg Fin h) x) =
@@ -128,8 +128,8 @@ theorem mk_reindex_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
 @[simp]
 theorem mk_reindex_fin_cast {n m : ℕ} (h : n = m) (x : (⨂[R]^n) M) :
     GradedMonoid.mk (A := fun i => (⨂[R]^i) M) m
-    (PiTensorProduct.reindex R M (Fin.castIso h).toEquiv x) = GradedMonoid.mk n x :=
-  by rw [Fin.castIso_to_equiv, mk_reindex_cast h]
+    (PiTensorProduct.reindex R M (Fin.castIso h).toEquiv x) = GradedMonoid.mk n x := by
+  rw [Fin.castIso_to_equiv, mk_reindex_cast h]
 #align tensor_algebra.mk_reindex_fin_cast TensorAlgebra.mk_reindex_fin_cast
 
 /-- The product of tensor products made of a single vector is the same as a single product of
chore(Algebra/DirectSum/Algebra): remove a redundant assumption (#7585)

commutes is implied by hone and linearity.

This matches the approach taken by AlgHom.ofLinearMap.
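
For context (an illustrative restatement, not part of the commit): the argument dropped from the DirectSum.toAlgebra call in this file is exactly TensorPower.toTensorAlgebra_galgebra_toFun, which survives as a standalone simp lemma; it follows from toFun r = r • 1 in degree zero together with linearity of toTensorAlgebra and toTensorAlgebra_gOne.

import Mathlib.LinearAlgebra.TensorAlgebra.ToTensorPower

open scoped DirectSum TensorProduct

variable {R M : Type*} [CommSemiring R] [AddCommMonoid M] [Module R M]

-- The no-longer-required hypothesis, restated as it appears in the file.
example (r : R) :
    TensorPower.toTensorAlgebra
        (DirectSum.GAlgebra.toFun (R := R) (A := fun n => ⨂[R]^n M) r) =
      algebraMap R (TensorAlgebra R M) r :=
  TensorPower.toTensorAlgebra_galgebra_toFun r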

Diff
@@ -80,7 +80,7 @@ namespace TensorAlgebra
 /-- The canonical map from a direct sum of tensor powers to the tensor algebra. -/
 def ofDirectSum : (⨁ n, (⨂[R]^n) M) →ₐ[R] TensorAlgebra R M :=
   DirectSum.toAlgebra _ _ (fun _ => TensorPower.toTensorAlgebra) TensorPower.toTensorAlgebra_gOne
-    (fun {_ _} => TensorPower.toTensorAlgebra_gMul) TensorPower.toTensorAlgebra_galgebra_toFun
+    (fun {_ _} => TensorPower.toTensorAlgebra_gMul)
 #align tensor_algebra.of_direct_sum TensorAlgebra.ofDirectSum
 
 @[simp]
feat: use suppress_compilation in tensor products (#7504)

More principled version of #7281.
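
A brief note on what the command does (summarised here, not quoted from the commit): suppress_compilation makes subsequent definitions behave as noncomputable def, so the compiler skips code generation for the tensor-product constructions, which are not meant to be executed. A minimal sketch; the toy definition myDouble is purely illustrative:

import Mathlib.LinearAlgebra.TensorAlgebra.ToTensorPower  -- brings `suppress_compilation` in transitively

suppress_compilation

-- After the command above, a plain `def` in the rest of the file is elaborated
-- as `noncomputable def`, so no executable code is generated for it.
def myDouble (n : ℕ) : ℕ := n + n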

Diff
@@ -15,6 +15,7 @@ In this file we show that `TensorAlgebra R M` is isomorphic to a direct sum of t
 `TensorAlgebra.equivDirectSum`.
 -/
 
+suppress_compilation
 
 open scoped DirectSum TensorProduct
 
chore: remove unused simps (#6632)

Co-authored-by: Eric Wieser <wieser.eric@gmail.com>

Diff
@@ -156,7 +156,6 @@ theorem toDirectSum_tensorPower_tprod {n} (x : Fin n → M) :
     toDirectSum (tprod R M n x) = DirectSum.of _ n (PiTensorProduct.tprod R x) := by
   rw [tprod_apply, AlgHom.map_list_prod, List.map_ofFn]
   simp_rw [Function.comp, toDirectSum_ι]
-  dsimp only
   rw [DirectSum.list_prod_ofFn_of_eq_dProd]
   apply DirectSum.of_eq_of_gradedMonoid_eq
   rw [GradedMonoid.mk_list_dProd]
chore: banish Type _ and Sort _ (#6499)

We remove all possible occurrences of Type _ and Sort _ in favor of Type* and Sort*.

This has nice performance benefits.
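
A note on the notation (summarised, not from the commit): Type* is Mathlib syntax that introduces a fresh, automatically named universe variable for each occurrence, whereas Type _ leaves a universe metavariable for the elaborator to solve, which is slower. A sketch mirroring this file's variable line:

import Mathlib.LinearAlgebra.TensorAlgebra.Basic  -- one of this file's imports; also provides `Type*`

-- `Type*` introduces a fresh, automatically named universe variable for each
-- occurrence, avoiding the universe metavariables that `Type _` would create.
variable {R M : Type*} [CommSemiring R] [AddCommMonoid M] [Module R M]

-- A throwaway check that the declarations above elaborate as expected.
example (x y : TensorAlgebra R M) : x + y = y + x := add_comm x y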

Diff
@@ -18,7 +18,7 @@ In this file we show that `TensorAlgebra R M` is isomorphic to a direct sum of t
 
 open scoped DirectSum TensorProduct
 
-variable {R M : Type _} [CommSemiring R] [AddCommMonoid M] [Module R M]
+variable {R M : Type*} [CommSemiring R] [AddCommMonoid M] [Module R M]
 
 namespace TensorPower
 
chore: script to replace headers with #align_import statements (#5979)

Open in Gitpod

Co-authored-by: Eric Wieser <wieser.eric@gmail.com> Co-authored-by: Scott Morrison <scott.morrison@gmail.com>

Diff
@@ -2,15 +2,12 @@
 Copyright (c) 2021 Eric Wieser. All rights reserved.
 Released under Apache 2.0 license as described in the file LICENSE.
 Authors: Eric Wieser
-
-! This file was ported from Lean 3 source module linear_algebra.tensor_algebra.to_tensor_power
-! leanprover-community/mathlib commit d97a0c9f7a7efe6d76d652c5a6b7c9c634b70e0a
-! Please do not edit these lines, except to modify the commit id
-! if you have ported upstream changes.
 -/
 import Mathlib.LinearAlgebra.TensorAlgebra.Basic
 import Mathlib.LinearAlgebra.TensorPower
 
+#align_import linear_algebra.tensor_algebra.to_tensor_power from "leanprover-community/mathlib"@"d97a0c9f7a7efe6d76d652c5a6b7c9c634b70e0a"
+
 /-!
 # Tensor algebras as direct sums of tensor powers
 
feat: port LinearAlgebra.TensorAlgebra.ToTensorPower (#4876)

Co-authored-by: Kevin Buzzard <k.buzzard@imperial.ac.uk> Co-authored-by: Johan Commelin <johan@commelin.net>

Dependencies 8 + 474

475 files ported (98.3%)
197198 lines ported (98.4%)