Zulip Chat Archive
Stream: Is there code for X?
Topic: Array.erase_data
Scott Morrison (Nov 24 2023 at 00:24):
Would anyone be interested in proving
namespace Array

@[simp] theorem erase_data [BEq α] {l : Array α} {a : α} : (l.erase a).data = l.data.erase a := by
  sorry
?
Unfortunately at the moment we have almost no facts about Array.erase (or the functions Array.feraseIdx and Array.indexOf? that it is built out of).
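(For context, Array.erase in Std is assembled from those two functions roughly as follows; this is a from-memory sketch, not the verbatim Std source, and the name eraseSketch is just for illustration:)

-- find the index of the first element matching `a`, if any, and remove that index
def eraseSketch {α : Type} [BEq α] (as : Array α) (a : α) : Array α :=
  match as.indexOf? a with
  | none   => as
  | some i => as.feraseIdx i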
Pol'tta / Miyahara Kō (Nov 26 2023 at 09:15):
import Mathlib

universe u v

namespace Option

variable {α : Type u} {β : Type v}

theorem pmap_congr {p q : α → Prop} {f : ∀ a, p a → β} {g : ∀ a, q a → β} (o : Option α) {H₁ H₂}
    (h : ∀ a ∈ o, ∀ (h₁ h₂), f a h₁ = g a h₂) : pmap f o H₁ = pmap g o H₂ := by
  cases o with
  | none => rfl
  | some a => simp [h a rfl (H₁ a rfl) (H₂ a rfl)]

end Option
namespace List

variable {α : Type u}

theorem findIdx?_nil (p : α → Bool) :
    findIdx? p [] = none :=
  rfl

theorem findIdx?_start_succ (p : α → Bool) (l : List α) (n : ℕ) :
    findIdx? p l (n + 1) = Option.map (· + 1) (findIdx? p l n) := by
  induction l generalizing n with
  | nil => simp [findIdx?]
  | cons a l hl =>
    simp [findIdx?, hl (n + 1)]
    split_ifs <;> rfl

theorem findIdx?_cons (p : α → Bool) (a : α) (l : List α) :
    findIdx? p (a :: l) = bif p a then some 0 else Option.map (· + 1) (List.findIdx? p l) := by
  simp [findIdx?, findIdx?_start_succ, Bool.cond_eq_ite]

@[simp]
theorem indexOf?_nil [BEq α] (a : α) :
    indexOf? a [] = none :=
  findIdx?_nil (a == ·)

theorem indexOf?_cons [DecidableEq α] (a b : α) (l : List α) :
    indexOf? a (b :: l) = if a = b then some 0 else Option.map (· + 1) (indexOf? a l) := by
  simp [indexOf?, findIdx?_cons, Bool.cond_eq_ite]

theorem findIdx?_eq_findIdx (p : α → Bool) (l : List α) :
    findIdx? p l = Option.guard (· < length l) (findIdx p l) := by
  induction l with
  | nil => simp [findIdx?_nil, Option.guard]
  | cons a l hl =>
    simp [findIdx?_cons, findIdx_cons, Bool.cond_eq_ite, hl]
    symm
    split_ifs with ha
    case pos => simp
    case neg =>
      simp [Option.guard, Nat.succ_eq_add_one]
      split_ifs <;> rfl

theorem indexOf?_eq_indexOf [BEq α] (a : α) (l : List α) :
    indexOf? a l = Option.guard (· < length l) (indexOf a l) :=
  findIdx?_eq_findIdx (a == ·) l

theorem findIdx?_lt_length {p : α → Bool} {l : List α} {n : ℕ} (hn : findIdx? p l = some n) :
    n < length l := by
  rw [findIdx?_eq_findIdx, Option.guard_eq_some] at hn
  rcases hn with ⟨rfl, hf⟩
  exact hf

theorem indexOf?_lt_length [BEq α] {a : α} {l : List α} {n : ℕ} (hn : indexOf? a l = some n) :
    n < length l :=
  findIdx?_lt_length hn

@[simp]
theorem eraseIdx_indexOf [DecidableEq α] (l : List α) (a : α) :
    eraseIdx l (indexOf a l) = List.erase l a := by
  induction l with
  | nil => simp
  | cons b l hl =>
    simp [List.erase_cons, List.indexOf_cons, eq_comm (a := b) (b := a)]
    split_ifs with hb
    case pos => simp
    case neg => simp [hl]

theorem take_append_drop_succ (l : List α) (n : ℕ) :
    take n l ++ drop (n + 1) l = eraseIdx l n := by
  induction l generalizing n with
  | nil => simp
  | cons a l hl =>
    cases n using Nat.casesAuxOn with
    | zero => simp
    | succ n => simp [hl n]

end List
namespace Array

variable {α : Type u}

theorem indexOfAux_eq_indexOfAux_data [DecidableEq α] (a : α) (as : Array α) (n : ℕ) :
    indexOfAux as a n =
      Option.pmap (fun m hm => ⟨m + n, Nat.add_lt_of_lt_sub (List.length_drop n as.data ▸ hm)⟩)
        (List.indexOf? a (List.drop n as.data))
        (fun m hm => List.indexOf?_lt_length hm) := by
  obtain ⟨m, hm₁⟩ : ∃ m, size as - n = m := ⟨_, rfl⟩
  induction m using Nat.strongRec generalizing n with
  | ind m hm₂ =>
    unfold indexOfAux
    split_ifs with hn
    case pos =>
      simp [getElem_eq_data_get, - Option.pmap]
      have hdas : List.drop n as.data = List.get as.data ⟨n, hn⟩ :: List.drop (n + 1) as.data := by
        rw [← List.tail_drop]
        apply List.eq_cons_of_mem_head?
        simp [← List.get?_zero, List.get?_drop, List.get?_eq_get hn]
      split_ifs with ha
      case pos =>
        simp [hdas, List.indexOf?_cons, ha, - Option.pmap]
      case neg =>
        specialize hm₂ _ (by simp [← hm₁, Nat.sub_succ_lt_self _ _ hn]) (n + 1) rfl
        simp [hdas, List.indexOf?_cons, ha, hm₂, Option.pmap_map, - Option.pmap, eq_comm (a := a)]
        apply Option.pmap_congr
        intro b _ hb₂ hb₃
        simp [Nat.add_assoc, Nat.add_comm n 1]
    case neg =>
      simp at hn
      simp [List.drop_eq_nil_of_le hn, - Option.pmap]

theorem indexOf?_eq_indexOf?_data [DecidableEq α] (a : α) (as : Array α) :
    indexOf? as a =
      Option.pmap (fun n hn => ⟨n, hn⟩) (List.indexOf? a as.data)
        (fun _ => List.indexOf?_lt_length) := by
  simp [indexOf?, indexOfAux_eq_indexOfAux_data, - Option.pmap]

@[simp]
theorem eraseIdxAux_data (as : Array α) (n : ℕ) :
    (eraseIdxAux n as).data = List.eraseIdx as.data (min n (size as) - 1) := by
  obtain ⟨m, hm₁⟩ : ∃ m, size as - n = m := ⟨_, rfl⟩
  induction m using Nat.strongRec generalizing as n with
  | ind m hm₂ =>
    unfold eraseIdxAux
    split_ifs with hn
    case pos =>
      specialize hm₂ _
        (by simp [← hm₁, Nat.sub_add_lt_sub (Nat.add_one_le_iff.mpr hn) Nat.zero_lt_one])
        (swap as ⟨n, hn⟩ ⟨n - 1, by exact Nat.lt_of_le_of_lt (Nat.pred_le n) hn⟩) (n + 1) rfl
      simp [hm₂, data_swap, getElem_eq_data_get, ← List.take_append_drop_succ,
        min_eq_left (Nat.le_of_lt hn), min_eq_left (Nat.lt_iff_add_one_le.mp hn)]
      apply List.ext_get
      · simp [show List.length as.data = size as from rfl, min_eq_left (Nat.le_of_lt hn),
          min_eq_left (Nat.le_of_lt (Nat.lt_of_le_of_lt (Nat.sub_le n 1) hn))]
        cases n using Nat.casesAuxOn with
        | zero => simp
        | succ n =>
          simp [Nat.sub_add_eq (size as) (n + 1) 1, ← Nat.add_sub_assoc,
            Nat.lt_iff_add_one_le.mp (Nat.zero_lt_sub_of_lt hn)]
      · intro k hk₁ hk₂
        rcases Nat.lt_trichotomy k (n - 1) with (hk₃ | rfl | hk₃)
        · conv_lhs =>
            apply List.get_append_left
            tactic' =>
              simp [Nat.lt_of_lt_of_le hk₃ (Nat.sub_le n 1),
                Nat.lt_trans (Nat.lt_of_lt_of_le hk₃ (Nat.sub_le n 1)) hn,
                show List.length as.data = size as from rfl]
          conv_rhs =>
            apply List.get_append_left
            tactic' =>
              simp [hk₃, Nat.lt_trans (Nat.lt_of_lt_of_le hk₃ (Nat.sub_le n 1)) hn,
                show List.length as.data = size as from rfl]
          simp [List.get_take', List.get_set, Nat.ne_of_gt hk₃,
            Nat.ne_of_gt (Nat.lt_of_lt_of_le hk₃ (Nat.sub_le n 1))]
        · cases n using Nat.casesAuxOn with
          | zero =>
            conv_lhs =>
              apply List.get_append_right'
              tactic' => simp
            conv_rhs =>
              apply List.get_append_right'
              tactic' => simp
            simp [List.get_drop']
          | succ n =>
            conv_lhs =>
              apply List.get_append_left
              tactic' =>
                simp [Nat.lt_trans (Nat.lt_add_of_pos_right Nat.zero_lt_one) hn,
                  show List.length as.data = size as from rfl]
            conv_rhs =>
              apply List.get_append_right'
              tactic' => simp
            simp [List.get_take', List.get_drop', List.get_set,
              min_eq_left (Nat.le_of_lt
                (Nat.lt_trans (Nat.lt_add_of_pos_right Nat.zero_lt_one) hn)),
              show List.length as.data = size as from rfl]
        · conv_lhs =>
            apply List.get_append_right'
            tactic' =>
              simp [Nat.le_trans (Nat.sub_le_iff_le_add.mp (Nat.le_refl (n - 1)))
                (Nat.lt_iff_add_one_le.mp hk₃)]
          conv_rhs =>
            apply List.get_append_right'
            tactic' =>
              simp [Nat.le_trans (Nat.le_trans (Nat.sub_le_iff_le_add.mp (Nat.le_refl (n - 1)))
                (Nat.lt_iff_add_one_le.mp hk₃)) (Nat.le_of_lt (Nat.lt_succ_self k))]
          simp [List.get_drop', List.get_set, min_eq_left (Nat.le_of_lt hn),
            min_eq_left (Nat.le_of_lt
              (Nat.lt_of_le_of_lt (Nat.sub_le n 1) hn)),
            show List.length as.data = size as from rfl,
            Nat.ne_of_lt (Nat.lt_of_le_of_lt (Nat.sub_le n 1)
              (Nat.lt_of_lt_of_le (Nat.lt_succ_self n) (Nat.le_add_right (n + 1) (k - n)))),
            Nat.ne_of_lt
              (Nat.lt_of_lt_of_le (Nat.lt_succ_self n) (Nat.le_add_right (n + 1) (k - n)))]
          congr 2
          simp [Nat.add_right_comm (m := 1), Nat.add_sub_of_le (Nat.le_of_lt hk₃),
            Nat.add_sub_of_le (Nat.le_trans (Nat.sub_le_iff_le_add.mp (Nat.le_refl (n - 1)))
              (Nat.lt_iff_add_one_le.mp hk₃))]
    case neg =>
      rw [not_lt] at hn
      have has : size as ≤ size as - 1 + 1 := by
        rw [← Nat.sub_le_iff_le_add]
      simp [hn, ← List.take_append_drop_succ, List.dropLast_eq_take, Nat.pred_eq_sub_one,
        List.drop_eq_nil_of_le has, show List.length as.data = size as from rfl]

@[simp]
theorem feraseIdx_data (as : Array α) (i : Fin (size as)) :
    (feraseIdx as i).data = List.eraseIdx as.data i.val := by
  simp [feraseIdx, ← Nat.sub_min_sub_right, min_eq_left (Nat.le_sub_one_of_lt i.isLt)]

@[simp]
theorem erase_data [DecidableEq α] (as : Array α) (a : α) :
    (erase as a).data = List.erase as.data a := by
  simp [erase]
  split
  next _ has =>
    simp [indexOf?_eq_indexOf?_data, List.indexOf?_eq_indexOf, Option.pmap_eq_none_iff,
      Option.guard, imp_false, LE.le.ge_iff_eq List.indexOf_le_length, List.indexOf_eq_length,
      - Option.pmap] at has
    rw [List.erase_of_not_mem has]
  next _ i hi =>
    cases i with | mk n hn =>
      simp [indexOf?_eq_indexOf?_data, List.indexOf?_eq_indexOf, Option.pmap_eq_some_iff,
        - Option.pmap] at hi
      simp [hi.left.symm]

end Array
Scott Morrison (Nov 26 2023 at 12:14):
Heroic, @Pol'tta / Miyahara Kō, thank you so much!
Frédéric Dupuis (Nov 26 2023 at 14:48):
On a related note, I have a proof of Array.zipWith_eq_zipWith_data, but it relies on some lemmas about lists that are still only in mathlib. Should I PR it to mathlib or just wait until those lemmas are moved to Std?
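(For reference, the statement under discussion should look roughly like the following; this is a sketch of the expected form, not necessarily the exact signature or argument order:)

theorem Array.zipWith_eq_zipWith_data {α β γ : Type _} (f : α → β → γ)
    (as : Array α) (bs : Array β) :
    (as.zipWith bs f).data = List.zipWith f as.data bs.data := by
  sorry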
Eric Wieser (Nov 26 2023 at 16:16):
I'd say PR to mathlib so that the proofs don't get lost
Eric Wieser (Nov 26 2023 at 16:16):
You can always move them all at once later
Scott Morrison (Nov 27 2023 at 00:37):
@Frédéric Dupuis, you can also move the lemmas up to Std yourself! I know we have a PR backlog at Std, just like at Mathlib, but anything necessary for Array.zipWith_eq_zipWith_data is in scope for Std.
Frédéric Dupuis (Nov 27 2023 at 00:40):
Sure! How does this moving process work exactly? If we first add the lemmas to Std, we have duplication of lemmas for a while, and if we first delete from mathlib, well that doesn't work either...!
Scott Morrison (Nov 27 2023 at 00:40):
We add lemmas to Std first.
Frédéric Dupuis (Nov 27 2023 at 00:40):
Doesn't the duplication break things?
Scott Morrison (Nov 27 2023 at 00:41):
If all that is required at Mathlib is deleting the corresponding lemmas, then we just do it at the same time as the dependency bump.
Frédéric Dupuis (Nov 27 2023 at 00:41):
Ah -- OK I see.
Scott Morrison (Nov 27 2023 at 00:41):
If the Mathlib fixes are more complicated, it is helpful to create a Mathlib PR that uses the Std branch for your PR, and link to it.
Scott Morrison (Nov 27 2023 at 00:41):
That way we can see that the adaptations are reasonable, and it saves time for the person (often me :-) doing the dependency bump.
Frédéric Dupuis (Nov 27 2023 at 00:42):
Wait, what is the "dependency bump" here? The mathlib lakefile depends on Std main, not on a particular commit.
Alex J. Best (Nov 27 2023 at 00:44):
The lake-manifest.json file pins a more specific version of each dependency to be used (when the file exists; for mathlib it is checked in).
Scott Morrison (Nov 27 2023 at 00:44):
i.e. only when you run lake update does it actually move to the latest commit on main.
Scott Morrison (Nov 27 2023 at 00:44):
(The regular "chore: bump Std dependency" PRs that come through are me doing that.)
Frédéric Dupuis (Nov 27 2023 at 00:45):
Oh I see now. In my private repos I've been tracking particular commits directly in the lakefile instead.
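(For comparison, a require in a lakefile.lean can either track a branch or pin a specific revision; the commit hash below is a placeholder for illustration, not a real revision:)

-- track the tip of the `main` branch, as Mathlib's lakefile does for Std
require std from git "https://github.com/leanprover/std4" @ "main"

-- or pin an exact commit instead (placeholder hash)
-- require std from git "https://github.com/leanprover/std4" @ "0123456789abcdef0123456789abcdef01234567"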
Frédéric Dupuis (Nov 27 2023 at 02:06):
Last updated: Dec 20 2023 at 11:08 UTC