Zulip Chat Archive
Stream: new members
Topic: Incipient numerical analysis results
Dan Stanescu (Jul 25 2020 at 15:32):
I've made some progress towards formalizing the proof that the sequence approximating (say) √2 generated by Newton's method converges. I used the sequence x_{n+1} = (x_n² + 2)/(2·x_n). There are claims that the ancient Sumerians knew of such a scheme, and it appears in baby Rudin as an exercise (page 81). I've shown the sequence is bounded below and decreasing, and also that a finite limit exists. I did that using `mathlib` tools, as opposed to using my own definitions of convergence etc., which was at times difficult but rewarding (in places I needed to switch to the corresponding increasing sequence instead, to use `mathlib`'s definition of `monotone`).
I'm at the last stage now, where I have to actually prove that the limit is √2. Rudin's suggested proof: given that the limit L exists, get L = (L² + 2)/(2L) from the recurrence relation, whence L² = 2. This is straightforward on paper, but (supposing I can prove L > 0) is it something that can be achieved solely with the tools in `mathlib`? Can anyone advise me how to proceed here within this restriction? The code (which still needs some clean-up) is available here.
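For reference, here is a minimal Lean 3 sketch of the iteration being discussed; the name `newton_sqrt_two` and the starting value 2 are illustrative assumptions, not necessarily what Dan's file uses.
import data.real.basic

-- Hypothetical encoding of the Newton iteration for √2 described above.
-- Division on ℝ is noncomputable in mathlib, hence the modifier.
noncomputable def newton_sqrt_two : ℕ → ℝ
| 0       := 2
| (n + 1) := (newton_sqrt_two n * newton_sqrt_two n + 2) / (2 * newton_sqrt_two n)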
Patrick Massot (Jul 25 2020 at 16:21):
I'm not sure what the obstacle is here. Dan, do you want the following?
lemma dan_limit {u : ℕ → ℝ} {L : ℝ} {f : ℝ → ℝ} (hu : tendsto u at_top $ 𝓝 L)
  (hf : continuous_at f L) (huf : ∀ n, u (n+1) = f (u n)) : f L = L :=
begin
  -- the shifted sequence n ↦ u (n + 1) converges to the same limit L
  have lim : tendsto (u ∘ nat.succ) at_top (𝓝 L) :=
    (tendsto_add_at_top_iff_nat 1).mpr hu,
  -- rewrite the shifted sequence as f ∘ u, which tends to f L by continuity
  rw show u ∘ nat.succ = f ∘ u, from funext huf at lim,
  -- limits are unique, so f L = L
  exact tendsto_nhds_unique (tendsto.comp hf hu) lim
end
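For concreteness, a hedged sketch of how this lemma would apply to the recurrence above; to keep it short, the continuity fact is taken as a hypothesis here (in a real development it would be discharged with `mathlib`'s `continuous_at` lemmas, which requires knowing L ≠ 0):
-- Illustrative only: dan_limit specialized to f x = (x * x + 2) / (2 * x).
example {u : ℕ → ℝ} {L : ℝ} (hu : tendsto u at_top (𝓝 L))
  (hf : continuous_at (λ x : ℝ, (x * x + 2) / (2 * x)) L)
  (huf : ∀ n, u (n + 1) = (u n * u n + 2) / (2 * u n)) :
  (L * L + 2) / (2 * L) = L :=
dan_limit hu hf huf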
Patrick Massot (Jul 25 2020 at 16:24):
Or is the issue solving the resulting equation for L?
Patrick Massot (Jul 25 2020 at 16:24):
(which is probably much harder :sad:)
Dan Stanescu (Jul 25 2020 at 16:26):
I didn't say there was an issue; I was just wondering whether this can be done with what we have. I'm not at all familiar with all the tools we have as of now.
@Patrick Massot Thanks Patrick! I'll try to use this, and it may be that it completely solves the problem. If not, I'll come back for more help.
Jalex Stark (Jul 25 2020 at 16:29):
I'd guess the most efficient way to solve the equation is to manipulate it into a quadratic by hand with `rw`, and then I think the quadratic formula is already in `mathlib`.
Patrick Massot (Jul 25 2020 at 16:36):
lemma dan_eqn {L : ℝ} (h : L = (L*L + 2)/(2*L)) (h' : 0 < L) : L = real.sqrt 2 :=
begin
  -- clear the denominator: multiply both sides by 2*L (nonzero since 0 < L)
  apply_fun (λ x, 2*L*x) at h,
  simp_rw mul_div_cancel' _ (ne_of_gt (by linarith) : 2*L ≠ 0) at h,
  -- subtract L*L from both sides; ring normalizes h to (essentially) L*L = 2
  apply_fun (λ x, x - L*L) at h,
  ring at h,
  symmetry,
  rwa real.sqrt_eq_iff_sqr_eq ; linarith
end
Dan Stanescu (Jul 25 2020 at 16:39):
And now I managed to get two lemmas with my name attached :-)
Wait, it's even better: my name prepended.
Patrick Massot (Jul 25 2020 at 16:40):
Actually only the second line is a bit ugly. Otherwise this is almost what you would do on paper. But really there should be a tactic doing it in one line.
Dan Stanescu (Jul 25 2020 at 22:47):
I finally have everything, so this little project is finished. However, I needed more than 25 lines of code to get the result below (I went all the way to deriving `false` from `at_top = ⊥`). Very instructive, but now that I've done it, can someone please show me a more reasonable proof of the following:
import analysis.specific_limits
open_locale topological_space
open filter
example (s : ℕ → ℝ) (hs : ∀ n : ℕ, 2 < s n) : ¬ (tendsto s at_top (𝓝 0)) :=
begin
  sorry,
end
Reid Barton (Jul 25 2020 at 23:31):
example (s : ℕ → ℝ) (hs : ∀ n : ℕ, 2 < s n) : ¬ (tendsto s at_top (𝓝 0)) :=
begin
  -- unfold the metric (ε-N) characterization of the limit, then negate it
  rw metric.tendsto_at_top,
  push_neg,
  -- witness ε = 2: for every N, the index N itself has dist (s N) 0 ≥ 2
  refine ⟨2, by norm_num, λ N, ⟨N, le_refl _, _⟩⟩,
  change 2 ≤ abs (s N - 0),
  refine le_trans _ (le_abs_self _),
  specialize hs N,
  linarith
end
Dan Stanescu (Jul 26 2020 at 00:35):
Twenty lines fewer, thanks @Reid Barton!
Dan Stanescu (Jul 26 2020 at 15:45):
To wrap this up: is there anyone who thinks this should go somewhere in `mathlib` (possibly `archive`) upon clean-up and possibly generalization to √a (for arbitrary a > 0) instead of √2 only? It would probably be a nice little student project.
Thanks again to everyone!
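A hedged sketch of what such a generalized statement might look like (purely illustrative: the hypotheses are one plausible choice, the `sorry` marks the proposed project, and the imports from the snippet above are assumed):
-- Hypothetical generalization: Newton iterates for √a, started above the
-- root, converge to real.sqrt a.  Statement only; not proved here.
example (a : ℝ) (ha : 0 < a) (u : ℕ → ℝ)
  (h0 : real.sqrt a < u 0)
  (hu : ∀ n, u (n + 1) = (u n * u n + a) / (2 * u n)) :
  tendsto u at_top (𝓝 (real.sqrt a)) :=
sorry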
Mario Carneiro (Jul 26 2020 at 15:59):
An alternative way to prove the tendsto statement is to observe that if it did hold, then `0` would lie in the closure of `Ioi 2`, which is `Ici 2`.
Mario Carneiro (Jul 26 2020 at 16:08):
example (s : ℕ → ℝ) (hs : ∀ n : ℕ, 2 < s n) : ¬ (tendsto s at_top (𝓝 0)) :=
begin
  -- 0 does not lie in Ici 2, which is the closure of Ioi 2;
  -- but the limit of a sequence staying in Ioi 2 must lie in that closure
  have : (0:ℝ) ∉ set.Ici (2 : ℝ), { simp, norm_num },
  rw ← closure_Ioi at this,
  exact λ h, this (mem_closure_of_tendsto h (eventually_of_forall hs)),
end
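In the same spirit, an even shorter hedged variant (illustrative, not from the thread): since every term is at least 2, any limit must be at least 2 by `ge_of_tendsto`, contradicting its being 0.
-- Sketch, using mathlib's ge_of_tendsto and eventually_of_forall.
example (s : ℕ → ℝ) (hs : ∀ n : ℕ, 2 < s n) : ¬ (tendsto s at_top (𝓝 0)) :=
λ h, absurd (ge_of_tendsto h (eventually_of_forall (λ n, (hs n).le)))
  (by norm_num)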
Alex J. Best (Jul 26 2020 at 16:19):
I think some version of this would be cool to have. How easily does what you have generalize to any convex function? (Is it even true that Newton's method always converges for single-variable convex functions?)
Mario Carneiro (Jul 26 2020 at 16:21):
well it has to be differentiable at least
Mario Carneiro (Jul 26 2020 at 16:22):
Convergence of Newton's method for arbitrary convex differentiable functions would be a cool result, but it probably has very little in common with Dan's proof.
Dan Stanescu (Jul 26 2020 at 16:24):
I don't think the method will converge in general for a convex function, even if differentiable.
Dan Stanescu (Jul 26 2020 at 16:25):
The update rule is x_{n+1} = x_n − f(x_n) / f′(x_n).
Dan Stanescu (Jul 26 2020 at 16:26):
There's no guarantee the iterates won't hit a point where the first derivative is zero. (For instance, with f(x) = x² − 1 and x₀ = 0, the very first step already divides by f′(0) = 0.)
Kevin Buzzard (Jul 26 2020 at 16:27):
For continuous functions on the p-adic numbers, as long as x₀ is sufficiently close to the root (a condition which depends on f(x₀) and f′(x₀)), it always converges.
Dan Stanescu (Jul 26 2020 at 16:29):
My previous statements should be read as applying to real-valued functions of one real variable.