Zulip Chat Archive

Stream: maths

Topic: Proving that derivative of sin is cos


Luca Seemungal (Oct 28 2019 at 17:17):

I want to prove that the derivative of sin is cos as part of a project. Where do I start?

Kevin Buzzard (Oct 28 2019 at 17:25):

This is a great question! @Patrick Massot @Sebastien Gouezel what is the status of the definition of differentiation? Are we allowed to define the derivative of a function from C to C or does it still have to be all part of some theory of locally analytic topological Frechet Banach seminormed modules?

Kevin Buzzard (Oct 28 2019 at 17:25):

Do we even have this definition yet? Is it time?

Mario Carneiro (Oct 28 2019 at 17:29):

Ultimately we will want to generalize to the locally analytic topological Frechet Banach seminormed case but writing it for C -> C first seems like a good way to keep a student project focused, and a domain expert can do the generalization later

Mario Carneiro (Oct 28 2019 at 17:29):

define the univariate derivative for functions R -> R (or a trivial generalization thereof)

Mario Carneiro (Oct 28 2019 at 17:30):

oh, my message broke causality

Kevin Buzzard (Oct 28 2019 at 17:31):

R=C is true in HoTT, right?

Mario Carneiro (Oct 28 2019 at 17:32):

A class generalizing both is the first order trivial generalization

Floris van Doorn (Oct 28 2019 at 17:48):

R=C is true in HoTT, right?

As types, yes, but not if you take into account any structure.

Chris Hughes (Oct 28 2019 at 17:50):

It's true for the group structure.

Sebastien Gouezel (Oct 28 2019 at 18:02):

The univariate derivative is not defined yet, but it is trivial to define it in terms of the general derivative, just by identifying a linear map from 𝕜 to F with an element of F through its value at 1. Then it remains to translate the fderiv API to this setting, which should be trivial but tedious. But please, do not start from scratch, use the general results that are already available! Inspired by the beginning of the file analysis/calculus/deriv.lean, the definitions would start with

variables {š•œ : Type*} [nondiscrete_normed_field š•œ]
variables {F : Type*} [normed_group F] [normed_space š•œ F]

def has_deriv_at_filter (f : 𝕜 → F) (c : F) (x : 𝕜) (L : filter 𝕜) :=
has_fderiv_at_filter f (scalar_prod_space_iso continuous_linear_map.id c) x L

def has_deriv_within_at (f : 𝕜 → F) (c : F) (s : set 𝕜) (x : 𝕜) :=
has_deriv_at_filter f c x (nhds_within x s)

def has_deriv_at (f : 𝕜 → F) (c : F) (x : 𝕜) :=
has_deriv_at_filter f c x (nhds x)

variables (𝕜)

def deriv_within (f : 𝕜 → F) (s : set 𝕜) (x : 𝕜) :=
if h : ∃c, has_deriv_within_at f c s x then classical.some h else 0

def deriv (f : 𝕜 → F) (x : 𝕜) :=
if h : ∃c, has_deriv_at f c x then classical.some h else 0

Note that properties such as differentiable_within_at or differentiable_on or differentiable would not need new definitions in the one-dimensional context, as the general definitions are already enough.
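
For instance, the statement this thread is after would then read as follows, with 𝕜 = F = ℂ and with complex.sin and complex.cos from data/complex/exponential.lean (a sketch I have not checked against mathlib; the proof is of course the actual project):

-- statement only, phrased with the definitions sketched above
theorem has_deriv_at_sin (x : ℂ) : has_deriv_at complex.sin (complex.cos x) x :=
sorry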

Sebastien Gouezel (Oct 28 2019 at 18:20):

I remember that @Jeremy Avigad said at some point that one of his students would be working on this, but I don't know if this has happened concretely.

Mario Carneiro (Oct 28 2019 at 18:50):

IIRC @Paula Neeley was working on derivatives and @Joe was working on integration, perhaps they can give an update?

Luca Seemungal (Oct 28 2019 at 18:58):

Thanks everyone. I want to keep things extremely basic: I just want to show that the derivative of sine is cosine, so I just need to get the derivative of a function from ℂ to ℂ working. So, using the stuff in analysis/calculus/deriv.lean, I should use ℂ instead of 𝕜 and F, and then ... "make it work"?

Luca Seemungal (Oct 28 2019 at 19:00):

And then after that I'll need differentiability of power series within their radius of convergence. Do we have this? Do we have ROC?

Mario Carneiro (Oct 28 2019 at 19:04):

@Luca Seemungal The strategy I call "trivial generalization" is to write the statement with 𝕜 like Sebastien suggests, but pretend it says R instead, and apply theorems as if it were R, hoping it all just works. As soon as something fails because you are missing an assumption, switch to R for that theorem (or ask here and we can probably suggest the right structure class to use)
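
Concretely, the sort of lemma I mean, stated on top of Sebastien's definitions above (a made-up sketch, not actual mathlib code):

-- stated over a general 𝕜 first (here with F = 𝕜)
lemma has_deriv_at_add (f g : 𝕜 → 𝕜) (c d : 𝕜) (x : 𝕜)
  (hf : has_deriv_at f c x) (hg : has_deriv_at g d x) :
  has_deriv_at (λ y, f y + g y) (c + d) x :=
sorry
-- if a proof step genuinely needs the order on R (a mean value inequality, say),
-- restate just that lemma with R in place of 𝕜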

Luca Seemungal (Oct 28 2019 at 19:05):

Ah, ok. Will do

Mario Carneiro (Oct 28 2019 at 19:10):

@Sebastien Gouezel what is that scalar prod thing? has deriv at filter is supposed to be the common generalization of the other two

Sebastien Gouezel (Oct 28 2019 at 19:13):

If you have an element c : F and want to construct a linear map from 𝕜 to F mapping 1 to c, you need to do something. This is precisely what scalar_prod_space_iso does. Here is its type:

/-- Associating to a scalar-valued linear map and an element of `γ` the
`γ`-valued linear map obtained by multiplying the two (a.k.a. tensoring by `γ`) -/
def scalar_prod_space_iso (c : β →L[α] α) (f : γ) : β →L[α] γ
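
In other words, the identification behind the deriv definitions above is

c ↦ (x ↦ x • c),   ℓ ↦ ℓ 1,

and these two maps are mutually inverse, so an element of F and a continuous linear map 𝕜 →L[𝕜] F carry exactly the same information.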

Jeremy Avigad (Oct 28 2019 at 19:38):

Paula is working on epistemic logic now, and isn't planning on working on the analysis library. I am still hoping Joe will finish the Bochner integral (where "finish" means leave it in a state where others can work on it without having to deal with the low-level definitions).

Luca Seemungal (Oct 29 2019 at 08:43):

Why are we doing the Bochner integral specifically? If the reason is generality, is there not a more general integral?

Sebastien Gouezel (Oct 29 2019 at 09:12):

The Bochner integral (the straightforward generalization of the Lebesgue integral to higher dimensions) is the right integral, i.e., the one that every mathematician uses every day. There is also the Pettis integral, but it takes values in the bidual of the space, so it is harder to use in practice. And the Kurzweil-Henstock integral, which is a one-dimensional curiosity that is very seldom used and lacks many of the good properties of the Lebesgue integral.

Oliver Nash (Oct 29 2019 at 09:19):

If someone has the time to write it, I'd love to read an article expanding on the above very helpful remarks (and maybe also commenting on Riemann-Stieltjes, Haar, ...). It could even be a vehicle to get passing mathematicians to hear about Mathlib.

Kevin Buzzard (Oct 29 2019 at 10:14):

There's a hard theorem before we get Haar measure/integral -- I can't imagine that this Bochner stuff does it. The theorem is that every sufficiently nice topological group has a measure with some nice property. This is a bunch of abstract functional analysis which will, I guess, take a long time to do in Lean. I'm assuming that Bochner is just a story about the real numbers in some sense. Haar gives you an integral for real-valued functions on the group GL_n(Q_p), for example.

Mario Carneiro (Oct 29 2019 at 10:22):

I guess you don't actually need the theorem to build the Haar integral though, if you just have a typeclass saying "a type with a designated Haar measure"

Mario Carneiro (Oct 29 2019 at 10:23):

and then the big theorem says that every sufficiently nice type can be equipped with a Haar structure. We still wouldn't want to use it because the choice is not even unique up to equality let alone defeq
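
Something like this, say (completely hypothetical names, just to illustrate the shape of the typeclass):

import measure_theory.measure_space

open measure_theory

-- hypothetical: bundle a chosen left-invariant measure as data
class has_haar_measure (G : Type*) [group G] [topological_space G] [measurable_space G] :=
(haar : measure G)
-- stated loosely on all sets; a real version would restrict to measurable sets
(left_invariant : ∀ (g : G) (s : set G), haar ((λ x, g * x) ⁻¹' s) = haar s)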

Gabriel Ebner (Oct 29 2019 at 10:27):

@Kevin Buzzard The Bochner integral generalizes the codomain of integrable functions, i.e. you can integrate functions that go into a Banach space. The domain can be any measure space you want. You can e.g. integrate a function f : GL_n(QQ_p) -> L^2(CC).

Kevin Buzzard (Oct 29 2019 at 10:29):

So the main theorem regarding Haar measure is: "so you see this huge infinite-dimensional space of measures on GL_n(Q_p)? There is a very special line in there called the Haar measure (defined up to a positive real scaling factor)".

Kevin Buzzard (Oct 29 2019 at 10:32):

The beautiful theorem is that for any locally compact Hausdorff topological group there's a measure which is invariant under left translation (μ(gU) = μ(U)), and the measure of an open set is positive, and the measure of a compact set is finite. Its existence is "just copy the proof of existence of Lebesgue measure, and in a few places work a lot harder because you don't have the starting point 'the measure of [a,b] is b − a', so you need to come up with some approximation to this and fiddle around a lot more".
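
Spelled out, that is roughly: for every locally compact Hausdorff topological group G there is a Borel measure μ on G with

μ(gU) = μ(U) for all g in G and all measurable U,
μ(U) > 0 for every nonempty open U,
μ(K) < ∞ for every compact K,

and, as I said above, such a μ is unique up to a positive real scaling factor.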


Last updated: Dec 20 2023 at 11:08 UTC