Zulip Chat Archive

Stream: new members

Topic: Differentiable because composition of differentiables


Adomas Baliuka (Oct 02 2023 at 21:00):

Is there a tactic or a simple way to prove that any composition of differentiable functions is differentiable? Taking care of additions, multiplications, etc. What I'm doing right now is manually composing the lemmas, which seems like it should be automatable... How would you prove this, for example?

import Mathlib
open Real

example (x : ℝ) :
    DifferentiableAt ℝ (fun p ↦ (p * exp p) ^ 3 + sin p * cos p) x := by
    sorry
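
For context, the manual proof I have right now looks roughly like this (a sketch; the lemma names are from Mathlib):

example (x : ℝ) :
    DifferentiableAt ℝ (fun p ↦ (p * exp p) ^ 3 + sin p * cos p) x :=
  -- compose the lemmas for `id`, `exp`, `*`, `^ 3`, `sin`, `cos`, `+` by hand
  ((differentiableAt_id'.mul Real.differentiable_exp.differentiableAt).pow 3).add
    (Real.differentiable_sin.differentiableAt.mul Real.differentiable_cos.differentiableAt)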

Kevin Buzzard (Oct 03 2023 at 05:37):

You could see how the continuity tactic works and then generalise to a differentiability tactic. Alternatively you could train aesop to do this. I can't tell you any more details of either approach but I think that both would be nice projects.

Adomas Baliuka (Oct 03 2023 at 06:45):

I've not looked into macros yet, but maybe I could try anyway. It seems that continuity only works on Continuous f but not on ContinuousAt f x? So I guess differentiability should also start with Differentiable while forgetting the At?
Also, the tactic continuity? seems to produce output that doesn't actually prove the goal (without errors). It seems continuity is some kind of special mode of aesop already? So I should start looking into macros and then into how aesop works internally?

Alex J. Best (Oct 03 2023 at 10:13):

That would be a good start, but I think adding the at versions shouldn't be too hard to do at some point.

Can you give an example for the continuity? issue, I can try and fix it.

Yes, many tactics are based on aesop now, which was one of the original goals of aesop.
I'm not sure you need to understand that much about the actual internals of aesop to work with it; reading the whole readme at https://github.com/JLimperg/aesop and looking at continuity should be enough.

Tomas Skrivan (Oct 03 2023 at 10:37):

At some point I did set up aesop for this purpose, and you do not need to understand how aesop works (I still don't). Just copy-paste exactly what the continuity tactic does:
https://github.com/leanprover-community/mathlib4/blob/master/Mathlib/Tactic/Continuity.lean
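
In outline, that file is just an aesop rule set plus an attribute macro and a tactic macro, so a differentiability version can be a near-verbatim copy. A rough, untested sketch (illustrative names; the exact aesop syntax depends on the Mathlib/aesop version, and the real continuity tactic also passes some extra aesop configuration):

import Aesop

-- In Mathlib the rule set is declared in a separate `Init` file so that it can be
-- populated by downstream files; it is folded into one sketch here.
declare_aesop_rule_sets [Differentiable]

/-- Attribute for tagging lemmas to be used by the `differentiability` tactic. -/
macro "differentiability" : attr =>
  `(attr| aesop safe apply (rule_sets [$(Lean.mkIdent `Differentiable):ident]))

/-- `differentiability` tries to close goals like `Differentiable ℝ f`
using the lemmas tagged with the attribute above. -/
macro "differentiability" : tactic =>
  `(tactic| aesop (rule_sets [$(Lean.mkIdent `Differentiable):ident]))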

Unfortunately aesop was too slow for my application. In my library SciLean I have a specialized tactic fprop to deal with this type of proof. It can deal with much bigger terms.

Here is the file setting up fprop for DifferentiableAt.

Would there be interest in adding this tactic to mathlib?

Patrick Massot (Oct 03 2023 at 13:54):

If it's better than using Aesop then yes.

Vincent Beffara (Oct 03 2023 at 14:22):

How does it juggle with Differentiable, DifferentiableAt, DifferentiableOn, DifferentiableWithinAt, HasDeriv and all the variants? (In the same vein, how hard would it be to extend continuity so that it understands ContinuousOn?)

Tomas Skrivan (Oct 03 2023 at 15:16):

That is the problem: it is not strictly better than aesop, and it would require non-trivial work which is not well aligned with my current goals. Mainly, it is faster than aesop on problems like the one in the original question. Plus there are some cases which continuity can't solve but fprop can, like:

variable
  {X : Type _} [TopologicalSpace X]
  {Y : Type _} [TopologicalSpace Y]
  {Z : Type _} [TopologicalSpace Z]
  {ι : Type _} [Fintype ι]

example (f : X → Y → Z) (g : X → Y)
  (hf : Continuous fun (x,y) => f x y) (hg : Continuous g)
  : Continuous (fun x => f x (g x)) := by (try continuity); fprop

example (f : X → ι → Z) (g : Y → X)
  (hf : ∀ i, Continuous fun x => f x i) (h : Continuous g)
  : Continuous (fun x => f (g x) i) := by (try continuity); fprop

Right now, there are still some annoying limitations of fprop that I do not know how to deal with properly:

  • for f : X → Fin 5 → Fin 10 → Fin 15 → Y, derive Differentiable K f from ∀ i j k, Differentiable K (f · i j k), or vice versa
  • for f : X → Y → Z, derive ∀ x, Differentiable K fun y => f x y from Differentiable K fun (x,y) => f x y

I think I have a good idea how to deal with the second one, but no clue what to do in complete generality with the first one.

Tomas Skrivan (Oct 03 2023 at 15:18):

Vincent Beffara said:

How does it juggle with Differentiable, DifferentiableAt, DifferentiableOn, DifferentiableWithinAt, HasDeriv and all the variants? (In the same vein, how hard would it be to extend continuity so that it understands ContinuousOn?)

Currently it treats all the variants Differentiable/At/On separately. To deal with this, my plan is to:

  • for each constant, like HAdd.hAdd, formulate a theorem for each variant Differentiable/At/On (illustrated below for addition). This can be automated: the At and On variants can be generated automatically once the theorem for Differentiable is stated
  • for free variables, allow inferring between the variants, e.g. DifferentiableAt K f x from DifferentiableOn K f Ω or Differentiable K f where f is a free variable, or inferring from completely different propositions, like Continuous from Differentiable, or from IsLinearMap on finite-dimensional vector spaces.

I think there is no difference between Differentiable/On and Continuous/On; the types of the arguments are the same.
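
For concreteness, here is what such a family looks like for addition, using lemmas that already exist in Mathlib (an illustration, not verbatim from SciLean):

import Mathlib

-- Illustration: the three variants of the addition rule already present in Mathlib.
example {f g : ℝ → ℝ} (hf : Differentiable ℝ f) (hg : Differentiable ℝ g) :
    Differentiable ℝ (fun y => f y + g y) := hf.add hg

example {f g : ℝ → ℝ} {x : ℝ} (hf : DifferentiableAt ℝ f x) (hg : DifferentiableAt ℝ g x) :
    DifferentiableAt ℝ (fun y => f y + g y) x := hf.add hg

example {f g : ℝ → ℝ} {s : Set ℝ} (hf : DifferentiableOn ℝ f s) (hg : DifferentiableOn ℝ g s) :
    DifferentiableOn ℝ (fun y => f y + g y) s := hf.add hg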

Adomas Baliuka (Oct 03 2023 at 18:01):

Alex J. Best said:

Can you give an example for the continuity? issue, I can try and fix it.

Basically, no examples I've tried involving more than a couple of composed functions worked without errors.
For example, this outputs something misaligned, and upon aligning it correctly it has applys which fail.

import Mathlib
open Real
example : Continuous (fun (y : ℝ) ↦ y^2 + sin y + (cos y * exp y) ^ 4) := by
    continuity?

Patrick Massot (Oct 03 2023 at 18:07):

That issue is presumably completely unrelated to continuity; it's probably the power bug.

Patrick Massot (Oct 03 2023 at 18:07):

Try adding local macro_rules | `($x ^ $y) => `(HPow.hPow $x $y) before the example.

Adomas Baliuka (Oct 03 2023 at 18:15):

Patrick Massot said:

Try adding local macro_rules | `($x ^ $y) => `(HPow.hPow $x $y) before the example.

Doesn't help. Same issues.

Adomas Baliuka (Oct 03 2023 at 18:17):

Actually, it seems to be just formatting and punctuation. Deleting all the cdots and the indentation fixes the problem.

Patrick Massot (Oct 03 2023 at 18:19):

I tried it and I see a single indentation mismatch. Everything after the first line has one extra level of indentation. The correct output is

import Mathlib
open Real

local macro_rules | `($x ^ $y)   => `(HPow.hPow $x $y)

example : Continuous (fun (y : ℝ) ↦ y^2 + sin y + (cos y * exp y) ^ 4) := by
    apply Continuous.add
    · apply Continuous.add
      · apply continuous_pow
      · apply Real.continuous_sin
    · apply Continuous.mul
      · apply Continuous.mul
        · apply Real.continuous_cos
        · apply Real.continuous_exp
      · aesop_unfold [npowRec]
        aesop_unfold [npowRec]
        aesop_unfold [npowRec]
        aesop_unfold [npowRec]
        simp_all only [mul_one]
        apply Continuous.mul
        · apply Continuous.mul
          · apply Real.continuous_cos
          · apply Real.continuous_exp
        · apply Continuous.mul
          · apply Continuous.mul
            · apply Real.continuous_cos
            · apply Real.continuous_exp
          · apply Continuous.mul
            · apply Real.continuous_cos
            · apply Real.continuous_exp

Patrick Massot (Oct 03 2023 at 18:19):

@Jannis Limperg

Adomas Baliuka (Oct 03 2023 at 18:22):

Patrick Massot said:

Everything after the first line has one extra level of indentation.

Indeed. I didn't "fix" the indentation carefully enough last time. The power bug plays no role here and the macro isn't needed.

Patrick Massot (Oct 03 2023 at 18:23):

I'm very surprised you don't need the macro. Where are you running this?

Adomas Baliuka (Oct 03 2023 at 18:27):

Patrick Massot said:

I'm very surprised you don't need the macro. Where are you running this?

I was running it from Mathematics in Lean. Indeed, in my own project it doesn't work without the macro. I don't understand where MIL places the macro, though; not that it matters here. Sorry, I will check in my own project from now on. (Lean macros are too powerful for me :D)
Thanks for your patience!

Patrick Massot (Oct 03 2023 at 18:37):

No problem. This power bug is a curse but one day it will go away and become a legend that young people will get tired of hearing allusions to.

Kevin Buzzard (Oct 03 2023 at 19:14):

Patrick and I still remember the days when coercing an integer to a natural would time out...

Adomas Baliuka (Oct 03 2023 at 19:21):

OK, so now I copied the file defining the Aesop rule set Continuous and the file defining the tactic continuity, and text-replaced "continuous" with "differentiable". Then I copied a bunch of theorems from Mathlib (things like differentiable_cos, which do not currently have any attributes) and added the new differentiability attribute to them.

Finally, I was forced to copy Differentiable.add and give it the new attribute, although it already has @[simp]. Why wasn't it found automatically?

All in all, this seems to work

-- Note: Not MWE, needs the custom tactic.
local macro_rules | `($x ^ $y)   => `(HPow.hPow $x $y)
example : Differentiable ℝ (fun (y : ℝ) ↦ y^2 + sin y + (cos y * exp y) ^ 4) := by
    differentiability

Adomas Baliuka (Oct 03 2023 at 19:27):

So I guess we could have this tactic by just adding the tactic definition and adding @[differentiability] to all theorems that are "differentiability analogues" (there seem to be a lot) of theorems currently marked @[continuity].
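
For illustration, the tagging step would look something like this (using the attribute from the copied tactic file; the lemma names are existing Mathlib lemmas, and the exact list is still to be worked out):

-- Sketch: tag a few existing Mathlib lemmas with the new attribute so that
-- the aesop-based `differentiability` tactic can apply them.
attribute [differentiability]
  differentiable_id differentiable_const
  Differentiable.add Differentiable.mul Differentiable.pow
  Real.differentiable_sin Real.differentiable_cos Real.differentiable_exp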

Kevin Buzzard (Oct 03 2023 at 19:28):

I knew this would be a nice project :-)

Eric Wieser (Oct 04 2023 at 08:42):

Does it work for HasDerivAt too?

Eric Wieser (Oct 04 2023 at 08:42):

In particular, it would be great if have : HasDerivAt f _ x := by differentiation filled in the _ by unification

Adomas Baliuka (Oct 04 2023 at 08:46):

What I did so far can't do HasDerivAt. For filling in the derivative one might need an actual tactic, not just an Aesop ruleset?

Jannis Limperg (Oct 04 2023 at 08:54):

Adomas Baliuka said:

For example, this outputs something misaligned

This is a known issue with aesop?. I think I even had a partial fix for it on some branch; let me reactivate that. If you find more bugs with this functionality (or Aesop in general), please ping me.

Jannis Limperg (Oct 04 2023 at 08:56):

Adomas Baliuka said:

What I did so far can't do HasDerivAt. For filling in the derivative one might need an actual tactic, not just an Aesop ruleset?

If the derivative can be filled in by unification (i.e. you can solve the goal by applying lemmas without ever writing the derivative), then Aesop should be able to automate the proof.

Adomas Baliuka (Oct 04 2023 at 09:56):

Jannis Limperg said:

If the derivative can be filled in by unification (i.e. you can solve the goal by applying lemmas without ever writing the derivative), then Aesop should be able to automate the proof.

Sounds interesting. Can you give an example (manual, not necessarily Aesop) where the derivative is filled by unification?

On another note: do we want some version of differentiability in Mathlib? It seems like the "plagiarize continuity" version would be about as useful as continuity. Of course, one might extend these and make them a lot more useful still.

Kevin Buzzard (Oct 04 2023 at 11:49):

Some version of differentiability in mathlib: absolutely. My students already miss this functionality, and one of them even wrote a Lean 3 tactic doing this last year by mimicking the Lean 3 continuity tactic.

Jannis Limperg (Oct 04 2023 at 12:06):

Adomas Baliuka said:

Sounds interesting. Can you give an example (manual, not necessarily Aesop) where the derivative is filled by unification?

Here's a proof that the derivative of sin is cos:

import Mathlib.Analysis.SpecialFunctions.ExpDeriv

open Complex

/-- The complex sine function is everywhere strictly differentiable, with the derivative `cos x`. -/
theorem hasStrictDerivAt_sin (x : ℂ) : HasStrictDerivAt sin (cos x) x := by
  have : HasStrictDerivAt sin _ x := by
    simp only [sin, div_eq_mul_inv]
    apply HasStrictDerivAt.mul_const
    apply HasStrictDerivAt.mul_const
    apply HasStrictDerivAt.sub
    · apply HasStrictDerivAt.cexp
      apply HasStrictDerivAt.mul_const
      apply HasStrictDerivAt.neg
      apply hasStrictDerivAt_id
    · apply HasStrictDerivAt.cexp
      apply HasStrictDerivAt.mul_const
      apply hasStrictDerivAt_id
  have : cos x = (cexp (-x * I) * (-1 * I) - cexp (x * I) * (1 * I)) * I * 2⁻¹ := by
    rw [sub_mul, mul_assoc, mul_assoc, I_mul_I, neg_one_mul, neg_neg, mul_one, one_mul, mul_assoc,
         I_mul_I, mul_neg_one, sub_neg_eq_add, add_comm, cos, div_eq_mul_inv]
  rwa [this]

The first have leaves the derivative unspecified, and the subsequent applys determine it by unification. (The original proof is in Mathlib/Analysis/SpecialFunctions/Trigonometric/Deriv.lean.)

Jannis Limperg (Oct 04 2023 at 12:28):

And here's the Aesop version:

theorem hasStrictDerivAt_sin (x : ℂ) : HasStrictDerivAt sin (cos x) x := by
  have : HasStrictDerivAt sin _ x := by
    simp only [sin, div_eq_mul_inv]
    aesop (add safe apply
      [HasStrictDerivAt.mul_const, HasStrictDerivAt.sub, HasStrictDerivAt.cexp,
       HasStrictDerivAt.neg, hasStrictDerivAt_id'])
  have : cos x = ((cexp (-(x * I)) * -(1 * I) - cexp (x * I) * (1 * I)) * I * 2⁻¹) := by
    rw [cos, sub_mul, mul_assoc, mul_assoc, ← neg_mul, ← neg_mul,
        mul_assoc _ I I, I_mul_I, neg_one_mul, neg_neg, mul_one, one_mul, neg_mul,
        I_mul_I, mul_neg_one, sub_neg_eq_add, add_comm, div_eq_mul_inv]
  rwa [this]

It's a bit slow for my liking though. Seems like running simp_all on every goal is quite expensive. It's also mostly pointless, but when I disable Aesop's integrated simp_all, it doesn't solve the goal any more.

Jannis Limperg (Oct 04 2023 at 12:31):

Another Aesop version, this time without simp but with the lemmas applied with default transparency:

theorem hasStrictDerivAt_sin (x : ℂ) : HasStrictDerivAt sin (cos x) x := by
  have : HasStrictDerivAt sin _ x := by
    simp only [sin, div_eq_mul_inv]
    aesop (add safe (apply (transparency! := default))
      [HasStrictDerivAt.mul_const, HasStrictDerivAt.sub, HasStrictDerivAt.cexp,
       HasStrictDerivAt.neg, hasStrictDerivAt_id'])
      (simp_options := { enabled := false })
  have : cos x = (cexp (-x * I) * (-1 * I) - cexp (x * I) * (1 * I)) * I * 2⁻¹ := by
    rw [sub_mul, mul_assoc, mul_assoc, I_mul_I, neg_one_mul, neg_neg, mul_one, one_mul, mul_assoc,
         I_mul_I, mul_neg_one, sub_neg_eq_add, add_comm, cos, div_eq_mul_inv]
  rwa [this]

This one seems to be a little bit faster. But default transparency destroys indexing, so it's going to become (much?) slower with more lemmas.

Adomas Baliuka (Oct 04 2023 at 12:50):

I wonder: why is continuity using attribute [aesop (rule_sets [Continuous]) unfold norm] npowRec? This leads to a big slow-down for larger powers, while it could be done in a single step. Of course, such large powers almost never happen in practice, but it still seems weird to me. I guess it's some ofNat shenanigans with literals going wrong?

import Mathlib
open Real
local macro_rules | `($x ^ $y)   => `(HPow.hPow $x $y)

example : Continuous (fun (y : ℝ) ↦ (cos y * exp y) ^ 20) := by
    continuity? -- does `aesop_unfold [npowRec]` 20 times and a bunch of `apply` to go with that

example (n : ℕ) : Continuous (fun (y : ℝ) ↦ (cos y * exp y) ^ n) := by
    continuity? -- does not work, even though `Continuous.pow` is marked with `continuity`.

example (f : ℝ → ℝ) (h : Continuous f) (n : ℕ)
    : Continuous (fun x ↦ (f x) ^ n) := by
    exact Continuous.pow h n  -- here, `continuity` also doesn't work.

Jannis Limperg (Oct 04 2023 at 13:01):

Good question. I'll look into it.

Patrick Massot (Oct 04 2023 at 13:03):

See also the discussion at https://leanprover.zulipchat.com/#narrow/stream/287929-mathlib4/topic/.60Continuous.2Epow.60.20not.20picked.20up.20by.20.60continuity.60

Adomas Baliuka (Oct 05 2023 at 11:51):

I guess there are some architectural/design choices to make in order to have that work for continuity? Maybe differentiability should wait for that to happen?

I didn't find any relevant PRs or issues on Mathlib4 (except peripherally relevant https://github.com/leanprover-community/mathlib4/issues/5030 and https://github.com/leanprover-community/mathlib4/issues/430).

Adomas Baliuka (Oct 24 2023 at 12:54):

(See also discussion here)

I made a PR (still WIP) adding the differentiability tactic to Mathlib4: https://github.com/leanprover-community/mathlib4/pull/7892
I'm now confused about where to ask/talk about the process of making it good. The guidelines seem to say Zulip is the main place; the PR discussion, I guess, concerns more concrete things happening to the branch.

What I did

  • copy the files defining the Aesop-rulesets, attribute and tactics for continuity
  • replace all notions of "continuous" by appropriate "differentiability" notions
  • Make a test file, also by copying the test file for continuity.
  • Note: I kept the copyright notices of the original files and added myself to the author list. I assume this is fine unless someone objects.

The next step (still WIP) is making a list of lemmas to be tagged with the differentiability attribute. My approach so far:

  • Copy the list returned by Loogle for |- Differentiable _ _, put it in test/Differentiability.lean, give them all the attribute, and fix imports.
  • This doesn't work that well. I think it's because there's too much stuff and the "exhaustive search" of Aesop does not find the right combination with given settings of depth/iterations/whatnot.
  • Do git grep --heading -A 1 "\@\[.*continuity.*\]" on Mathlib to find all lemmas tagged with continuity. Go over that list manually and check whether there are analogous lemmas concerning differentiability. (I looked for a way to list all declarations tagged with a given attribute, but apparently this is not possible in general?)
  • Add those lemmas to the test file as comments (TODO delete when tactic is ready)

    1. Comment out all lemmas where I didn't find a differentiability analogue and it seems unlikely there is one, given my very limited understanding of advanced math. Also indent them twice.
    2. Comment out, without indentation, all lemmas where an analogue was found. Add the analogue below, thus giving it the new attribute.
    3. Comment out lemmas where an analogue seems like it should/could exist but I couldn't find one. Add a comment below with what the analogue might be called if I had found it. Add a "TODO" marker.

TODOs

  • fix commit message
  • Make the tactic work well (it works with very few lemmas tagged, but not with the current maximal set)
  • Consider extending this to DifferentiableAt, deriv, etc. (There are no analogues in Continuity land that do this. Why? I tested making another tactic for DifferentiableAt (not in the PR) and it seemed to work about as well as the current differentiability tactic when its set of lemmas is restricted.) If tagging too many lemmas is a problem for Aesop performance, perhaps the tactic differentiability (or continuity, for that matter) could have a custom-written (non-Aesop) step that checks what the goal looks like and then delegates to Aesop with the proper rule set?

Adomas Baliuka (Oct 24 2023 at 12:55):

My first confusion is what the HTML comment in the PR message is supposed to do exactly. I put the description of "what I did" (copy-pasted above) in there, but it's not visible on GitHub, so who sees it?


Last updated: Dec 20 2023 at 11:08 UTC