Zulip Chat Archive
Stream: metaprogramming / tactics
Topic: differentiation tactic
Yaël Dillies (Dec 30 2023 at 19:48):
I just proved Jordan's inequality by computing derivatives. See here. This was way harder than it should have been.
Yaël Dillies (Dec 30 2023 at 19:50):
I propose we make a new differentiability tactic which proves goals of the form DifferentiableOn, Differentiable, HasDerivAt, HasDerivWithinAt.
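For illustration, these goal shapes look as follows, here discharged by hand for sin with existing Mathlib lemmas (a sketch; the proposed tactic would close such goals automatically):
import Mathlib.Analysis.SpecialFunctions.Trigonometric.Deriv
open Real Set
-- The four goal shapes, proved manually for `sin`:
example : Differentiable ℝ sin := differentiable_sin
example : DifferentiableOn ℝ sin (Ici 0) := differentiable_sin.differentiableOn
example (x : ℝ) : HasDerivAt sin (cos x) x := hasDerivAt_sin x
example (x : ℝ) : HasDerivWithinAt sin (cos x) (Ici 0) x :=
  (hasDerivAt_sin x).hasDerivWithinAt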
Eric Wieser (Dec 30 2023 at 19:54):
Such a tactic is already being worked on
Yaël Dillies (Dec 30 2023 at 19:54):
You type too fast, Eric!
Yaël Dillies (Dec 30 2023 at 19:56):
I am aware of #7892. However, I find a few problems with the aesop approach:
- There are still rough edges.
- It doesn't support DifferentiableOn at all.
- Even if it did, I want more complicated logic than what aesop can provide. E.g. I want to be able to call differentiability on Differentiable (fun x ↦ sin x / x) and be left with the goal DifferentiableAt (fun x ↦ sin x / x) 0 (or more realistically something like ∀ x ∉ {x : ℝ | x ≠ 0}, DifferentiableAt (fun x ↦ sin x / x) x, which is still good enough since it can be further simplified).
Yaël Dillies (Dec 30 2023 at 19:57):
Namely I would like the tactic to be able to handle domains.
Michael Stoll (Dec 30 2023 at 20:02):
See here for a similar discussion on continuity.
Yaël Dillies (Dec 30 2023 at 20:15):
Interesting discussion, but it's disjoint from my concerns.
Adomas Baliuka (Jan 07 2024 at 12:55):
Eric Wieser said:
Such a tactic is already being worked on
As the one apparently "working" on this, I should note that I'm still a beginner and don't know how to make progress from the situation described there :(
Also, it's hard to tell how good the tactic needs to be for inclusion to make sense. I don't currently see a straightforward way to make significant improvements. The things Yaël Dillies was asking for would probably require a lot of custom metaprogramming, which I'm definitely not able to do.
Yaël Dillies (Jan 07 2024 at 13:08):
Since writing that specification, I've noticed a subtlety: f being differentiable on s and on t does not mean that f is differentiable on s ∪ t. This is because differentiability on a set is differentiability within that set, not differentiability at every point of the set (the two agree if the set is open, but not in general).
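For reference, this is visible in how Mathlib defines DifferentiableOn: it quantifies over the set but only requires differentiability within it (shown here specialized to ℝ):
import Mathlib.Analysis.Calculus.FDeriv.Basic
-- `DifferentiableOn` unfolds to a quantified `DifferentiableWithinAt`, so the
-- witnessing derivative at each point only sees `s`, not a full neighbourhood:
example {f : ℝ → ℝ} {s : Set ℝ} :
    DifferentiableOn ℝ f s ↔ ∀ x ∈ s, DifferentiableWithinAt ℝ f s x :=
  Iff.rfl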
Yaël Dillies (Jan 07 2024 at 13:15):
I don't really know how to solve that. The core issue is that DifferentiableOn does not respect enough set operations. My best bet is that we would need a new DifferentiableWithinOn s t f predicate saying that the function f is differentiable within a set s at every point of another set t. Then DifferentiableOn s f is the same as DifferentiableWithinOn s s f, and we do have DifferentiableWithinOn s t₁ f → DifferentiableWithinOn s t₂ f → DifferentiableWithinOn s (t₁ ∪ t₂) f.
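A minimal sketch of what that might look like; the name and signature are hypothetical (my reading of the message above), not Mathlib API:
import Mathlib.Analysis.Calculus.FDeriv.Basic
/-- Hypothetical predicate: `f` is differentiable within `s` at every point of `t`. -/
def DifferentiableWithinOn (𝕜 : Type*) [NontriviallyNormedField 𝕜]
    {E F : Type*} [NormedAddCommGroup E] [NormedSpace 𝕜 E]
    [NormedAddCommGroup F] [NormedSpace 𝕜 F]
    (f : E → F) (s t : Set E) : Prop :=
  ∀ x ∈ t, DifferentiableWithinAt 𝕜 f s x
-- `DifferentiableOn` is the diagonal case, definitionally:
example {f : ℝ → ℝ} {s : Set ℝ} :
    DifferentiableOn ℝ f s ↔ DifferentiableWithinOn ℝ f s s :=
  Iff.rfl
-- and the new predicate does respect unions in its second set argument:
example {f : ℝ → ℝ} {s t₁ t₂ : Set ℝ}
    (h₁ : DifferentiableWithinOn ℝ f s t₁)
    (h₂ : DifferentiableWithinOn ℝ f s t₂) :
    DifferentiableWithinOn ℝ f s (t₁ ∪ t₂) := fun x hx => by
  obtain hx | hx := hx
  · exact h₁ x hx
  · exact h₂ x hx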
Yaël Dillies (Jan 07 2024 at 13:16):
But maybe Patrick Massot will tell me this is a terrible idea
Yaël Dillies (Jan 07 2024 at 13:17):
(can someone move this to #metaprogramming / tactics ?)
Adomas Baliuka (Jan 07 2024 at 13:27):
My personal opinion is that we should include some version of the differentiability tactic as soon as it's as good as continuity. I think that @Jireh Loreaux has also said something to that effect.
It can still be improved or replaced later. I fear that if we wait for all of these more advanced features, especially if they require changes/additions to the differentiability API, the effort may never complete. Having a useful but imperfect tactic might also attract more people to work on improving it.
Debugging the issues I'm currently looking at might yield a tactic that's as useful as continuity, but I'm a bit stuck at the moment.
Eric Wieser (Jan 07 2024 at 13:29):
Yaël Dillies said:
(can someone move this to #metaprogramming / tactics ?)
From which message?
Yaël Dillies (Jan 07 2024 at 14:41):
Maybe from Patrick's second message?
Notification Bot (Jan 08 2024 at 02:51):
Patrick Massot has marked this topic as unresolved.
Patrick Massot (Jan 08 2024 at 02:55):
I think the case of having DifferentiableOn with different sets in the same proof is pretty rare. I would rather have a tactic that works in the common cases than dream forever of the ultimate differentiability tactic.
Yaël Dillies (Feb 14 2024 at 12:34):
Here's a MWE with a derivative calculation that ought to be automatable, and a use case for said calculation:
import Mathlib.Analysis.SpecialFunctions.Trigonometric.Deriv
open Real Set
lemma aux {x : ℝ} : HasDerivWithinAt (fun x ↦ x - sin x) (1 - cos x) (interior (Ici 0)) x :=
((hasDerivAt_id _).sub $ hasDerivAt_sin _).hasDerivWithinAt
lemma sin_le {x : ℝ} (hx : 0 ≤ x) : sin x ≤ x := by
suffices : MonotoneOn (fun x ↦ x - sin x) (Ici 0)
· simpa using this left_mem_Ici hx hx
exact monotoneOn_of_hasDerivWithinAt_nonneg (convex_Ici _) (Continuous.continuousOn $ by
continuity) (fun x _ ↦ aux) $ by
simp [cos_le_one]
Yaël Dillies (Feb 14 2024 at 12:36):
@Tomas Skrivan, can fun_prop handle goals containing metavariables? If so, one could use fun_prop to compute the derivative of the function above by doing
example {x : ℝ} : HasDerivWithinAt (fun x ↦ x - sin x) _ (interior (Ici 0)) x := by fun_prop
Tomas Skrivan (Feb 14 2024 at 12:40):
Nope, fun_prop is not designed to do that.
I have started porting the fun_trans tactic, which will transform deriv fun x => x - sin x into fun x => 1 - cos x.
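For concreteness, here is that rewrite stated and proved by hand over ℝ, reusing the HasDerivAt lemmas from the MWE above (fun_trans would produce the right-hand side automatically):
import Mathlib.Analysis.SpecialFunctions.Trigonometric.Deriv
open Real
-- The rewrite `fun_trans` aims to automate, discharged manually:
example (x : ℝ) : deriv (fun x : ℝ => x - sin x) x = 1 - cos x :=
  ((hasDerivAt_id x).sub (hasDerivAt_sin x)).deriv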
Tomas Skrivan (Feb 14 2024 at 12:42):
What is actually the point of all the HasDeriv variants?
Yaël Dillies (Feb 14 2024 at 12:44):
In very high generality, the derivative is not unique. In practice, the advantage is that you can use functions f, f', f'' rather than f, deriv f, deriv (deriv f).
Eric Wieser (Feb 14 2024 at 12:47):
The key advantage of HasDeriv is that it combines deriv and Differentiable into a single proposition, so you don't have to do everything twice.
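Concretely, a single HasDerivAt fact recovers both halves (a small illustration, specialized to ℝ):
import Mathlib.Analysis.Calculus.Deriv.Basic
-- One `HasDerivAt` hypothesis yields the differentiability side...
example {f : ℝ → ℝ} {f' x : ℝ} (h : HasDerivAt f f' x) :
    DifferentiableAt ℝ f x :=
  h.differentiableAt
-- ...and the value of `deriv`:
example {f : ℝ → ℝ} {f' x : ℝ} (h : HasDerivAt f f' x) : deriv f x = f' :=
  h.deriv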
Tomas Skrivan (Feb 14 2024 at 12:47):
I see. Now that I think about it, it will be useful in programming too, as I can implement a function gradient_descent (f f') (h : HasDeriv f f') and keep everything computable.
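A toy sketch of that idea; every name here is made up for illustration, and the predicate is a trivial placeholder rather than a real HasDeriv:
-- Trivial placeholder for the `HasDeriv` predicate discussed above, just to
-- keep the sketch self-contained:
def HasDeriv (f f' : Float → Float) : Prop := True
-- Hypothetical: one gradient-descent step. The proof argument ties `f'` to
-- `f` but is erased at runtime, so everything stays computable:
def gradientDescentStep (f f' : Float → Float) (_h : HasDeriv f f')
    (η x : Float) : Float :=
  x - η * f' x
#eval gradientDescentStep (fun x => x * x) (fun x => 2 * x) True.intro 0.1 3.0
-- prints roughly 2.4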
Yaël Dillies (Feb 14 2024 at 12:48):
This is similar to docs#HasSum
Tomas Skrivan (Feb 14 2024 at 12:52):
Eric Wieser said:
The key advantage of HasDeriv is that it combines deriv and Differentiable into a single proposition, so you don't have to do everything twice
Yeah, that is what I always thought. It is a crutch, and with good automation it might be obsolete.
Once fun_trans is ported, adapting it to HasDeriv will be easy, as the proof of differentiability will be sitting somewhere in the simplifier's cache.
Eric Wieser (Feb 14 2024 at 12:53):
It is a crutch, and with good automation it might be obsolete.
Well, even if the human isn't doing everything twice, the automation still has to, as does the type-checker.
Eric Wieser (Feb 14 2024 at 12:55):
(It's perhaps worth remembering that docs#Differentiable is effectively defined in terms of docs#HasDeriv!)
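Indeed, DifferentiableAt unfolds to an existential over HasFDerivAt (shown here specialized to ℝ; the general definition is the same up to the ambient spaces):
import Mathlib.Analysis.Calculus.FDeriv.Basic
-- `DifferentiableAt` is, by definition, "some `f'` with `HasFDerivAt f f' x`":
example {f : ℝ → ℝ} {x : ℝ} :
    DifferentiableAt ℝ f x ↔ ∃ f' : ℝ →L[ℝ] ℝ, HasFDerivAt f f' x :=
  Iff.rfl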
Geoffrey Irving (Feb 14 2024 at 13:28):
Even in human mathematics HasDerivAt comes first. :)
Tomas Skrivan (Feb 15 2024 at 13:55):
Yaël Dillies said:
Tomas Skrivan, can fun_prop handle goals containing metavariables? If so, one could use fun_prop to compute the derivative of the function above by doing
example {x : ℝ} : HasDerivWithinAt (fun x ↦ x - sin x) _ (interior (Ici 0)) x := by fun_prop
Thanks for asking this question :) Previously, I had to deal with unknown data for ContDiff ℝ n? f, so it took only a small tweak to fun_prop and voilà, it works. This (test file):
#check ((by fun_prop) : HasDeriv (fun x : α => x * x * x) _)
prints
(_ : HasDeriv (fun x => x * x * x) fun x dx => (dx * x + dx * x) * x + dx * (x * x))
Right now, I have tested it only with a spoofed version of HasDeriv. Once Mathlib recompiles, I will try it with the real thing.
PR: #10593
Yaël Dillies (Feb 15 2024 at 14:11):
Ahah, that's great!
Yaël Dillies (Feb 15 2024 at 14:12):
All I can say is thank you to Bhavik for having taught me the power of whining
Yaël Dillies (Feb 15 2024 at 14:14):
Let me try to put together compute_deriv as a wrapper around fun_prop.
Tomas Skrivan (Feb 15 2024 at 16:50):
Great, it is working with the actual HasFDerivAt. The ergonomics are not the best, as the field 𝕜 can't be inferred from HasFDerivAt (fun x : ℝ => x * x) _ a.
Example:
#check ((by fun_prop (disch:=assumption)) : (@HasFDerivAt ℝ _ _ _ _ _ _ _ (fun x : ℝ => (x+3)*(x*x + 1)⁻¹) _ a))
prints the horrendous
(_ : HasFDerivAt (fun y => (y + 3) * (y * y + 1)⁻¹)
((a + 3) • ContinuousLinearMap.comp (-((ContinuousLinearMap.mulLeftRight ℝ ℝ) (a * a + 1)⁻¹) (a * a + 1)⁻¹)
(a • ContinuousLinearMap.id ℝ ℝ + a • ContinuousLinearMap.id ℝ ℝ) + (a * a + 1)⁻¹ • ContinuousLinearMap.id ℝ ℝ) a)
In SciLean, I have lambda notation for bundled morphisms: fun x =>L[ℝ] x + 3•x. Using that, the result would be
(_ : HasFDerivAt (fun y => (y + 3) * (y * y + 1)⁻¹)
(fun dx =>L[ℝ] (((a + 3) • (-(a • dx + a • dx) * (a * a + 1)⁻¹ * (a * a + 1)⁻¹) + (a * a + 1)⁻¹ • dx)
a)
which is much, much nicer. It might be worth considering this for Mathlib.
Tomas Skrivan (Feb 15 2024 at 16:52):
Also, using a simproc, it should be easy to let the simplifier simplify the function body of a bundled morphism and then call fun_prop to reprove the function property.
Yaël Dillies (Feb 15 2024 at 19:55):
Here's my tactic idea (not yet PR'd because fun_prop doesn't handle HasDeriv):
/-- A tactic to compute the derivative of a function. -/
macro "compute_deriv " f:term : tactic =>
`(tactic| have (x) : HasDerivAt $f _ x := by fun_prop (disch := assumption))
Tomas Skrivan (Feb 15 2024 at 19:59):
This would not work, as it would complain it can't infer the field 𝕜. Also, what is x? You use it as a new identifier and also in the statement HasDerivAt $f _ x. And what would be the use case for this? If a function expects HasDerivAt, you can just call fun_prop directly. I can imagine one would like to transform fderiv 𝕜 f x into the actual derivative, and that would require more meta code.
Yaël Dillies (Feb 15 2024 at 20:04):
A typical use case would be #10525, where all I need is some derivative to exist. Then I can massage it into a form where I can show it's positive/nonnegative/whatever.
Tomas Skrivan (Feb 15 2024 at 20:05):
Ohh, I understand the have (x) : ... x := syntax now!
Yaël Dillies (Feb 15 2024 at 20:07):
Today's world is Lean 4's world :sunglasses: (with its fancy new syntax)
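For readers meeting it for the first time: have accepts binders before the colon, so the macro above introduces a universally quantified hypothesis. A tiny self-contained illustration:
example : True := by
  -- `have (x) : P x := proof` introduces `this : ∀ x, P x`:
  have (n : Nat) : n + 0 = n := rfl
  trivial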
Tomas Skrivan (Feb 15 2024 at 20:12):
Yaël Dillies said:
A typical use case would be #10525, where all I need is some derivative to exist. Then I can massage it into a form where I can show it's positive/nonnegative/whatever.
I see, but wouldn't you just call this?
refine monotoneOn_of_hasDerivWithinAt_nonneg (convex_Icc ..) (f := (fun x ↦ 1 - 2 / π ^ 2 * x ^ 2 - cos x)) (by fun_prop) (by fun_prop) fun x hx ↦ sub_nonneg.2 ?_
Yaël Dillies (Feb 15 2024 at 20:15):
Hmm, because I hadn't thought of it. Let me try it again once fun_prop support for HasDerivWithinAt has magically appeared.