Zulip Chat Archive
Stream: maths
Topic: convolution of functions
Floris van Doorn (Jan 25 2022 at 10:32):
I am planning to define the convolution of two functions.
Is anyone already working on this? (@Anatole Dedecker @Yury G. Kudryashov)
Yaël Dillies (Jan 25 2022 at 10:34):
This is not directly related, but Alex and I have incidence algebras over at branch#incidence_algebra.
Patrick Massot (Jan 25 2022 at 11:56):
Yaël, is there any relation at all?
Yaël Dillies (Jan 25 2022 at 11:57):
Multiplication in the incidence algebra is a convolution.
Floris van Doorn (Jan 25 2022 at 12:38):
I think that is a different meaning of the word convolution: https://en.wikipedia.org/wiki/Dirichlet_convolution
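(For reference, Dirichlet convolution is the arithmetic-function operation
$$(f * g)(n) = \sum_{d \mid n} f(d)\, g(n/d),$$
whereas the convolution discussed in this thread is the integral $\int f(t)\, g(x - t)\, d\mu(t)$; the two are analogous but not the same notion.)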
Yaël Dillies (Jan 25 2022 at 13:14):
It's analogous at least.
Floris van Doorn (Jan 25 2022 at 13:42):
True
Floris van Doorn (Jan 25 2022 at 13:52):
Questions about design decisions:
- Convolution takes 2 functions as arguments. Currently I have the domain of these functions as an additive (topological/measurable) group. This group carries some measure that is invariant under translation (so I cleaned this notion up a bit: #11655).
- For many results I also want the measure to be invariant under negation. Is there a name for this property on measures? (One possible phrasing is sketched below.)
- Also, do we want the domain group to be able to be multiplicative? This is going to be a nightmare though: to_additive won't be happy that we want to additivize the group operation on the domain, but not on the codomain.
- Should I try to make the codomain a normed_ring? Or is is_R_or_C enough?
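A minimal sketch of one way the negation-invariance property could be phrased, as equality of μ with its pushforward under negation. The class name is_neg_invariant and the exact form are illustrative only, not an existing mathlib definition:
import measure_theory.measure.measure_space

open measure_theory

-- Sketch: "invariant under negation" as `μ.map has_neg.neg = μ`.
-- The name `is_neg_invariant` is only illustrative here.
class is_neg_invariant (G : Type*) [measurable_space G] [has_neg G] (μ : measure G) : Prop :=
(map_neg_eq_self : μ.map has_neg.neg = μ)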
Patrick Massot (Jan 25 2022 at 14:00):
I think we also want to allow the two functions to land in different places. We could have one function (seen as a weight function) landing in the real numbers and the other one (seen as a function we want to average) in a real vector space.
Patrick Massot (Jan 25 2022 at 14:00):
When we want to apply convolution to smoothing functions.
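Concretely, the setup described here pairs a real-valued weight with a vector-valued function via scalar multiplication, something like
$$(f \star g)(x) = \int_G f(t)\, g(x - t)\, d\mu(t), \qquad f : G \to \mathbb{R},\quad g : G \to E,$$
with $E$ a real vector space (a sketch of the intended special case, not a fixed design).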
Floris van Doorn (Jan 25 2022 at 14:01):
I see, good point.
Yaël Dillies (Jan 25 2022 at 14:01):
Again, have a look at branch#incidence_algebra :wink:
Yaël Dillies (Jan 25 2022 at 14:01):
Everything there is done in terms of scalar actions.
Patrick Massot (Jan 25 2022 at 14:01):
Yaël, this is pretty difficult because you merge master very often
Patrick Massot (Jan 25 2022 at 14:02):
We should probably look at the diff with the last master you merged
Yaël Dillies (Jan 25 2022 at 14:02):
What does this have to do with anything? You can just compare on Github?
Yaël Dillies (Jan 25 2022 at 14:03):
Here, you see the file and it's all clean: https://github.com/leanprover-community/mathlib/compare/incidence_algebra
Yury G. Kudryashov (Jan 25 2022 at 14:23):
If you open a WIP PR, then it will be easier to compare.
Yaël Dillies (Jan 25 2022 at 14:24):
I'm planning on opening a PR for the first half soonish, but sure
Heather Macbeth (Jan 25 2022 at 14:27):
Floris van Doorn said:
I am planning to define the convolution of two functions.
Is anyone already working on this? (Anatole Dedecker Yury G. Kudryashov)
That would be great!
@Thomas Browning is working on Pontryagin duals, and there's a connection: the Pontryagin dual of a locally compact abelian topological group is canonically homeomorphic to the space of characters of the Banach algebra defined on L¹(G) by convolution. This enters into the proof of the general Plancherel theorem (their L² spaces are canonically isomorphic).
Thomas and I discussed this briefly before but I don't know which part of the subject he's working on currently.
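For reference, the classical statements being alluded to (nothing formalized here): for a locally compact abelian group $G$ with Haar measure and Pontryagin dual $\hat G$,
$$\hat f(\chi) = \int_G f(x)\, \overline{\chi(x)}\, dx \quad (f \in L^1(G)), \qquad \int_G |f(x)|^2\, dx = \int_{\hat G} |\hat f(\chi)|^2\, d\chi$$
for a suitably normalized Haar measure on $\hat G$, and the Fourier transform extends to an isometric isomorphism $L^2(G) \cong L^2(\hat G)$.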
Yaël Dillies (Jan 25 2022 at 14:29):
Here you go: #11656
Heather Macbeth (Jan 25 2022 at 14:43):
Floris van Doorn said:
- Also, do we want the domain group to be able to be multiplicative? This is going to be a nightmare though: to_additive won't be happy that we want to additivize the group operation on the domain, but not on the codomain.
I think yes, eventually, eg for the Selberg trace formula. But it could maybe just be a TODO for now.
Thomas Browning (Jan 25 2022 at 17:49):
Heather Macbeth said:
Thomas and I discussed this briefly before but I don't know which part of the subject he's working on currently.
Right now, I'm still showing that the Pontryagin dual is a topological group, but I'm getting close.
Floris van Doorn (Feb 17 2022 at 17:19):
Update:
Until today I was working on a definition of convolution where one function is scalar-valued and the other function is vector-valued (𝕜 is a normed field and E is a normed space over it):
def convolution [has_sub G] (f : G → 𝕜) (g : G → E) (μ : measure G . volume_tac) : G → E :=
λ x, ∫ t, f t • g (x - t) ∂μ
However, to work with convolution of vector-valued functions properly, we need a generalization where both functions are vector-valued, and there is some continuous bilinear map to "multiply" the two function values (E and E' are normed spaces over 𝕜):
def convolution [has_sub G] (f : G → E) (g : G → E') (L : E →L[𝕜] E' →L[𝕜] F)
  (μ : measure G . volume_tac) : G → F :=
λ x, ∫ t, L (f t) (g (x - t)) ∂μ
The reason is the following. Suppose that I want to write down that the derivative of f ⋆ g is f' ⋆ g. If f is a function in 1 variable, this is fine, and we get something like has_deriv_at (f ⋆ g) ((deriv f ⋆ g) x₀) x₀. However, if f is a function in multiple variables, then fderiv 𝕜 f is a linear map, and no longer a scalar value (so the old definition of convolution is not applicable).
We can state this with the new definition:
def precompL (L : E →L[𝕜] E' →L[𝕜] F) : (E'' →L[𝕜] E) →L[𝕜] E' →L[𝕜] (E'' →L[𝕜] F) := ...
lemma (...) : has_fderiv_at (f ⋆[L; μ] g) ((fderiv 𝕜 f ⋆[L.precompL G; μ] g) x₀) x₀
Note that we could have used the old definition for directional derivatives, but then we would not get to use all the properties of fderiv.
I looked with @Patrick Massot, and all books we looked at so far only define convolution for scalar-valued functions, and prove differentiation properties only using directional derivatives. However, it seems useful, and I expect that this generality will be useful in other applications as well.
Additional advantages of the new definition:
- This generalization also makes it possible to use @[to_additive] everywhere (since we no longer have a multiplicative mul/smul that cannot become additive).
- The definition now also is a lot more symmetric than before (the definition is still asymmetric, but at least the type is symmetric in the two functions).
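For what it's worth, the earlier scalar-weight version should be recoverable as a special case by instantiating L with a scalar-multiplication bilinear map. A hedged sketch, assuming the convolution definition above is in scope and assuming I remember the name and typeclass assumptions of continuous_linear_map.lsmul correctly (the name convolution_smul is purely illustrative):
-- Sketch: the scalar-weight convolution as the special case
-- `L = continuous_linear_map.lsmul 𝕜 𝕜 : 𝕜 →L[𝕜] E →L[𝕜] E`,
-- i.e. scalar multiplication packaged as a continuous bilinear map.
def convolution_smul (f : G → 𝕜) (g : G → E) (μ : measure G . volume_tac) : G → E :=
convolution f g (continuous_linear_map.lsmul 𝕜 𝕜) μ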
Yaël Dillies (Feb 17 2022 at 17:40):
But... smul can become additive. What about vadd?
Yaël Dillies (Feb 17 2022 at 17:41):
Else, I agree that this looks like a useful generalization.
Yury G. Kudryashov (Feb 17 2022 at 17:57):
You need a bilinear map. Vadd isn't bilinear
Yury G. Kudryashov (Feb 17 2022 at 17:58):
Does it make sense to use semilinear maps here? I don't know the answer.
Yaël Dillies (Feb 17 2022 at 18:34):
smul neither. What do you mean?
Floris van Doorn (Feb 17 2022 at 18:36):
Yaël Dillies said:
But... smul can become additive. What about vadd?
There is no smul in the new version of the definition (and in most lemmas about it).
Damiano Testa (Feb 17 2022 at 19:19):
In algebraic geometry, an analogue of convolution is called Fourier-Mukai transform. In that context, you do exactly what you are suggesting: you multiply the functions by a "kernel" before integrating one of the variables out.
Damiano Testa (Feb 17 2022 at 19:26):
(In that same context, things are even more general, you integrate f t * g s * K t s over t. The case of the "normal" convolution is when f and g have the same/dual domain and K is a δ-function supported in the antidiagonal.)
Matthew Ballard (Feb 17 2022 at 20:00):
Damiano Testa said:
In algebraic geometry, an analogue of convolution is called Fourier-Mukai transform. In that context, you do exactly what you are suggesting: you multiply the functions by a "kernel" before integrating one of the variables out.
This is often also called an integral transform.
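As a rough dictionary (a sketch of the standard picture, nothing formalized here): an integral transform with kernel $K$ acts by
$$(T_K f)(x) = \int_G f(t)\, K(t, x)\, d\mu(t),$$
and convolution with $g$ is the special case $K(t, x) = g(x - t)$, which recovers $(f \star g)(x) = \int_G f(t)\, g(x - t)\, d\mu(t)$.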
Kevin Buzzard (Feb 17 2022 at 21:23):
If f : G → E and g : G → F can you convolute them to get f.g : G → E ⊗ F and then get everything else from that?
Eric Wieser (Feb 18 2022 at 00:56):
Or even maybe G → E × F? Is linearity actually needed here?
Will Sawin (Feb 19 2022 at 02:46):
Eric Wieser said:
Or even maybe G → E × F? Is linearity actually needed here?
Yes, for the integration.
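Presumably the point is that the continuous bilinear map is what makes the integrand both land in a single Banach space $F$ and be controlled in norm: for continuous bilinear $L$,
$$\bigl\| L\,(f(t))\,(g(x - t)) \bigr\| \le \|L\|\, \|f(t)\|\, \|g(x - t)\|,$$
so integrability reduces to integrability of $\|f(\cdot)\|\,\|g(x - \cdot)\|$, and (bi)linearity is also what lets constants and derivatives be pulled through the integral. With a bare pair in $E \times F$ the two values would never be combined.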
Floris van Doorn (Apr 20 2022 at 12:42):
I made a sequence of PRs ending in #13540 that provides an API for convolutions.
Notable properties:
- Prove that when one of the functions has compact support and is C^n and the other function is locally integrable, the convolution is C^n (a rough statement sketch follows this list).
- Compute the total derivative of the convolution (when one of the functions has compact support).
- Prove that when taking the convolution with functions that "tend to the Dirac delta function", the convolution tends to the original function.
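For instance, the first bullet is a statement of roughly the following shape, in the notation used earlier in this thread (a sketch only; the actual lemma names and exact hypotheses in #13540 may differ):
-- Sketch: if `f` is `C^n` with compact support and `g` is locally integrable,
-- then the convolution is `C^n`.  The lemma name here is illustrative.
lemma cont_diff_convolution_of_compact_support
  (hcf : has_compact_support f) (hf : cont_diff 𝕜 n f)
  (hg : locally_integrable g μ) :
  cont_diff 𝕜 n (f ⋆[L; μ] g) :=
sorry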
Jireh Loreaux (Apr 20 2022 at 14:34):
I guess I need to PR approximate units now so that you can have a def for "tend to the Dirac delta function." :grinning: