Zulip Chat Archive

Stream: lean4

Topic: Can fun_prop fill in metavariables?


Frédéric Dupuis (Aug 02 2024 at 02:57):

@Tomas Skrivan I have a quick question about fun_prop: am I right in thinking that this is a bad fun_prop lemma?

@[fun_prop]
theorem ContinuousOn.comp'' {g : β → γ} {f : α → β} {s : Set α} {t : Set β} (hg : ContinuousOn g t)
    (hf : ContinuousOn f s) (h : Set.MapsTo f s t) : ContinuousOn (fun x => g (f x)) s :=
  ContinuousOn.comp hg hf h

My understanding is that since t doesn't appear in the conclusion ContinuousOn (fun x => g (f x)) s, fun_prop will never be able to figure out what t should be, and so this will never work.
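For a quick illustration (this example is not from the original message, and assumes import Mathlib plus the declaration above): the same conclusion can be reached with completely different choices of t (for instance t := Set.univ whenever g happens to be continuous everywhere), so the goal alone gives no way to pick it.

example (f g : ℝ → ℝ) (s : Set ℝ) (hg : Continuous g) (hf : ContinuousOn f s) :
    ContinuousOn (fun x => g (f x)) s :=
  -- `t` is a free choice here; nothing in the goal hints at what it should be
  ContinuousOn.comp'' (t := Set.univ) hg.continuousOn hf (Set.mapsTo_univ f s)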

Tomas Skrivan (Aug 02 2024 at 19:46):

fun_prop is smart enough to sometimes delay theorem arguments and infer them from assumptions in the context. Thus ContinuousOn.comp'' is definitely a valid fun_prop theorem, but I'm unsure of its usefulness, as I'm not sure there is a good tactic to discharge the subgoal Set.MapsTo f s t.

Here are a few examples of when fun_prop fills in metavariables and when it fails:

import Mathlib

attribute [fun_prop] ContDiff.differentiable

set_option trace.Meta.Tactic.fun_prop true

-- My main motivation for filling in metavariables was something like this:
-- `fun_prop` applies `ContDiff.differentiable` with subgoal `ContDiff ℝ ?n f` and `?n` is later filled in by unifying with `hf`
example (f : ℝ → ℝ) (hf : ContDiff ℝ 2 f) : Differentiable ℝ f := by fun_prop (disch:=decide)


-- If there are two different hypotheses that can fill in the metavariable then `fun_prop` can't recover from incorrectly assigning it.
-- In this example `hf'` is used first, `?n` is assigned `0` and `decide` fails to prove `1 ≤ 0`. `fun_prop` then fails to use `hf` because `?n` has already been assigned `0`.
-- I'm aware of where the problem is but I'm still thinking about how to fix it. My main concern is that I don't know how expensive it is to store and restore `MetaM` state, and I'm worried about performance.
example (f : ℝ → ℝ) (hf : ContDiff ℝ 2 f) (hf' : ContDiff ℝ 0 f) : Differentiable ℝ f := by fun_prop (disch:=decide)

-- This works because `hf` is used first
example (f : ℝ → ℝ) (hf' : ContDiff ℝ 0 f) (hf : ContDiff ℝ 2 f) : Differentiable ℝ f := by fun_prop (disch:=decide)


-- Here `fun_prop` fails: applying `Differentiable.continuous` gets stuck because the typeclass argument `NontriviallyNormedField ?𝕜` can't be inferred.
-- The current implementation of `fun_prop` requires that typeclass arguments are inferred immediately.
-- This should not be too hard to fix.
example (f : ℝ → ℝ) (hf : ContDiff ℝ 2 f) : Continuous f := by fun_prop


attribute [fun_prop] ContinuousOn.comp''

-- It looks like `ContinuousOn.comp''` can be used if all the assumptions align.
-- However, I'm unsure whether there is any good tactic able to discharge subgoals like `Set.MapsTo f {1}ᶜ {0}ᶜ` in more complicated scenarios.
example
  (f : ℝ → ℝ) (hf : ContinuousOn f {1}ᶜ)
  (g : ℝ → ℝ) (hg : ContinuousOn g {0}ᶜ)
  (h : Set.MapsTo f {1}ᶜ {0}ᶜ) :
  ContinuousOn (fun x => g (f x)) {1}ᶜ := by fun_prop (disch:=assumption)

Tomas Skrivan (Aug 02 2024 at 19:52):

@Jireh Loreaux Here is a concrete example of what I describe here and what you expressed concern about.

example (f : ℝ → ℝ) (hf : ContDiff ℝ 2 f) (hf' : ContDiff ℝ 0 f) : Differentiable ℝ f := by fun_prop (disch:=decide) -- fails as `?n` is assigned `0` and `fun_prop` can't recover from it

I'm still unsure about the best way to address it.
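For reference, the save/restore pattern in question looks roughly like this (a generic sketch of the usual MonadBacktrack idiom, not fun_prop's actual code; tryWithRollback is a made-up name):

import Lean
open Lean Meta

-- Generic sketch: run `k` and, if it throws or returns `false`, restore the
-- saved `MetaM` state so that any metavariable assignments made along the way
-- are rolled back. The open question is how costly doing this at every
-- candidate hypothesis would be.
def tryWithRollback (k : MetaM Bool) : MetaM Bool := do
  let s ← saveState
  try
    if (← k) then
      return true
    else
      restoreState s
      return false
  catch _ =>
    restoreState s
    return false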

Frédéric Dupuis (Aug 02 2024 at 20:00):

I see, thanks for the explanation!

Jireh Loreaux (Aug 02 2024 at 21:01):

Yeah, I would expect the point of ContinuousOn.comp'' is exactly what you suggest: someone writes a have statement providing the Set.MapsTo fact and then uses fun_prop with a discharger that can make use of it.
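For example, something along these lines (a sketch, not from the thread; it assumes the import Mathlib and attribute [fun_prop] ContinuousOn.comp'' setup from the snippet above, and Set.mapsTo_preimage is only used to make the have easy):

example (f g : ℝ → ℝ) (t : Set ℝ)
    (hg : ContinuousOn g t) (hf : ContinuousOn f (f ⁻¹' t)) :
    ContinuousOn (fun x => g (f x)) (f ⁻¹' t) := by
  -- provide the `Set.MapsTo` fact up front ...
  have h : Set.MapsTo f (f ⁻¹' t) t := Set.mapsTo_preimage f t
  -- ... so the `assumption` discharger can hand it to `fun_prop` when it
  -- applies `ContinuousOn.comp''`
  fun_prop (disch:=assumption)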

