Zulip Chat Archive

Stream: Brownian motion

Topic: Adapted Filtrations for Markov Chains and Markov Processes


David Ledvinka (Dec 14 2025 at 17:46):

This is not directly related to the Brownian Motion project but I figured this would be the best channel to post this since, a) this channel probably contains most of the people who have answers, b) it may be relevant to the project down the line, c) this may involve refactoring definitions important to the project.

I have started thinking a little bit about formalizing Markov chains and Markov processes, and I realized that the definition of an Adapted process requires a topology on the state space (since it is stated via strong measurability). This makes sense for martingales, since you need to take expectations on the state space as part of the definition, but it is not necessary for Markov chains. Should we have Adapted and StronglyAdapted? Is there a way to do this that avoids duplicating everything?
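For concreteness, a measurability-only variant might look something like the following. This is a hypothetical sketch: the name `MeasurableAdapted` and the exact binders are illustrative, not anything in Mathlib.

```lean
import Mathlib.Probability.Process.Adapted

open MeasureTheory

variable {Ω ι E : Type*} [Preorder ι] {mΩ : MeasurableSpace Ω}
  [MeasurableSpace E]

-- hypothetical: `Adapted` with `Measurable` in place of `StronglyMeasurable`,
-- so no topology on the state space `E` is needed
def MeasurableAdapted (f : Filtration ι mΩ) (u : ι → Ω → E) : Prop :=
  ∀ i, Measurable[f i] (u i)
```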

Etienne Marion (Dec 14 2025 at 17:52):

I think it would make sense to have StronglyAdapted yes. I came across this a while ago: #mathlib4 > Measurable functions, natural filtration and adapted process

Rémy Degenne (Dec 14 2025 at 18:08):

Yes, this question has come up several times, and that's evidence we need both definitions.

David Ledvinka (Dec 14 2025 at 20:04):

#32882

Joris van Winden (Feb 15 2026 at 16:03):

David Ledvinka said:

This is not directly related to the Brownian Motion project but I figured this would be the best channel to post this since, a) this channel probably contains most of the people who have answers, b) it may be relevant to the project down the line, c) this may involve refactoring definitions important to the project.

I have started thinking a little bit about formalizing Markov chains and Markov processes, and I realized that the definition of an Adapted process requires a topology on the state space (since it is stated via strong measurability). This makes sense for martingales, since you need to take expectations on the state space as part of the definition, but it is not necessary for Markov chains. Should we have Adapted and StronglyAdapted? Is there a way to do this that avoids duplicating everything?

Did you ever get around to formalizing Markov processes? I want to prove the law of reflection for Brownian motion, and would greatly benefit from a theorem which says 'Markov implies strong Markov'. But this theorem holds at a high level of generality and requires appropriate definitions of the weak and strong Markov property.

David Ledvinka (Feb 15 2026 at 16:21):

No, partly because I got stuck on this issue #mathlib4 > Conditional Independence for Not Standard Borel Spaces and wanted to come back to it once I had the opportunity to see what the consequences of generalising would be. I did work on it a bit and have some rough ideas but then got busy with other things.

Joris van Winden (Feb 15 2026 at 17:18):

Do you mind if I give it a try? As a first attempt, I came up with the following definitions:

```lean
variable {Ω : Type*} {mΩ : MeasurableSpace Ω} {P : Measure Ω}
variable {ι : Type*} [Preorder ι] {β : ι → Type*} [m : (i : ι) → MeasurableSpace (β i)]

-- `MeasurableSpace` containing all information up to time `i`
-- same definition used in `Filtration.natural`, but with less restrictive assumptions
def past (X : (i : ι) → Ω → β i) (i : ι) : MeasurableSpace Ω :=
  ⨆ j ≤ i, .comap (X j) (m j)

-- `MeasurableSpace` containing all information after time `i`
-- can alternatively be defined using `past` and `OrderDual`
def future (X : (i : ι) → Ω → β i) (i : ι) : MeasurableSpace Ω :=
  ⨆ j ≥ i, .comap (X j) (m j)

-- past and future are conditionally independent given the present
def IsMarkovProcess [StandardBorelSpace Ω] [IsFiniteMeasure P]
    {X : (i : ι) → Ω → β i} (hX : (i : ι) → Measurable (X i)) : Prop :=
  ∀ i : ι, CondIndep (.comap (X i) (m i)) (past X i) (future X i) (hX i).comap_le P
```

David Ledvinka (Feb 15 2026 at 17:46):

I do not mind. Some comments:

Firstly, I think one should first define a Markov Process with respect to a filtration (similar to how Martingales are already defined). See the first "alternative formulation" here: https://en.wikipedia.org/wiki/Markov_property.
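Following that "alternative formulation", a filtration-relative definition could be sketched roughly as below. This is a hypothetical sketch just to illustrate the shape: the name `IsMarkovWrt` and the exact statement are mine, and it simply replaces the `past` σ-algebra with `ℱ i`.

```lean
open MeasureTheory ProbabilityTheory

variable {Ω ι : Type*} [Preorder ι] {mΩ : MeasurableSpace Ω}
  [StandardBorelSpace Ω] {P : Measure Ω} [IsFiniteMeasure P]
  {β : ι → Type*} [m : (i : ι) → MeasurableSpace (β i)]

-- hypothetical: `X` is Markov w.r.t. `ℱ` if the future after `i` is
-- conditionally independent of `ℱ i` given the present `X i`
def IsMarkovWrt (ℱ : Filtration ι mΩ) (X : (i : ι) → Ω → β i)
    (hX : ∀ i, Measurable (X i)) : Prop :=
  ∀ i, CondIndep (.comap (X i) (m i)) (ℱ i) (⨆ j ≥ i, .comap (X j) (m j))
    (hX i).comap_le P
```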

With regard to `Filtration.natural` and all similar things (like I did for `Adapted` in the PR above), I think we will, annoyingly, just need to have strongly measurable and measurable versions.

As I mentioned in the other thread, I am not a huge fan of the StandardBorelSpace assumption, but it's probably fine to use it for now and then fix it later.

Joris van Winden (Feb 15 2026 at 18:05):

Thanks! Could you explain why you prefer the definition w.r.t. a filtration? For martingales I get that a filtration cannot be avoided, since it is an intrinsic part of the definition, but for Markov processes this is not the case.

In either case, I think it is important that the definition should be symmetric under time/order reversal, since a time-reversed Markov process is Markov. So any definition featuring a filtration should (annoyingly) probably also include a backwards filtration.
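One way to restore the symmetry, as a purely hypothetical sketch: carry both a forward filtration `ℱ` and a backward filtration `𝒢` (a `Filtration` over the dual order, so it is decreasing), and require conditional independence of the two given the present. The name `IsMarkovWrt₂` and the statement are illustrative only.

```lean
open MeasureTheory ProbabilityTheory

variable {Ω ι : Type*} [Preorder ι] {mΩ : MeasurableSpace Ω}
  [StandardBorelSpace Ω] {P : Measure Ω} [IsFiniteMeasure P]
  {β : ι → Type*} [m : (i : ι) → MeasurableSpace (β i)]

-- hypothetical: a two-filtration Markov property, symmetric under time
-- reversal: `ℱ` collects the past, `𝒢 : Filtration ιᵒᵈ mΩ` the future
def IsMarkovWrt₂ (ℱ : Filtration ι mΩ) (𝒢 : Filtration ιᵒᵈ mΩ)
    (X : (i : ι) → Ω → β i) (hX : ∀ i, Measurable (X i)) : Prop :=
  ∀ i, CondIndep (.comap (X i) (m i)) (ℱ i) (𝒢 (OrderDual.toDual i))
    (hX i).comap_le P
```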

David Ledvinka (Feb 15 2026 at 19:30):

Why is it more intrinsic to martingales? One could only consider the natural filtration for martingales too. The more general definition allows you to say that it remains a Markov process with additional information.

Joris van Winden (Feb 15 2026 at 20:10):

I see your point. I also checked the textbooks I am familiar with, and most of them use a definition of markovianity which includes a filtration. Still, it slightly bugs me that this definition breaks time-reversal symmetry.

Joris van Winden (Feb 17 2026 at 15:58):

Slightly related to this issue: I saw that random times were recently changed from `τ : Ω → ι` to `τ : Ω → WithTop ι`. But why shouldn't a random time be able to take the value `-∞`?

I could easily see this come up in the theory of random dynamical systems, where it is not uncommon for processes to have `ℝ` as the index set.
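For what it's worth, a codomain allowing both endpoints is already expressible by stacking the order completions, so a hypothetical two-sided random time could be phrased as:

```lean
-- hypothetical: a random time allowed to be `-∞` (`⊥`) as well as `+∞` (`⊤`),
-- e.g. for processes indexed by `ℝ`
def TwoSidedRandomTime (Ω ι : Type*) := Ω → WithBot (WithTop ι)
```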


Last updated: Feb 28 2026 at 14:05 UTC