In the mathematical theory of probability, a Doob martingale (named after Joseph L. Doob,[1] also known as a Lévy martingale) is a stochastic process that approximates a given random variable and has the martingale property with respect to the given filtration. It may be thought of as the evolving sequence of best approximations to the random variable based on information accumulated up to a certain time.
When analyzing sums, random walks, or other additive functions of independent random variables, one can often apply the central limit theorem, law of large numbers, Chernoff's inequality, Chebyshev's inequality or similar tools. When analyzing similar objects where the differences are not independent, the main tools are martingales and Azuma's inequality.
Let $Y$ be any random variable with $\mathbb{E}[|Y|]<\infty$. Suppose $\{\mathcal{F}_{0},\mathcal{F}_{1},\dots\}$ is a filtration, i.e. $\mathcal{F}_{s}\subset\mathcal{F}_{t}$ when $s<t$. Define

$$Z_{t}=\mathbb{E}[Y\mid\mathcal{F}_{t}],$$

then $\{Z_{0},Z_{1},\dots\}$ is a martingale,[2] namely a Doob martingale, with respect to the filtration $\{\mathcal{F}_{0},\mathcal{F}_{1},\dots\}$.

To see this, note that

- $\mathbb{E}[|Z_{t}|]=\mathbb{E}\big[|\mathbb{E}[Y\mid\mathcal{F}_{t}]|\big]\leq\mathbb{E}\big[\mathbb{E}[|Y|\mid\mathcal{F}_{t}]\big]=\mathbb{E}[|Y|]<\infty$;
- $\mathbb{E}[Z_{t}\mid\mathcal{F}_{t-1}]=\mathbb{E}\big[\mathbb{E}[Y\mid\mathcal{F}_{t}]\mid\mathcal{F}_{t-1}\big]=\mathbb{E}[Y\mid\mathcal{F}_{t-1}]=Z_{t-1}$, as $\mathcal{F}_{t-1}\subset\mathcal{F}_{t}$ (by the tower property of conditional expectation).
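As a concrete illustration (a standard example, not drawn from the cited references): if $Y$ is a sum of i.i.d. integrable random variables $X_{1},\dots,X_{n}$ with common mean $\mu$ and $\mathcal{F}_{t}=\sigma(X_{1},\dots,X_{t})$, the conditional expectation has a closed form:

$$Z_{t}=\mathbb{E}\left[\sum_{i=1}^{n}X_{i}\;\middle|\;\mathcal{F}_{t}\right]=\sum_{i=1}^{t}X_{i}+(n-t)\mu,$$

so each step replaces one unrevealed summand by its mean, and $\mathbb{E}[Z_{t}\mid\mathcal{F}_{t-1}]=Z_{t-1}$ follows directly from $\mathbb{E}[X_{t}\mid\mathcal{F}_{t-1}]=\mu$.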
In particular, for any sequence of random variables $\{X_{1},X_{2},\dots,X_{n}\}$ on probability space $(\Omega,\mathcal{F},\text{P})$ and function $f$ such that $\mathbb{E}[|f(X_{1},X_{2},\dots,X_{n})|]<\infty$, one could choose

$$Y:=f(X_{1},X_{2},\dots,X_{n})$$

and filtration $\{\mathcal{F}_{0},\mathcal{F}_{1},\dots\}$ such that

$$\mathcal{F}_{0}:=\{\varnothing,\Omega\},\qquad \mathcal{F}_{t}:=\sigma(X_{1},X_{2},\dots,X_{t}),\ \forall t\geq 1,$$

i.e. the $\sigma$-algebra generated by $X_{1},X_{2},\dots,X_{t}$. Then, by the definition of the Doob martingale, the process $\{Z_{0},Z_{1},\dots\}$ where

$$\begin{aligned}Z_{0}&:=\mathbb{E}[f(X_{1},X_{2},\dots,X_{n})\mid\mathcal{F}_{0}]=\mathbb{E}[f(X_{1},X_{2},\dots,X_{n})],\\ Z_{t}&:=\mathbb{E}[f(X_{1},X_{2},\dots,X_{n})\mid\mathcal{F}_{t}]=\mathbb{E}[f(X_{1},X_{2},\dots,X_{n})\mid X_{1},X_{2},\dots,X_{t}],\quad\forall t\geq 1,\end{aligned}$$

forms a Doob martingale. Note that $Z_{n}=f(X_{1},X_{2},\dots,X_{n})$. This martingale can be used to prove McDiarmid's inequality.
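To see the construction concretely, the following Python sketch (illustrative only; the choice of $f$ and all names are hypothetical, not from the sources cited) computes $Z_{t}$ exactly for i.i.d. fair bits by enumerating the coordinates not yet revealed, and checks the martingale property and that $Z_{n}=f(X_{1},\dots,X_{n})$:

```python
from itertools import product

def doob_martingale(f, n, prefix):
    """Z_t = E[f(X_1,...,X_n) | X_1,...,X_t = prefix] for i.i.d. fair bits,
    computed exactly by averaging f over all equally likely completions."""
    completions = list(product([0, 1], repeat=n - len(prefix)))
    return sum(f(prefix + rest) for rest in completions) / len(completions)

def longest_run(bits):
    """Example f: length of the longest run of 1s."""
    best = cur = 0
    for b in bits:
        cur = cur + 1 if b else 0
        best = max(best, cur)
    return best

n = 4
prefix = (1, 0, 1, 1)  # one realization of X_1, ..., X_4
for t in range(n + 1):
    print(f"Z_{t} =", doob_martingale(longest_run, n, prefix[:t]))

# Martingale property: Z_{t-1} is the average of Z_t over the next bit.
for t in range(1, n + 1):
    z_prev = doob_martingale(longest_run, n, prefix[:t - 1])
    avg = 0.5 * (doob_martingale(longest_run, n, prefix[:t - 1] + (0,))
                 + doob_martingale(longest_run, n, prefix[:t - 1] + (1,)))
    assert abs(z_prev - avg) < 1e-12

# Z_n coincides with f evaluated at the fully revealed sequence.
assert doob_martingale(longest_run, n, prefix) == longest_run(prefix)
```

In general the conditional expectation is not computable in closed form; exhaustive enumeration is feasible here only because each $X_{i}$ takes two values.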
McDiarmid's inequality
The Doob martingale was introduced by Joseph L. Doob in 1940 to establish concentration inequalities such as McDiarmid's inequality, which applies to functions that satisfy a bounded differences property (defined below) when evaluated on independent random arguments.
A function $f:\mathcal{X}_{1}\times\mathcal{X}_{2}\times\cdots\times\mathcal{X}_{n}\rightarrow\mathbb{R}$ satisfies the bounded differences property if substituting the value of the $i$th coordinate $x_{i}$ changes the value of $f$ by at most $c_{i}$. More formally, if there are constants $c_{1},c_{2},\dots,c_{n}$ such that for all $i\in[n]$, and all $x_{1}\in\mathcal{X}_{1},\,x_{2}\in\mathcal{X}_{2},\ldots,\,x_{n}\in\mathcal{X}_{n}$,

$$\sup_{x_{i}'\in\mathcal{X}_{i}}\left|f(x_{1},\dots,x_{i-1},x_{i},x_{i+1},\ldots,x_{n})-f(x_{1},\dots,x_{i-1},x_{i}',x_{i+1},\ldots,x_{n})\right|\leq c_{i}.$$
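For instance (a standard illustration, not in the source text): the empirical mean $f(x_{1},\dots,x_{n})=\frac{1}{n}\sum_{i=1}^{n}x_{i}$ on $\mathcal{X}_{i}=[0,1]$ satisfies the bounded differences property with $c_{i}=1/n$, since altering a single coordinate moves the average by at most $1/n$.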
McDiarmid's Inequality[1]: Let $f:\mathcal{X}_{1}\times\mathcal{X}_{2}\times\cdots\times\mathcal{X}_{n}\rightarrow\mathbb{R}$ satisfy the bounded differences property with bounds $c_{1},c_{2},\dots,c_{n}$. Consider independent random variables $X_{1},X_{2},\dots,X_{n}$ where $X_{i}\in\mathcal{X}_{i}$ for all $i$. Then, for any $\varepsilon>0$,

$$\text{P}\left(f(X_{1},X_{2},\ldots,X_{n})-\mathbb{E}[f(X_{1},X_{2},\ldots,X_{n})]\geq\varepsilon\right)\leq\exp\left(-\frac{2\varepsilon^{2}}{\sum_{i=1}^{n}c_{i}^{2}}\right),$$

$$\text{P}\left(f(X_{1},X_{2},\ldots,X_{n})-\mathbb{E}[f(X_{1},X_{2},\ldots,X_{n})]\leq-\varepsilon\right)\leq\exp\left(-\frac{2\varepsilon^{2}}{\sum_{i=1}^{n}c_{i}^{2}}\right),$$

and as an immediate consequence,

$$\text{P}\left(|f(X_{1},X_{2},\ldots,X_{n})-\mathbb{E}[f(X_{1},X_{2},\ldots,X_{n})]|\geq\varepsilon\right)\leq 2\exp\left(-\frac{2\varepsilon^{2}}{\sum_{i=1}^{n}c_{i}^{2}}\right).$$
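As a quick numerical sanity check (a minimal sketch under our own choice of parameters, not from the sources cited): for the empirical mean of $n$ fair bits, $\mathbb{E}[f]=1/2$ and $c_{i}=1/n$, so $\sum_{i}c_{i}^{2}=1/n$ and the two-sided bound reads $2\exp(-2\varepsilon^{2}n)$. The following Python snippet compares the bound to a simulated tail probability:

```python
import math
import random

def empirical_tail(n, eps, trials=20_000, seed=1):
    """Estimate P(|mean of n fair bits - 1/2| >= eps) by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.randint(0, 1) for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            hits += 1
    return hits / trials

n, eps = 100, 0.1
bound = 2 * math.exp(-2 * eps**2 * n)  # sum_i c_i^2 = n * (1/n)^2 = 1/n
print(f"empirical tail ~ {empirical_tail(n, eps):.4f} <= bound {bound:.4f}")
```

For these parameters the bound is $2e^{-2}\approx 0.27$, comfortably above the simulated tail; McDiarmid's inequality trades tightness in specific cases for applicability to any bounded-differences function.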