

Wednesday, January 28, 2015

Convergence Concepts

Implications of convergence

(Figure: diagram of the implications among the modes of convergence.)

1. Almost Sure Convergence

Examples of statements that hold almost surely (a.s.)

  • Let $X, X'$ be two random variables.  Then $X = X'$ a.s. means $P[X = X'] = 1$; that is, there exists an event $N \in \mathcal{B}$, such that $P(N) = 0$ and if $\omega \in N^c$, then $X(\omega) = X'(\omega)$.
  • If $\{X_n\}$ is a sequence of random variables, then $\lim_{n\to\infty} X_n$ exists a.s. means there exists an event $N \in \mathcal{B}$, such that $P(N) = 0$ and if $\omega \in N^c$ then $\lim_{n\to\infty} X_n(\omega)$ exists. It also means that for a.a. $\omega$, $\limsup_{n\to\infty} X_n(\omega) = \liminf_{n\to\infty} X_n(\omega)$. We will write $\lim_{n\to\infty} X_n = X$ a.s. or $X_n \stackrel{a.s.}{\to} X$.
  • If $\{X_n\}$ is a sequence of random variables, then $\sum_n X_n$ converges a.s. means there exists an event $N \in \mathcal{B}$, such that $P(N) = 0$, and $\omega \in N^c$ implies $\sum_n X_n(\omega)$ converges.
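
To make almost sure convergence concrete, here is a small simulation (my own illustration, not from the notes): by the strong law of large numbers, the running mean of iid fair coin flips converges to $\frac{1}{2}$ along almost every individual sample path, which is exactly the path-by-path behavior the definition demands.

```python
import random

random.seed(2)

# Almost sure convergence along individual sample paths: by the SLLN, the
# running mean of n iid fair coin flips converges to 1/2 for almost every
# omega, i.e. path by path, not merely "on average" across paths.
def running_mean_path(n):
    """Simulate one sample path: the mean of n fair coin flips."""
    total = sum(random.random() < 0.5 for _ in range(n))
    return total / n

# Every simulated path should already sit close to 1/2 at n = 100,000.
paths = [running_mean_path(100_000) for _ in range(20)]
```

Each entry of `paths` plays the role of $X_n(\omega)$ for one fixed $\omega$ at a large $n$; the point is that all of them are close to the limit, not just their average.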

2. Convergence in Probability

Suppose $X_n, n \geq 1$, and $X$ are random variables.  Then $X_n$ converges in probability (i.p.) to $X$, written $X_n \stackrel{P}{\to} X$, if for any $\epsilon > 0$, $\lim_{n\to\infty} P[|X_n - X| > \epsilon] = 0$.
Almost sure convergence of $\{X_n\}$ demands that for a.e. $\omega$, $X_n(\omega) - X(\omega)$ gets small and stays small.  Convergence i.p. is weaker and merely requires that the probability of the difference $X_n(\omega) - X(\omega)$ being non-trivial becomes small.

It is possible for a sequence to converge in probability but not almost surely.
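
A standard textbook illustration (my addition, not stated in these notes): take independent indicators $X_n$ with $P[X_n = 1] = 1/n$ and $X_n = 0$ otherwise. Then $P[|X_n| > \epsilon] = 1/n \to 0$, so $X_n \stackrel{P}{\to} 0$; but $\sum_n 1/n = \infty$, so by the second Borel–Cantelli lemma $X_n = 1$ infinitely often a.s., and the sequence does not converge almost surely. A minimal simulation of the in-probability half:

```python
import random

random.seed(0)

# Independent X_n with P[X_n = 1] = 1/n (a classical example): the
# exceedance probability P[|X_n| > 1/2] = 1/n shrinks to 0, giving
# convergence in probability to 0, even though Borel-Cantelli shows
# X_n = 1 infinitely often along almost every sample path.
def estimate_exceedance(n, trials=100_000):
    """Empirically estimate P[|X_n| > 1/2] for X_n ~ Bernoulli(1/n)."""
    return sum(random.random() < 1 / n for _ in range(trials)) / trials

p10 = estimate_exceedance(10)      # close to 1/10
p1000 = estimate_exceedance(1000)  # close to 1/1000
```

The simulation only exhibits the shrinking probabilities; the failure of a.s. convergence is the Borel–Cantelli argument above, which no finite simulation can display directly.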

Theorem 1 (Convergence a.s. implies convergence i.p.). Suppose that $X_n, n \geq 1$, and $X$ are random variables on a probability space $(\Omega, \mathcal{B}, P)$.  If $X_n \to X$ a.s., then $X_n \stackrel{P}{\to} X$.
Proof.  If $X_n \to X$ a.s., then for any $\epsilon > 0$,
$$0 = P([|X_n - X| > \epsilon] \ \text{i.o.}) = P\left(\limsup_{n\to\infty} [|X_n - X| > \epsilon]\right) = \lim_{N\to\infty} P\left(\bigcup_{n \geq N} [|X_n - X| > \epsilon]\right) \geq \limsup_{n\to\infty} P[|X_n - X| > \epsilon].$$
The first equality holds because, off a null set, $|X_n - X| > \epsilon$ for only finitely many $n$; the third is continuity from above of $P$; and the inequality follows since $[|X_N - X| > \epsilon] \subset \bigcup_{n \geq N} [|X_n - X| > \epsilon]$ for each $N$. Hence $P[|X_n - X| > \epsilon] \to 0$. $\square$

3. Lp Convergence

Recall the notation $X \in L_p$, which means $E(|X|^p) < \infty$. For random variables $X, Y \in L_p$, we define the $L_p$ metric for $p \geq 1$ by
$$d(X, Y) = (E|X - Y|^p)^{1/p}.$$
This metric is norm-induced because
$$\|X\|_p := (E|X|^p)^{1/p}$$
is a norm on the space $L_p$.

A sequence $\{X_n\}$ of random variables converges in $L_p$ to $X$, written $X_n \stackrel{L_p}{\to} X$, if
$$E(|X_n - X|^p) \to 0 \quad \text{as } n \to \infty.$$
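
As a toy illustration of the definition (my own example, not from the notes): take $X_n = U/n$ with $U$ uniform on $(0,1)$. Then $E(|X_n - 0|^p) = \int_0^1 (u/n)^p\,du = \frac{1}{(p+1)\,n^p} \to 0$, so $X_n \stackrel{L_p}{\to} 0$ for every $p \geq 1$. The moments can be checked exactly with rational arithmetic:

```python
from fractions import Fraction

# Toy sequence (author's illustration): X_n = U / n, U ~ Uniform(0, 1).
# E(|X_n - 0|^p) = integral_0^1 (u/n)^p du = 1 / ((p + 1) * n^p).
def lp_error(n, p):
    """Exact value of E|X_n|^p for this toy sequence."""
    return Fraction(1, (p + 1) * n**p)

# The p = 2 errors shrink toward 0, so X_n -> 0 in L^2.
moments = [lp_error(n, 2) for n in (1, 10, 100)]
```

Using `Fraction` keeps the computation exact, so the decay $1/((p+1)n^p)$ is visible without floating-point noise.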

Facts about Lp convergence.
  1. $L_p$ convergence implies convergence in probability: for $p > 0$, if $X_n \stackrel{L_p}{\to} X$ then $X_n \stackrel{P}{\to} X$.  This follows readily from Chebychev's inequality: $P[|X_n - X| \geq \epsilon] \leq \dfrac{E(|X_n - X|^p)}{\epsilon^p} \to 0$.
  2. Convergence in probability does not imply $L_p$ convergence.  What can go wrong is that the $n$th function in the sequence can be huge on a very small set.
    Example.  Let the probability space be $([0,1], \mathcal{B}([0,1]), \lambda)$, where $\lambda$ is Lebesgue measure, and define
    $$X_n = 2^n 1_{(0, \frac{1}{n})}.$$
    Then
    $$P[|X_n| > \epsilon] = P\left(\left(0, \tfrac{1}{n}\right)\right) = \frac{1}{n} \to 0, \quad \text{but} \quad E(|X_n|^p) = \frac{2^{np}}{n} \to \infty.$$
  3. $L_p$ convergence does not imply almost sure convergence.
    Example.  Consider the functions $\{X_n\}$ defined on $([0,1], \mathcal{B}([0,1]), \lambda)$, where $\lambda$ is Lebesgue measure:
    $$X_1 = 1_{[0,\frac{1}{2}]}, \quad X_2 = 1_{[\frac{1}{2},1]}, \quad X_3 = 1_{[0,\frac{1}{3}]}, \quad X_4 = 1_{[\frac{1}{3},\frac{2}{3}]}, \quad X_5 = 1_{[\frac{2}{3},1]}, \quad X_6 = 1_{[0,\frac{1}{4}]},$$
    and so on. Note that for any $p > 0$,
    $$E(|X_1|^p) = E(|X_2|^p) = \frac{1}{2}, \quad E(|X_3|^p) = E(|X_4|^p) = E(|X_5|^p) = \frac{1}{3}, \quad E(|X_6|^p) = \frac{1}{4},$$
    so $E(|X_n|^p) \to 0$ and $X_n \stackrel{L_p}{\to} 0$.
    Observe that $\{X_n\}$ does not converge almost surely to $0$: for every $\omega \in [0,1]$, $X_n(\omega) = 1$ for infinitely many $n$, because the indicator intervals repeatedly sweep across all of $[0,1]$.
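
Both counterexamples can be checked with exact rational arithmetic. The sketch below assumes the "typewriter" pattern suggested by the example, with block $k$ consisting of the $k$ intervals $[\frac{j-1}{k}, \frac{j}{k}]$, $j = 1, \dots, k$:

```python
from fractions import Fraction

# Fact 2's example: X_n = 2^n on (0, 1/n).  For small eps the tail
# probability is 1/n, while the L^p moment 2^(n p)/n blows up.
def tail_prob(n):
    return Fraction(1, n)

def moment(n, p=1):
    return Fraction(2**(n * p), n)

# Fact 3's "typewriter" sequence: block k sweeps [0, 1] with k intervals
# [(j-1)/k, j/k].  E(|X_n|^p) equals the interval length 1/k -> 0, yet
# any fixed omega lies in some interval of every block, so X_n(omega) = 1
# infinitely often and there is no almost sure convergence to 0.
def typewriter(blocks):
    return [(Fraction(j - 1, k), Fraction(j, k))
            for k in range(1, blocks + 1) for j in range(1, k + 1)]

ivs = typewriter(50)
omega = Fraction(1, 3)                        # any fixed point works
hits = sum(a <= omega <= b for a, b in ivs)   # at least one hit per block
```

The exact arithmetic makes the tension explicit: in Fact 2 the probabilities shrink while the moments explode; in Fact 3 the interval lengths (hence the moments) shrink while every fixed $\omega$ keeps being covered.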
