Does $X_n \overset{L^2}{\rightarrow} X$ imply $X_n^2 \overset{L^1}{\rightarrow} X^2$?

Suppose $X_n, X$ are random variables with $X_n \overset{L^2}{\rightarrow} X$. Does this imply $X_n^2 \overset{L^1}{\rightarrow} X^2$?

$X_n \overset{L^2}{\rightarrow} X$ means $E(|X_n-X|^2) \rightarrow 0$, and what we have to show is that $E(|X_n^2 - X^2|) \rightarrow 0$.

=================


 

Convergence in $L^2$ is often meant as assuming that $X$ and every $X_n$ are all in $L^2$. Then indeed the implication holds.
– Did
yesterday

=================

2 Answers

=================

As Did and saz mentioned, the standard definition of convergence in $L^2$ stipulates that all the random variables involved be square-integrable. However, OP took the trouble of defining in their question what $L^2$ convergence meant for them, and this definition does not involve the above-mentioned stipulation. Moreover, after I posted my original answer 15 hours ago, which covered only the case that $X$ was square-integrable, OP commented on my answer and told me in no uncertain terms that they were interested in the case where there were no restrictions on any of the random variables involved. The fact that OP did not accept my answer then, which was the only game in town at that point, further drove the point home. Therefore, in what follows I do not presuppose this condition. In fact, I don't even presuppose that any of the random variables are integrable.

Case 1: $\mathbf{E(X^2) < \infty}$

If $E(X^2) < \infty$, the implication holds.

Firstly note that, given a sequence $(a_1, a_2, \dots)$ of real numbers,
$$\lim_{n\rightarrow\infty} a_n = 0 \iff \lim_{n\rightarrow \infty} a_n^2 = 0.$$

Secondly note that, since $E(|X_n-X|^2) \rightarrow 0$, we may assume, w.l.o.g., that the sequence $(E(|X_1-X|^2), E(|X_2-X|^2), \dots)$ is bounded, say by $L \in [0,\infty)$.

Thirdly note that we may assume, w.l.o.g., that the sequence $(E(X_1^2), E(X_2^2), \dots)$ is bounded by $M := L + 2\sqrt{LE(X^2)} + E(X^2)$. Indeed,
\begin{align}
E(X_n^2) &= E\Big(\big((X_n-X)+X\big)^2\Big) \\
&\leq E(|X_n-X|^2)+2E(|X_n-X||X|) + E(X^2) \\
&\overset{\text{Cauchy-Schwarz}}{\leq} E(|X_n-X|^2)+2\sqrt{E(|X_n-X|^2)E(X^2)} + E(X^2) \\
&\leq L + 2\sqrt{LE(X^2)} + E(X^2).
\end{align}

Now write
\begin{align}
E(|X_n^2-X^2|) &= E(|X_n^2-X_nX+X_nX-X^2|) \\
&= E(|X_n(X_n-X)+X(X_n-X)|) \\
&\leq E(|X_n||X_n-X|)+E(|X||X_n-X|).
\end{align}

To see that the left-hand summand of the last expression converges to zero, note that
$$E^2(|X_n||X_n-X|) \overset{\text{Cauchy-Schwarz}}{\leq} \underset{\leq M}{E(X_n^2)}\ \cdot\ \underset{\rightarrow 0\text{ by assump.}}{E(|X_n-X|^2)} \rightarrow 0.$$

A similar argument shows that the other summand converges to zero too.

Case 2: $\mathbf{E(X^2) = \infty}$

If $E(X^2) = \infty$, the implication does not hold. Here's a counter-example.

For every $n \in \{1, 2, \dots\}$ define
\begin{align}
a_n &:= \frac{1}{2}\sqrt{n}(n+1), \\
b_n &:= \frac{1}{2}\sqrt{n}(n-1) = a_n\frac{n-1}{n+1}.
\end{align}

Verify that every pair $(a_n, b_n)$, $n \in \{1, 2, \dots\}$, satisfies $a_n > b_n \geq 0$ and solves the following system of equations:
\begin{align}
(a - b)^2 &= n, \\
a^2 - b^2 &= n^2.
\end{align}
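For a quick numerical sanity check, a few lines of Python confirm these identities (and the positivity claim) for the first several values of $n$; the helper names `a` and `b` are just shorthand for the formulas above.

```python
import math

def a(n): return 0.5 * math.sqrt(n) * (n + 1)
def b(n): return 0.5 * math.sqrt(n) * (n - 1)

for n in range(1, 11):
    an, bn = a(n), b(n)
    assert an > bn >= 0                             # a_n > b_n >= 0 (b_1 = 0)
    assert math.isclose((an - bn) ** 2, n)          # (a_n - b_n)^2 = n
    assert math.isclose(an ** 2 - bn ** 2, n ** 2)  # a_n^2 - b_n^2 = n^2
print("identities hold for n = 1, ..., 10")
```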

We will later use the following estimate: for every $m, n \in \{1, 2, \dots\}$,
$$\frac{1}{m}a_n + \left(2 - \frac{1}{m}\right)b_n - \frac{1}{m}(a_n + b_n) = 2\left(1-\frac{1}{m}\right)b_n \geq 0,$$
since $b_n \geq 0$.

Set
\begin{align}
C_2 &:= \sum_{k=1}^\infty k^{-2}, \\
C_3 &:= \sum_{k=1}^\infty k^{-3},
\end{align}

and define, for every $n \in \{0, 1, 2, \dots\}$,
$$S_n := \begin{cases}
0, & n = 0, \\
\sum_{i = 1}^n \frac{1/C_3}{i^3}, & n \geq 1.
\end{cases}$$
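The weights $\frac{1/C_3}{i^3}$ are probabilities summing to $1$, which a short numerical check of the partial sums $S_n$ illustrates (the truncation level used to approximate $C_3$ below is an arbitrary choice of this sketch):

```python
# The partial sums S_n increase towards 1, so the weights (1/C_3)/i^3 sum to 1.
C3 = sum(k ** -3 for k in range(1, 10**6 + 1))   # truncated approximation of C_3
for n in (1, 10, 100, 1_000, 100_000):
    print(n, sum((1 / C3) / i ** 3 for i in range(1, n + 1)))
```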

Consider the standard probability space $([0,1), \mathcal{B}, \lambda)$. Note that $S_n \uparrow 1$, so the intervals $[S_{n-1}, S_n)$, $n \in \{1, 2, \dots\}$, partition $[0,1)$ and $\lambda\big([S_{n-1}, S_n)\big) = \frac{1/C_3}{n^3}$. We now define two random variables $X_0, X$ on this probability space as follows.
\begin{align}
X_0 &:= \sum_{n = 1}^\infty a_n\mathbb{1}_{[S_{n-1},S_n)}, \\
X &:= \sum_{n = 1}^\infty b_n\mathbb{1}_{[S_{n-1},S_n)}.
\end{align}

Furthermore, for every $n \in \{1, 2, \dots\}$ we define
$$X_n := \frac{1}{n} X_0 + \left(1-\frac{1}{n}\right)X.$$

Observe that $X_0 > X \geq 0$, and therefore, for every $n \in \{1, 2, \dots\}$, $X_n > X \geq 0$. Also note that $X_n - X = \frac{1}{n}(X_0 - X)$ and, by the estimate above (applied on each interval $[S_{i-1}, S_i)$ with $m = n$), for every $n \in \{1, 2, \dots\}$,
$$X_n + X = \frac{1}{n}X_0 + \left(2 - \frac{1}{n}\right)X \geq \frac{1}{n}(X_0 + X).$$

Then, for every $n \in \{1, 2, \dots\}$,
\begin{align}
E(|X_n-X|^2) &= \frac{1}{n^2} E\big((X_0 - X)^2\big) \\
&= \frac{1}{n^2} \sum_{i=1}^\infty (a_i - b_i)^2 \frac{1/C_3}{i^3} \\
&= \frac{1/C_3}{n^2} \sum_{i=1}^\infty \frac{i}{i^3} \\
&= \frac{C_2/C_3}{n^2},
\end{align}
which tends to $0$ as $n \rightarrow \infty$, so $X_n \overset{L^2}{\rightarrow} X$ in the sense defined in the question. On the other hand, since $X_n \geq X \geq 0$,
\begin{align}
E\big(|X_n^2-X^2|\big) &= E\big((X_n-X)(X_n+X)\big) \\
&\geq E\Big(\frac{1}{n}(X_0-X)\cdot\frac{1}{n}(X_0+X)\Big) \\
&= \frac{1}{n^2}E\big(X_0^2 - X^2\big) \\
&= \frac{1}{n^2}\sum_{i=1}^\infty (a_i^2-b_i^2)\frac{1/C_3}{i^3} \\
&= \frac{1/C_3}{n^2} \sum_{i=1}^\infty \frac{i^2}{i^3} \\
&= \infty.
\end{align}
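These two computations are easy to reproduce numerically. The sketch below truncates the infinite sums at an arbitrary level and uses the arbitrary choice $n = 7$: the truncated value of $E(|X_n-X|^2)$ matches $\frac{C_2/C_3}{n^2}$, while the truncated sums for $E(X_n^2 - X^2)$ keep growing as the truncation level increases.

```python
import math

N = 200_000                                       # truncation level for the infinite sums (arbitrary)
C2 = sum(k ** -2 for k in range(1, N + 1))
C3 = sum(k ** -3 for k in range(1, N + 1))

def a(i): return 0.5 * math.sqrt(i) * (i + 1)
def b(i): return 0.5 * math.sqrt(i) * (i - 1)

p = [(1 / C3) / i ** 3 for i in range(1, N + 1)]  # P(X_0 = a_i) = P(X = b_i)

n = 7                                             # any fixed n works; 7 is arbitrary

# truncated E(|X_n - X|^2) versus the closed form (C_2/C_3)/n^2
l2 = sum(((a(i) - b(i)) / n) ** 2 * p[i - 1] for i in range(1, N + 1))
print(l2, (C2 / C3) / n ** 2)

# truncated E(X_n^2 - X^2): keeps growing as the truncation level M increases
def xn(i): return a(i) / n + (1 - 1 / n) * b(i)
for M in (10, 100, 1_000, 10_000, 100_000):
    print(M, sum((xn(i) ** 2 - b(i) ** 2) * p[i - 1] for i in range(1, M + 1)))
```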

Case 3: $\mathbf{E(X^2) = \infty}$ revisited

In this section I will show that it is possible to salvage some of the flavor of Case 1 even if $E(X^2) = \infty$: namely, I will show that if $E(X^2) = \infty$, then $E(X_n^2) \rightarrow \infty$.

Suppose to the contrary. Then there is some $T \in [0,\infty)$ such that, for all $i$ in some infinite subset $I \subseteq \{1, 2, \dots\}$, $E(X_i^2) \leq T$. Then, for every $i \in I$,

\begin{align}
E(X^2) &= E\Big(\big((X-X_i)+X_i\big)^2\Big) \\
&\leq E(|X_i-X|^2)+2E(|X_i-X||X_i|) + E(X_i^2) \\
&\overset{\text{Cauchy-Schwarz}}{\leq} E(|X_i-X|^2)+2\sqrt{E(|X_i-X|^2)E(X_i^2)} + E(X_i^2) \\
&\leq L + 2\sqrt{LT} + T,
\end{align}


a contradiction. ($L$ is the same bound introduced in Case 1.)

This, coupled with Case 1 and with the fact that $E(X_n) \rightarrow E(X)$ (which follows from $E(|X_n-X|) \leq \sqrt{E(|X_n-X|^2)} \rightarrow 0$), shows that, if $X, X_1, X_2, \dots$ are integrable random variables defined over the same probability space, then $E\big(|X_n-X|^2\big) \rightarrow 0$ implies $V(X_n) \rightarrow V(X)$, and this holds regardless of whether $E(X^2)$ is finite.
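As a concrete illustration of this corollary in the finite-variance case, one can simulate, say, $X \sim \mathrm{Exp}(1)$ and $X_n := X + Z/n$ with $Z$ an independent standard normal, so that $E(|X_n-X|^2) = 1/n^2 \rightarrow 0$ and $V(X_n) = 1 + 1/n^2 \rightarrow 1 = V(X)$; these distributions and the sample size are arbitrary choices that only serve to illustrate the statement.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 1_000_000                      # sample size (arbitrary)
x = rng.exponential(1.0, m)        # X ~ Exp(1), so V(X) = 1
z = rng.standard_normal(m)         # independent noise

for n in (1, 2, 5, 10, 100):
    xn = x + z / n                 # E(|X_n - X|^2) = 1/n^2 -> 0
    print(n, xn.var())             # sample variance of X_n: approx 1 + 1/n^2 -> 1 = V(X)
```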

  

 

The reason why I am asking this question is that I read that $X_n \overset{L^2}{\rightarrow} X$ implies $\mathrm{var}(X_n) \to \mathrm{var}(X)$ (without any assumptions on $X_n^2, X^2$). I tried to prove this but failed, as I think we need $X_n^2 \overset{L^1}{\rightarrow} X^2$ to show that $\mathrm{var}(X_n) \to \mathrm{var}(X)$. Am I wrong?
– mahu_83
2 days ago

  

 

@mahu_83: I have added an addendum which, together with the first part of the proof, implies that, if $X_n \overset{L^2}{\rightarrow} X$ (in the sense you defined in your question), then $\mathrm{var}(X_n) \rightarrow \mathrm{var}(X)$, without any assumptions on $X_n^2$ and $X^2$. Please consider accepting my answer. Thanks.
– Evan Aad
2 days ago

  

 

@mahu_83: I have now answered your question completely, in all the generality that you required. Additionally, I answered the question that you posed in your comment above (see Case 3). Please consider accepting my answer. Thanks.
– Evan Aad
yesterday

  

 

“Please consider accepting my answer.” I wish people would stop posting these requests. Common decency and all that…
– Did
yesterday

  

 

Following the logic of this answer (which, IMHO, is irrelevant to the question but…), one should also decide what happens when the random variables are not in $L^1$, since then the variances do not exist. Note that "not in $L^2$" in this post seems to mean "in $L^1$ but not in $L^2$".
– Did
yesterday

=================

Since

$$X_n^2 - X^2 = (X_n-X)(X_n+X),$$

we have by the Cauchy-Schwarz inequality

$$\mathbb{E}(|X_n^2-X^2|) \leq \sqrt{\mathbb{E}(|X_n-X|^2)}\, \sqrt{\mathbb{E}(|X_n+X|^2)}. \tag{1}$$

The first term on the right-hand side converges to $0$ as $n \to \infty$. For the second one, note that

$$\mathbb{E}(|X_n+X|^2) \leq 2 \mathbb{E}(X_n^2) + 2 \mathbb{E}(X^2) \xrightarrow[]{n \to \infty} 4 \mathbb{E}(X^2) < \infty.$$

(Here, we used the elementary inequality $(a+b)^2 \leq 2a^2+2b^2$.) Consequently, the claim follows by letting $n \to \infty$ in $(1)$.

Remark: The standard definition of $L^2$-convergence $X_n \to X$ requires that $X \in L^2$; note that this is, in particular, satisfied if $\mathbb{E}(|X_n-X|^2) \to 0$ and $X_k \in L^2$ for some $k \in \mathbb{N}$.

Why do we have $2 \mathbb{E}(X_n^2) + 2 \mathbb{E}(X^2) \xrightarrow[]{n \to \infty} 4 \mathbb{E}(X^2)$? I think $\mathbb{E}(X_n^2) \to \mathbb{E}(X^2)$ is the statement we have to prove.
– mahu_83
yesterday

@mahu_83 No, we have to prove that $\mathbb{E}(|X_n-X|^2) \to 0$; this is stronger than $\mathbb{E}(X_n^2) \to \mathbb{E}(X^2)$. The convergence $\mathbb{E}(X_n^2) \to \mathbb{E}(X^2)$ is a direct consequence of the reverse triangle inequality (for the $L^2$ norm), i.e. $|\|X_n\|_{L^2}-\|X\|_{L^2}| \leq \|X_n-X\|_{L^2} \to 0$. Alternatively, note that $\mathbb{E}(X_n^2) = \mathbb{E}((X_n-X+X)^2) \leq 2 \mathbb{E}((X_n-X)^2) + 2 \mathbb{E}(X^2) \xrightarrow[]{n \to \infty} 2 \mathbb{E}(X^2)$.
– saz
yesterday

Nice, but I think you wanted to write "have to prove that $\mathbb{E}(|X_n^2-X^2|) \to 0$" in the first line. Your "alternatively" part shows that the limit is bounded and finite but does not tell you what the limit is, am I correct?
– mahu_83
yesterday

@mahu_83 Proving the $L^2$-convergence is exactly what my answer is about.... please reread my answer carefully....
– saz
yesterday

My comment referred to your comment, not to your answer.
– mahu_83
yesterday
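To see inequality $(1)$ from this answer in action on a concrete example, one can estimate both sides by Monte Carlo and watch them shrink together; the distributions and sample size below are arbitrary choices, not taken from the answer.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 1_000_000
x = rng.standard_normal(m)         # X in L^2 (arbitrary choice)
z = rng.standard_normal(m)

for n in (1, 2, 5, 10, 100):
    xn = x + z / n                 # X_n -> X in L^2 as n grows
    lhs = np.mean(np.abs(xn ** 2 - x ** 2))                                  # E(|X_n^2 - X^2|)
    rhs = np.sqrt(np.mean((xn - x) ** 2)) * np.sqrt(np.mean((xn + x) ** 2))  # bound (1)
    print(n, lhs, rhs)             # Cauchy-Schwarz gives lhs <= rhs; both tend to 0
```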