Asymptotic properties of the two-stage estimator under missing response data
0 Introduction
In this paper, we consider the semiparametric regression model
Yi=Xi'β+g(Ti)+εi, i=1, 2, …, n,
(1)
where {(Xi, Yi, Ti), 1≤i≤n} is an independent and identically distributed sample, Xi=(Xi1, …, Xip)', Ti∈[0, 1], β is a p×1 vector of unknown parameters, and g(t) is an unknown smooth function on [0, 1]. The errors {εi, 1≤i≤n} are independent identically distributed random variables with Eεi=0 and Eε12=σ02>0, where Eεi and Eεi2 denote the expectation of εi and εi2, respectively, and {εi} is independent of {(Xi, Ti), 1≤i≤n}. Model (1) is a standard semiparametric regression model with extensive applications in many practical problems. Recently, model (1) has been discussed by many authors under complete data[1, 2, 3, 4]. By contrast, relatively little attention has been paid to model (1) with missing data. Reference [5] generalized the local linear estimation of [6] to the case with covariates missing at random. Reference [7] developed estimation theory for semiparametric regression analysis in the presence of missing responses. Reference [8] discussed generalized partially linear models with missing covariates, and reference [9] discussed partially linear models with responses missing at random; for more references see [10, 11, 12, 13]. This paper focuses on establishing the asymptotic normality and consistency of the two-stage estimator in the semiparametric regression model with missing response data.
In the semiparametric regression setting, the basic inference begins with the random sample
(Xi, Ti, Yi, δi), i=1, 2, …, n,
(2)
in which all the Xi and Ti are observed. If the response variable Yi with companion (Xi, Ti) is observed, its associated indicator δi is set to 1; otherwise, Yi is missing and δi=0, for each i=1, 2, …, n.
For the missing data in (2), a purely semiparametric treatment uses the missing at random (MAR) assumption, which requires that there exists a selection probability mechanism p(Xi, Ti) such that
P(δi=1|Xi, Yi, Ti)=P(δi=1|Xi, Ti)=p(Xi, Ti)
(3)
holds almost surely. In practice, (3) is a common assumption in statistical analysis with missing data and is reasonable in many practical applications; see reference [14].
1 The two-stage estimator
In this section we define the estimators analyzed in this paper and describe how the regression function is estimated.
Let α=Eg(Ti) and ei=g(Ti)-α+εi, i=1, …, n; then model (1) turns into
Yi=α+Xi'β+ei, i=1, …, n,
(4)
where e1, …, en are independent identically distributed random variables with Ee1=0 and 0<σ2=Ee12=Eε12+Var(g(T1))=σ02+σ12<∞. Model (4) can be written in the matrix form
Yn=(1n, Xn)(α, β')'+en=1nα+Xnβ+en,
(5)
where Xn=(X1, …, Xn)', Yn=(Y1, …, Yn)', en=(e1, …, en)' and 1n=(1, …, 1)'. Set Qn=Diag(δ1, …, δn), Sn=Xn'QnXn, Pn=QnXnSn-1Xn'Qn and dn=1n'(Qn-Pn)1n=$\sum\limits_{i=1}^{n}{{}}$δi-1n'Pn1n.
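As a concrete illustration of this notation, the following sketch builds Qn, Sn, Pn and dn from simulated data and checks the stated identity for dn; the sample size, dimension and 80% response rate are hypothetical choices, not from the paper.

```python
import numpy as np

# Illustrative data; n, p and the response rate are hypothetical choices.
rng = np.random.default_rng(0)
n, p = 200, 2
Xn = rng.normal(size=(n, p))
delta = rng.binomial(1, 0.8, size=n).astype(float)  # response indicators
ones = np.ones(n)

Qn = np.diag(delta)                                 # Q_n = Diag(delta_1, ..., delta_n)
Sn = Xn.T @ Qn @ Xn                                 # S_n = X_n' Q_n X_n
Pn = Qn @ Xn @ np.linalg.solve(Sn, Xn.T) @ Qn       # P_n = Q_n X_n S_n^{-1} X_n' Q_n
dn = ones @ (Qn - Pn) @ ones                        # d_n = 1_n'(Q_n - P_n) 1_n

# The identity d_n = sum_i delta_i - 1_n' P_n 1_n stated in the text:
assert np.isclose(dn, delta.sum() - ones @ Pn @ ones)
```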
In order to solve the least squares problem for model (5), we find α and β that minimize
Wn=(Yn-Xnβ-1nα)'Qn(Yn-Xnβ-1nα).
By optimization theory, we have that
$\left\{ \begin{align}
& X_{n}^{'}{{Q}_{n}}{{X}_{n}}\beta +X_{n}^{'}{{Q}_{n}}{{1}_{n}}\alpha =X_{n}^{'}{{Q}_{n}}{{Y}_{n}}, \\
& 1_{n}^{'}{{Q}_{n}}{{1}_{n}}\alpha +1_{n}^{'}{{Q}_{n}}{{X}_{n}}\beta =1_{n}^{'}{{Q}_{n}}{{Y}_{n}}, \\
\end{align} \right.$
and thus
$\left\{ \begin{align}
& \beta _{n}^{*}={{\left( X_{n}^{'}{{Q}_{n}}{{X}_{n}} \right)}^{-1}}X_{n}^{'}{{Q}_{n}}\left( {{Y}_{n}}-{{1}_{n}}\alpha _{n}^{*} \right), \\
& \alpha _{n}^{*}={{\left( 1_{n}^{'}{{Q}_{n}}{{1}_{n}} \right)}^{-1}}1_{n}^{'}{{Q}_{n}}\left( {{Y}_{n}}-{{X}_{n}}\beta _{n}^{*} \right). \\
\end{align} \right.$
(7)
Substituting the first equation of (7) into the second, we obtain αn*=dn-1[1n'(Qn-Pn)Yn], and consequently
$\left\{ \begin{align}
& \alpha _{n}^{*}=d_{n}^{-1}\left[ 1_{n}^{'}\left( {{Q}_{n}}-{{P}_{n}} \right){{Y}_{n}} \right], \\
& \beta _{n}^{*}=S_{n}^{-1}X_{n}^{'}{{Q}_{n}}{{Y}_{n}}-S_{n}^{-1}X_{n}^{'}{{Q}_{n}}{{1}_{n}}\alpha _{n}^{*}. \\
\end{align} \right.$
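The closed-form solution above can be sketched numerically as follows; the data generating process (the choices of g, β, the noise level and the 80% response rate) is made up for illustration, and the result is cross-checked against a direct weighted least squares fit on (1n, Xn).

```python
import numpy as np

# Hypothetical simulated data for illustration.
rng = np.random.default_rng(1)
n, p = 500, 2
Xn = rng.normal(size=(n, p))
Tn = rng.uniform(size=n)
beta = np.array([1.0, -0.5])
Yn = Xn @ beta + np.sin(2 * np.pi * Tn) + rng.normal(scale=0.3, size=n)
delta = rng.binomial(1, 0.8, size=n).astype(float)
ones = np.ones(n)

Qn = np.diag(delta)
Sn = Xn.T @ Qn @ Xn
Pn = Qn @ Xn @ np.linalg.solve(Sn, Xn.T) @ Qn
dn = ones @ (Qn - Pn) @ ones

# Closed-form first-stage estimators.
alpha_star = (ones @ (Qn - Pn) @ Yn) / dn
beta_star = np.linalg.solve(Sn, Xn.T @ Qn @ (Yn - ones * alpha_star))

# Cross-check against direct weighted least squares on (1_n, X_n) with weights delta.
D = np.column_stack([ones, Xn])
w = np.sqrt(delta)
coef, *_ = np.linalg.lstsq(D * w[:, None], Yn * w, rcond=None)
assert np.allclose(coef, np.r_[alpha_star, beta_star])
```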
As a result, the first stage estimator βn* of β is obtained. Substituting βn* for β in model (1), we define the nonparametric estimator of g(t) as
gn*(t)=$\sum\limits_{j=1}^{n}{{}}$Wnj(t)(Yj-Xj'βn*)δj,
(9)
where Wnj(t)=$K\left( \frac{{{T}_{j}}-t}{{{h}_{n}}} \right)/\left[ \sum\limits_{r=1}^{n}{{}}K\left( \frac{{{T}_{r}}-t}{{{h}_{n}}} \right){{\delta }_{r}} \right]$, K(·) is a kernel function and hn>0 is the bandwidth. Substituting gn*(Ti) for g(Ti) in model (1) gives the approximate linear model
Yi≈Xi'β+gn*(Ti)+εi, i=1, …, n.
(10)
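The kernel smoother gn*(t) can be sketched as below, assuming an Epanechnikov kernel and a fixed bandwidth (both illustrative choices; the paper only requires K to satisfy condition (ⅴ)).

```python
import numpy as np

def g_star(t, T, resid, delta, h):
    """Kernel-weighted average (9) of the complete-case residuals Y_j - X_j' beta_n*."""
    K = lambda u: np.maximum(0.75 * (1.0 - u ** 2), 0.0)  # Epanechnikov kernel
    w = K((T - t) / h) * delta                            # only observed j (delta_j = 1) enter
    return np.sum(w * resid) / np.sum(w)

# Toy check with made-up data: with resid_j = g(T_j) + noise the smoother
# recovers g near the evaluation point.
rng = np.random.default_rng(2)
T = rng.uniform(size=2000)
delta = rng.binomial(1, 0.8, size=2000).astype(float)
resid = np.sin(2 * np.pi * T) + rng.normal(scale=0.1, size=2000)
est = g_star(0.5, T, resid, delta, h=0.05)
assert abs(est) < 0.1  # true value g(0.5) = sin(pi) = 0
```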
Applying least squares to the complete cases of model (10), we find β to minimize
$\sum\limits_{i=1}^{n}{{}}$δi[Yi-Xi'β-gn*(Ti)]2,
(11)
and obtain the estimator of β
${\hat{\beta }}$n=Sn-1Xn'Qn(Yn-gn*(T)),
where gn*(T)=[gn*(T1), gn*(T2), …, gn*(Tn)]'. The estimator of g(t) is then
${\hat{g}}$n(t)=$\sum\limits_{j=1}^{n}{{}}$Wnj(t)(Yj-Xj'${\hat{\beta }}$n)δj.
We can now estimate θ=E(Y). The regression imputation estimator of θ is
${\hat{\theta }}$I=$\frac{1}{n}$$\sum\limits_{i=1}^{n}{{}}${δiYi+(1-δi)(Xi'${\hat{\beta }}$n+${\hat{g}}$n(Ti))}.
(12)
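An end-to-end sketch of the two-stage procedure and the imputation estimator (12), under an assumed toy model; all distributional choices, the kernel and the bandwidth are illustrative, not prescribed by the paper.

```python
import numpy as np

# Hypothetical data generating process.
rng = np.random.default_rng(3)
n = 1000
Xn = rng.normal(size=(n, 2))
Tn = rng.uniform(size=n)
beta = np.array([1.0, -0.5])
g = lambda t: np.sin(2 * np.pi * t)
Yn = Xn @ beta + g(Tn) + rng.normal(scale=0.3, size=n)
delta = rng.binomial(1, 0.8, size=n).astype(float)
ones = np.ones(n)

Qn = np.diag(delta)
Sn = Xn.T @ Qn @ Xn
Pn = Qn @ Xn @ np.linalg.solve(Sn, Xn.T) @ Qn
dn = ones @ (Qn - Pn) @ ones

# Stage 1: (alpha_n*, beta_n*), then the kernel estimate of g at each T_i.
alpha_star = (ones @ (Qn - Pn) @ Yn) / dn
beta_star = np.linalg.solve(Sn, Xn.T @ Qn @ (Yn - ones * alpha_star))

K = lambda u: np.maximum(0.75 * (1.0 - u ** 2), 0.0)  # Epanechnikov kernel
def smooth(resid, h=0.1):
    W = K((Tn[None, :] - Tn[:, None]) / h) * delta[None, :]
    return (W @ resid) / W.sum(axis=1)

g_star = smooth(Yn - Xn @ beta_star)                  # g_n*(T_i)

# Stage 2: beta_hat and g_hat, then impute the missing responses as in (12).
beta_hat = np.linalg.solve(Sn, Xn.T @ Qn @ (Yn - g_star))
g_hat = smooth(Yn - Xn @ beta_hat)
theta_I = np.mean(delta * Yn + (1.0 - delta) * (Xn @ beta_hat + g_hat))
```

Here E(Y)=0 under the assumed model, so theta_I should be close to zero for large n.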
Alternatively, we have the propensity score weighted estimator
${\hat{\theta }}$w=$\frac{1}{n}\sum\limits_{i=1}^{n}{{}}\left\{ \frac{{{\delta }_{i}}{{Y}_{i}}}{\hat{P}\left( {{X}_{i}}, {{T}_{i}} \right)}+\left( 1-\frac{{{\delta }_{i}}}{\hat{P}\left( {{X}_{i}}, {{T}_{i}} \right)} \right)\left( X_{i}^{'}{{{\hat{\beta }}}_{n}}+{{{\hat{g}}}_{n}}\left( {{T}_{i}} \right) \right) \right\},$
(13)
where ${\hat{P}}$(x, t) is a high-dimensional kernel estimator of the propensity score defined by
$\hat{P}\left( x, t \right)=\left[ \sum\limits_{j=1}^{n}{{}}{{\delta }_{j}}W\left( \frac{x-{{X}_{j}}}{{{h}_{n}}}, \frac{t-{{T}_{j}}}{{{h}_{n}}} \right) \right]/\sum\limits_{j=1}^{n}{{}}W\left( \frac{x-{{X}_{j}}}{{{h}_{n}}}, \frac{t-{{T}_{j}}}{{{h}_{n}}} \right),
(14)
where W(·, ·) is a weight function and hn is the bandwidth sequence.
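A sketch of the kernel propensity estimate (14) and the weighted estimator (13); the Gaussian product kernel, the logistic missingness model, and the use of the true regression function in place of Xi'β̂n+ĝn(Ti) are simplifying assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1000
Xn = rng.normal(size=n)                              # scalar covariate for simplicity
Tn = rng.uniform(size=n)
Yn = Xn + np.sin(2 * np.pi * Tn) + rng.normal(scale=0.3, size=n)
p_true = 1.0 / (1.0 + np.exp(-(1.0 + 0.5 * Xn)))     # MAR: depends on the covariate only
delta = (rng.uniform(size=n) < p_true).astype(float)

h = 0.3
def p_hat(x, t):
    """Nadaraya-Watson estimate of P(delta=1 | x, t) with a Gaussian product kernel."""
    W = np.exp(-0.5 * ((x - Xn) / h) ** 2) * np.exp(-0.5 * ((t - Tn) / h) ** 2)
    return np.sum(delta * W) / np.sum(W)

# Plug the fitted propensities into (13); the true regression function stands in
# for X'beta_hat + g_hat(T) to keep the sketch short.
m_hat = Xn + np.sin(2 * np.pi * Tn)
p_i = np.array([p_hat(x, t) for x, t in zip(Xn, Tn)])
theta_w = np.mean(delta * Yn / p_i + (1.0 - delta / p_i) * m_hat)
```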
2 Asymptotic properties and consistency
We explore the asymptotic distribution and consistency of all the estimators. The following notation and assumptions are needed.
(ⅰ) T1, T2, …, Tn are independent identically distributed random variables, and {Ti} is independent of {ei}.
(ⅱ) rank(Xn)=p<n.
(ⅲ) $\underset{n>p}{\mathop{\sup }}\,\frac{1_{n}^{'}{{P}_{n}}{{1}_{n}}}{n}$<1.
(ⅳ) E[g(T1)]2<∞.
(ⅴ) There exist constants 0<C1≤C2<∞ such that C1I(‖u‖<2)≤K(u)≤C2I(‖u‖<2) for all u.
(ⅵ) The probability density function r(t) of Ti satisfies
0<$\mathop {\inf }\limits_{0 \le t \le 1} $r(t)≤$\mathop {\sup }\limits_{0 \le t \le 1} $r(t)<∞.
In what follows, the main results on the asymptotic distribution and consistency for the semiparametric regression model are established.
Theorem 1 Under conditions (ⅰ)~(ⅴ), we have that
(1) αn*→α, a.s.;
(2) βn*→β, a.s. if and only if Sn-1→0;
(3) βn*$\xrightarrow{P}$β⇔βn*$\xrightarrow{{{L}_{r}}}$β (0<r≤2)⇔βn*→β, a.s.
Theorem 2 Under conditions (ⅰ)~(ⅴ), suppose that $\underset{n\to \infty }{\mathop{\lim }}\, \underset{1\le k\le n}{\mathop{\max }}\, $Xk'Sn-1Xk=0 and $\underset{n\to \infty }{\mathop{\lim }}\, $nSn-1=p(x, t)Σ, where p(x, t)>0 and Σ is a symmetric positive definite matrix. Then
(1) $\sqrt{n}$(βn*-β)$\xrightarrow{L}$N(0, σ2p(x, t)Σ);
(2) further, if condition (ⅲ) is replaced by $\underset{n>p}{\mathop{\sup }}\, $1n'Pn1n=O(1) and g is bounded, then when Sn-1→0 and nSn-1→p(x, t)Σ,
(ⅰ) ${\hat{\beta }}$n→β, a.s.;
(ⅱ) $\sqrt{n}$(${\hat{\beta }}$n-β)$\xrightarrow{L}$N(0, σ02p(x, t)Σ).
Theorem 3 Under conditions (ⅰ)~(ⅵ), suppose that in condition (ⅰ) the T1, T2, …, Tn are independent identically distributed with unknown probability density function r(t). In addition, suppose inf(n1-αhn)>0 for some α∈(1/2, 1), $\underset{n\to \infty }{\mathop{\lim }}\, \underset{1\le k\le n}{\mathop{\max }}\, $Xk'Sn-1Xk=0, hn→0 and [$\sqrt{n}$hn]/[log n]→∞. Then
(1) gn*(t)-g(t)→0, a.s. for every t∈Cr∩{t: r(t)>0}, where Cr denotes the set of continuity points of r;
(2) further, if g is bounded and Sn-1→0, then ${\hat{g}}$n(t)-g(t)→0, a.s.
Theorem 4 Under conditions (ⅰ)~(ⅵ), we have
$\sqrt{n}$(${\hat{\theta }}$-θ)$\xrightarrow{L}$N(0, σ02p(x, t)Σ).
(15)
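The √n rate in Theorems 2 and 4 can be illustrated with a small Monte Carlo sketch of the first-stage estimator; all simulation settings are arbitrary choices. Quadrupling the sample size should roughly halve the spread of βn*.

```python
import numpy as np

rng = np.random.default_rng(5)
beta = np.array([1.0, -0.5])

def beta_star(n):
    """One draw of the first-stage weighted least squares estimator beta_n*."""
    Xn = rng.normal(size=(n, 2))
    Tn = rng.uniform(size=n)
    Yn = Xn @ beta + np.sin(2 * np.pi * Tn) + rng.normal(scale=0.3, size=n)
    delta = rng.binomial(1, 0.8, size=n).astype(float)
    ones = np.ones(n)
    Qn = np.diag(delta)
    Sn = Xn.T @ Qn @ Xn
    Pn = Qn @ Xn @ np.linalg.solve(Sn, Xn.T) @ Qn
    dn = ones @ (Qn - Pn) @ ones
    a = (ones @ (Qn - Pn) @ Yn) / dn
    return np.linalg.solve(Sn, Xn.T @ Qn @ (Yn - ones * a))

reps = 200
est_small = np.array([beta_star(100) for _ in range(reps)])
est_large = np.array([beta_star(400) for _ in range(reps)])
ratio = est_small.std(axis=0) / est_large.std(axis=0)  # expect roughly 2
```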
3 Sketches of the proofs
In this section, we give the proofs of Theorems 1~3. The following lemmas are needed for the technical proofs.
Lemma 1 Suppose hn→0 and nhnd/(n1-1/rlog n)→∞, and let the kernel estimator of the nonparametric regression function under missing responses be
$\hat{g}\left( x \right)=\sum\limits_{i=1}^{n}{{}}K\left( \frac{{{X}_{i}}-x}{{{h}_{n}}} \right){{Y}_{i}}{{\delta }_{i}}/\sum\limits_{j=1}^{n}{{}}K\left( \frac{{{X}_{j}}-x}{{{h}_{n}}} \right){{\delta }_{j}}=\sum\limits_{i=1}^{n}{{}}W_{ni}^{'}\left( x \right){{Y}_{i}}.$
Then $\underset{n\to \infty }{\mathop{\lim }}\, \hat{g}$(x)=g(x), a.s.
Proof Similar to the theorem in reference[15].
Proof of Theorem 1 Similar to Lemma 2.1 in reference [16].
Proof of Theorem 2
(1) $\sqrt{n}$(βn*-β)=$\sqrt{n}$[Sn-1Xn'QnYn-Sn-1Xn'Qn1ndn-11n'(Qn-Pn)Yn-β]=
$\sqrt{n}$[Sn-1Xn'Qn(1nα+Xnβ+en)-Sn-1Xn'Qn1ndn-11n'(Qn-Pn)(1nα+Xnβ+en)-β]=
$\sqrt{n}$[Sn-1Xn'Qnen-Sn-1Xn'Qn1ndn-11n'(Qn-Pn)Xnβ-Sn-1Xn'Qn1ndn-11n'(Qn-Pn)en]=
$\sqrt{n}$[Sn-1Xn'Qnen-Sn-1Xn'Qn1ndn-11n'(Qn-Pn)en]=
$\sqrt{n}$Sn-1Xn'Qnen-$\sqrt{n}$Sn-1Xn'Qn1ndn-11n'(Qn-Pn)en$\triangleq $J1-J2.
It is easy to prove that J2$\xrightarrow{P}$0. For J1=$\sqrt{n}$Sn-1Xn'Qnen, we have EJ1=0 and VarJ1=Var($\sqrt{n}$Sn-1Xn'Qnen)→σ2p(x, t)Σ. By the Lindeberg theorem, we have
J1$\xrightarrow{L}$N(0, σ2p(x, t)Σ).
(2) Firstly, we prove the conclusion (ⅰ) of (2).
${\hat{\beta }}$n-β=Sn-1Xn'Qn(Yn-gn*(T))-β=
Sn-1Xn'Qn(Xnβ+g(T)+εn-gn*(T))-β=
Sn-1Xn'Qnεn-Sn-1Xn'Qn(gn*(T)-g(T)).
By Lemma 1 in reference [17], to prove (ⅰ) it suffices to show that
Sn-1Xn'Qn(gn*(T)-g(T))→0, a.s.
Since $\underset{n\to \infty }{\mathop{\lim }}\, $nSn-1=p(x, t)Σ>0, there exist c3>c4>0 such that c4≤|Sn|/n≤c3 for large n, and hence
‖Sn-1Xn'Qn‖≤‖Sn-1‖‖Xn'Qn‖≤c<∞.
According to the first conclusion of Theorem 3, gn*(T)-g(T)→0, a.s. Thus,
Sn-1Xn'Qn(gn*(T)-g(T))→0, a.s.
Now we prove conclusion (ⅱ) of (2).
$\sqrt{n}$(${\hat{\beta }}$n-β)=$\sqrt{n}$[Sn-1Xn'Qn(Yn-gn*(T))-β]=
$\sqrt{n}$Sn-1Xn'Qnεn-$\sqrt{n}$Sn-1Xn'Qn(gn*(T)-g(T)).
Because ‖Xn'Qn/n‖≤$\sqrt{\left\| {{S}_{n}} \right\|/n}$≤c, nSn-1→p(x, t)Σ is symmetric and positive definite, and gn*(T)-g(T)→0, a.s., we obtain
$\sqrt{n}$Sn-1Xn'Qn(gn*(T)-g(T))=nSn-1·$\frac{1}{\sqrt{n}}$Xn'Qn(gn*(T)-g(T))→0, a.s.
The term $\sqrt{n}$Sn-1Xn'Qnεn is a normalized sum of independent random vectors with
E($\sqrt{n}$Sn-1Xn'Qnεn)=0
and
Var($\sqrt{n}$Sn-1Xn'Qnεn)→p(x, t)σ02Σ.
It follows from the Lindeberg theorem that
$\sqrt{n}$Sn-1Xn'Qnεn$\xrightarrow{L}$N(0, p(x, t)σ02Σ),
namely, $\sqrt{n}$(${\hat{\beta }}$n-β)$\xrightarrow{L}$N(0, p(x, t)σ02Σ). This completes the proof of Theorem 2.
Proof of Theorem 3
(1) Let Wn(t)=(Wn1(t), …, Wnn(t))' and fix t∈Cr∩{t: r(t)>0}. Since
gn*(t)=$\sum\limits_{j=1}^{n}{{}}$Wnj(t)(Yj-Xj'βn*)δj=
$\sum\limits_{j=1}^{n}{{}}$Wnj(t)(Yj-Xj'β+Xj'β-Xj'βn*)δj=
Wn'(t)Qn(Yn-Xnβ)-Wn'(t)QnXn(βn*-β)$\triangleq $J1-J2,
and
J1=Wn'(t)Qn(Yn-Xnβ)=$\sum\limits_{j=1}^{n}{{}}$Wnj(t)(Yj-Xj'β)δj=
$\sum\limits_{j=1}^{n}{{}}$Wnj'(t)(g(tj)+εj)=$\sum\limits_{j=1}^{n}{{}}$Wnj'(t)kj,
where Wnj'(t)=Wnj(t)δj and the kj=g(tj)+εj are i.i.d. with 0<Ek12<∞ and E(k1|t1=t)=g(t). By Lemma 1, J1→g(t), a.s. Thus, we need only prove J2→0, a.s.
J2=Wn'(t)QnXn(βn*-β)=
Wn'(t)QnXn[Sn-1Xn'QnYn-Sn-1Xn'Qn1nαn*-β]=
Wn'(t)Pnen-Wn'(t)Pn1n(αn*-α).
Let
bnk=$\sum\limits_{j=1}^{n}{{}}$Wnj(t)ajk(n)=$\sum\limits_{j=1}^{n}{{}}$K$\left( \frac{{{t}_{j}}-t}{{{h}_{n}}} \right)$ajk(n)/$\sum\limits_{i=1}^{n}{{}}$K$\left( \frac{{{t}_{i}}-t}{{{h}_{n}}} \right)$δi,
where Pn=(aij(n)). Now we prove Wn'(t)Pnen→0, a.s. It is true that
$W_{n}^{'}\left( t \right){{P}_{n}}{{e}_{n}}=\sum\limits_{i=1}^{n}{{}}{{b}_{ni}}{{e}_{i}}={{\left( {{f}_{n}}\left( t \right) \right)}^{-1}}\left\{ \sum\limits_{k=1}^{n}{{}}\frac{1}{n{{h}_{n}}}\left[ \sum\limits_{j=1}^{n}{{}}K\left( \frac{{{t}_{j}}-t}{{{h}_{n}}} \right)a_{jk}^{\left( n \right)} \right]{{e}_{k}} \right\}\triangleq {{\left( {{f}_{n}}\left( t \right) \right)}^{-1}}{{{\bar{u}}}_{n}}\left( t \right),$
where fn(t)=$\frac{1}{n{{h}_{n}}}\sum\limits_{i=1}^{n}{{}}K\left( \frac{{{t}_{i}}-t}{{{h}_{n}}} \right){{\delta }_{i}}$.
It follows from Lemma 3 in reference [9] that ${{{\bar{u}}}_{n}}$(t)→0, a.s., and by Lemma 4 in reference [17], fn(t)→p(x, t)r(t)>0, a.s. Thus,
Wn'(t)Pnen→0, a.s.
(16)
For the term Wn'(t)Pn1n(αn*-α), let ujk(n)=$\frac{{{A}_{nj}}\left( {{\delta }_{k}}-{{A}_{nk}} \right)}{{{d}_{n}}}$, where Ani=$\sum\limits_{j=1}^{n}{{}}$aij(n). Then
Wn'(t)Pn1n(αn*-α)=Wn'(t)Pn1ndn-1[1n'(Qn-Pn)en]=
(fn(t))-1$\frac{1}{n{{h}_{n}}}$$\sum\limits_{k=1}^{n}{{}}$$\sum\limits_{j=1}^{n}{{}}$K$\left( \frac{{{t}_{j}}-t}{{{h}_{n}}} \right)$ujk(n)ek$\triangleq $
(fn(t))-1Vn(t).
By condition (ⅱ), 0<dn=1n'(Qn-Pn)1n=$\sum\limits_{k=1}^{n}{{}}$(δk-Ank)=$\sum\limits_{k=1}^{n}{{}}$(δk-Ank)2. Since
$\sum\limits_{k=1}^{n}{{}}$Ank2=1n'Pn1n=$\sum\limits_{k=1}^{n}{{}}$Ank,
we have
$\sum\limits_{j, k=1}^{n}{{}}$(ujk(n))2=dn-2$\sum\limits_{j=1}^{n}{{}}$$\sum\limits_{k=1}^{n}{{}}$Anj2(δk-Ank)2=
dn-1$\sum\limits_{j=1}^{n}{{}}$Anj2=dn-1(1n'Pn1n)=$\frac{1_{n}^{'}{{P}_{n}}{{1}_{n}}}{\sum\limits_{j=1}^{n}{{}}{{\delta }_{j}}-1_{n}^{'}{{P}_{n}}{{1}_{n}}}$=$\frac{1}{\sum\limits_{j=1}^{n}{{}}{{\delta }_{j}}/1_{n}^{'}{{P}_{n}}{{1}_{n}}-1}$≤c,
when n>p. The Cauchy-Schwarz inequality yields
$\sum\limits_{k=1}^{n}{{}}$$\sum\limits_{r=1}^{n}{{}}$ukk(n)urr(n)=dn-2$\sum\limits_{k=1}^{n}{{}}$$\sum\limits_{r=1}^{n}{{}}$Ank(1-Ank)Anr(1-Anr)≤
dn-2[$\sum\limits_{k=1}^{n}{{}}$$\sum\limits_{r=1}^{n}{{}}$Ank2(1-Anr)2]1/2[$\sum\limits_{k=1}^{n}{{}}$$\sum\limits_{r=1}^{n}{{}}$Anr2(1-Ank)2]1/2≤c,
when n≥p. Therefore,
$\sum\limits_{k=1}^{n}{{}}$($\sum\limits_{j=1}^{n}{{}}$ujk(n))2≤n$\sum\limits_{k=1}^{n}{{}}$$\sum\limits_{j=1}^{n}{{}}$(ujk(n))2≤cn.
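The identities $\sum\limits_{k=1}^{n}{{}}$Ank=1n'Pn1n=$\sum\limits_{k=1}^{n}{{}}$Ank2 used above rest on Pn being symmetric and idempotent (since Qn2=Qn); a quick numeric check with arbitrary simulated data:

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 50, 3
Xn = rng.normal(size=(n, p))
delta = rng.binomial(1, 0.7, size=n).astype(float)
Qn = np.diag(delta)
Sn = Xn.T @ Qn @ Xn
Pn = Qn @ Xn @ np.linalg.solve(Sn, Xn.T) @ Qn
A = Pn.sum(axis=0)                       # A_nk = sum_j a_jk^(n)

assert np.allclose(Pn, Pn.T)             # P_n is symmetric
assert np.allclose(Pn @ Pn, Pn)          # and idempotent, since Q_n^2 = Q_n
assert np.isclose(A.sum(), (A ** 2).sum())
```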
Since ujk(n) plays the same role in Vn(t) as ajk(n) does in ${{{\bar{u}}}_{n}}$(t), Vn(t)→0, a.s. can be obtained in the same way. Combining this with Lemma 4 in reference [17], we have
Wn'(t)Pn1n(αn*-α)→0, a.s.
(17)
(16) and (17) show that J2→0, a.s. This completes the proof of (1) of Theorem 3.
(2) It is not difficult to obtain
${\hat{g}}$n(t)=$\sum\limits_{j=1}^{n}{{}}$Wnj(t)(Yj-Xj'${\hat{\beta }}$n)δj=
Wn'(t)Qn(Yn-Xnβ)-Wn'(t)QnXn(${\hat{\beta }}$n-β),
${\hat{\beta }}$n-β=Sn-1Xn'Qn(Yn-gn*(T))-β=
Sn-1Xn'Qnεn-Sn-1Xn'Qn(gn*(T)-g(T)),
and hence
${\hat{g}}$n(t)=Wn'(t)Qn(Yn-Xnβ)-Wn'(t)QnXnSn-1Xn'Qnεn+
Wn'(t)QnXnSn-1Xn'Qn(gn*(T)-g(T))=
Wn'(t)Qn(Yn-Xnβ)-Wn'(t)Pnεn+
Wn'(t)Pn(gn*(T)-g(T))$\triangleq $
I1-I2+I3.
(18)
It has been proved that
I1=Wn'(t)Qn(Yn-Xnβ)→g(t), a.s.
and
I2=Wn'(t)Pnεn=$\sum\limits_{i=1}^{n}{{}}$bniεi=
(fn(t))-1$\sum\limits_{k=1}^{n}{{}}$$\frac{1}{n{{h}_{n}}}$$\sum\limits_{j=1}^{n}{{}}$K$\left( \frac{{{t}_{j}}-t}{{{h}_{n}}} \right)$ajk(n)εk$\triangleq $
(fn(t))-1$\bar{u}_{n}^{'}$(t).
(19)
Using the same method as in the proof of (1) of Theorem 3, it follows from Lemma 3 in reference [17] that
I2=Wn'(t)Pnεn→0, a.s.
(20)
By condition (ⅴ) we know that
$\begin{align}
& W_{n}^{'}\left( t \right){{P}_{n}}=\sum\limits_{j, k}^{{}}{{}}K\left( \frac{{{t}_{j}}-t}{{{h}_{n}}} \right)a_{jk}^{\left( n \right)}/\sum\limits_{i=1}^{n}{{}}K\left( \frac{{{t}_{i}}-t}{{{h}_{n}}} \right){{\delta }_{i}}\le \\
& c\sum\limits_{j, k}^{{}}{{}}\left| a_{jk}^{\left( n \right)} \right|/\sum\limits_{i=1}^{n}{{}}K\left( \frac{{{t}_{i}}-t}{{{h}_{n}}} \right){{\delta }_{i}}\le \\
& \frac{c}{n}\sum\limits_{j, k}^{{}}{{}}\left| a_{jk}^{\left( n \right)} \right|\le c\sum\limits_{k=1}^{n}{{}}\frac{\left| {{A}_{nk}} \right|}{n}, \\
\end{align}$
and
1n'Pn1n=$\sum\limits_{j, k}^{{}}{{}}$ajk(n)=$\sum\limits_{k=1}^{n}{{}}$Ank=$\sum\limits_{k=1}^{n}{{}}$Ank2.
According to condition (ⅲ), it holds that $\sum\limits_{k=1}^{n}{{}}$($\frac{{{A}_{nk}}}{n}$)2≤c. Thus, when n>p, $\sum\limits_{k=1}^{n}{{}}$|$\frac{{{A}_{nk}}}{n}$|≤c, namely Wn'(t)Pn≤c. From conclusion (1) of Theorem 3, we have gn*(T)-g(T)→0, a.s. Thus,
Wn'(t)Pn(gn*(T)-g(T))→0, a.s.
(21)
(18), (20) and (21) imply ${\hat{g}}$n(t)→g(t), a.s. This completes the proof of Theorem 3.