# F70TS2 – Time Series

Solution to Exercise Sheet 1 – Stationarity and the autocorrelation function
Solution 1 To calculate the autocorrelation function we first calculate the autocovariance function:

γ(0) = Var(Xt) = E[Xt²] = E[(βεt−1 + εt)²]
= E[β²ε²t−1 + ε²t + 2βεt−1εt]
= β²E[ε²t−1] + E[ε²t]
= (1 + β²)σε²,

and

γ(±1) = E[(βεt−2 + εt−1)(βεt−1 + εt)] = βE[ε²t−1] = βσε²,

and

γ(±k) = E[(βεt−k−1 + εt−k)(βεt−1 + εt)] = 0 for k ≥ 2,

because {εt} is i.i.d. with mean zero. Hence the autocorrelation function is ρ(0) = 1, ρ(±1) = β/(1 + β²), and ρ(±k) = 0 for k ≥ 2. Since 1 + β² > 2β and 1 + β² > −2β for all β ≠ ±1, it holds that −1/2 < β/(1 + β²) < 1/2.
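As a quick numerical check, the sketch below (plain Python; the function name is ours) evaluates ρ(1) = β/(1 + β²) over a grid of β values and confirms it always lies strictly between −1/2 and 1/2, with the bound approached as β → ±1.

```python
# Lag-1 autocorrelation of the MA(1) process X_t = eps_t + beta * eps_{t-1}.
def ma1_rho1(beta: float) -> float:
    """rho(1) = gamma(1) / gamma(0) = beta / (1 + beta^2)."""
    return beta / (1.0 + beta**2)

# |rho(1)| < 1/2 for every beta != +-1 ...
betas = [i / 100.0 for i in range(-300, 301) if i not in (-100, 100)]
assert all(abs(ma1_rho1(b)) < 0.5 for b in betas)
# ... and the bound 1/2 is approached as beta -> 1.
assert abs(ma1_rho1(0.999) - 0.5) < 1e-6
```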
Solution 2 We present a detailed solution for (c), which is the most difficult case. The solutions to (a) and (b) follow from the same steps and are therefore omitted. Note that the εt are i.i.d. with mean zero.

γ(0) = Var(Xt) = Var(εt + 0.6εt−1 + 0.3εt−2 − 0.2εt−3)
= Var(εt) + Var(0.6εt−1) + Var(0.3εt−2) + Var(−0.2εt−3)
= (1 + 0.36 + 0.09 + 0.04)σε² = 1.49σε².

γ(±1) = Cov(Xt, Xt+1)
= Cov(εt + 0.6εt−1 + 0.3εt−2 − 0.2εt−3, εt+1 + 0.6εt + 0.3εt−1 − 0.2εt−2)
= 1 × 0.6 Var(εt) + 0.6 × 0.3 Var(εt−1) − 0.3 × 0.2 Var(εt−2)
= 0.72σε².

γ(±2) = Cov(Xt, Xt+2)
= Cov(εt + 0.6εt−1 + 0.3εt−2 − 0.2εt−3, εt+2 + 0.6εt+1 + 0.3εt − 0.2εt−1)
= 1 × 0.3 Var(εt) − 0.6 × 0.2 Var(εt−1)
= 0.18σε².

γ(±3) = Cov(Xt, Xt+3)
= Cov(εt + 0.6εt−1 + 0.3εt−2 − 0.2εt−3, εt+3 + 0.6εt+2 + 0.3εt+1 − 0.2εt)
= −1 × 0.2 Var(εt) = −0.2σε².

γ(±k) = 0 for k > 3.
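The lag-by-lag bookkeeping above generalises: for an MA(q) process Xt = εt + θ1εt−1 + ⋯ + θqεt−q, γ(k) = σε² Σj θjθj+k with θ0 = 1. A small sketch (plain Python; the function name is ours) reproduces the numbers for part (c):

```python
def ma_acvf(thetas, sigma2=1.0):
    """Autocovariances gamma(0), ..., gamma(q) of an MA(q) process.
    thetas: coefficient list [theta_0 = 1, theta_1, ..., theta_q]."""
    q = len(thetas) - 1
    return [sigma2 * sum(thetas[j] * thetas[j + k] for j in range(q - k + 1))
            for k in range(q + 1)]

# Part (c): X_t = eps_t + 0.6 eps_{t-1} + 0.3 eps_{t-2} - 0.2 eps_{t-3}
gamma = ma_acvf([1.0, 0.6, 0.3, -0.2])
assert [round(g, 10) for g in gamma] == [1.49, 0.72, 0.18, -0.2]
```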
And ρ(0) = 1, ρ(±1) = 0.72/1.49 ≈ 0.483, ρ(±2) = 0.18/1.49 ≈ 0.121, ρ(±3) = −0.2/1.49 ≈ −0.134, and ρ(±k) = 0 for k > 3.

Solution 3
(i) Since E[Zt] = 0, E[Yt] = E[Yt−1] for all t, and so the mean of the process is constant. But

Var(Yt) = Var(Yt−1) + Var(Zt) + 2Cov(Yt−1, Zt) = Var(Yt−1) + Var(Zt),

since Zt is independent of Yt−1. So, for stationarity we would need Var(Zt) = 0, which is not the case. Hence, the process is not stationary. It is, in fact, a random walk.
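Iterating the variance recursion gives Var(Yt) = Var(Y0) + t·Var(Zt), which grows without bound. A tiny sketch (assuming Y0 = 0 and unit noise variance, both our own choices for illustration) makes this concrete:

```python
# Variance recursion for the random walk Y_t = Y_{t-1} + Z_t:
# Var(Y_t) = Var(Y_{t-1}) + sigma_Z^2, starting from Var(Y_0) = 0 (assumed).
sigma_z2 = 1.0
var_y = 0.0
for t in range(1, 101):
    var_y = var_y + sigma_z2   # variance grows by sigma_Z^2 at every step
assert var_y == 100.0          # Var(Y_100) = 100 * sigma_Z^2: unbounded in t
```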

(ii) E[Yt] = E[Yt−1]+α so for the mean to be constant we would need α = 0, which is not the case. Hence, the process is not stationary. It is a random walk with deterministic drift.
(iii) E[Yt] = αE[Yt−1], so for stationarity in mean we have μ = αμ, so μ = 0.
Var(Yt) = α²Var(Yt−1) + Var(Zt) + 2αCov(Yt−1, Zt) = α²Var(Yt−1) + Var(Zt)
So for stationarity we have γ0 = α²γ0 + σZ², so γ0 = σZ²/(1 − α²), which requires |α| < 1. Note that the variance γ0 increases as |α| increases and γ0 → σZ² as |α| → 0. With α = 0, Yt is itself just a noise process.
Cov(Yt, Yt+k) = Cov(Yt, αYt+k−1 + Zt+k)
= αCov(Yt, Yt+k−1) + Cov(Yt, Zt+k) = αCov(Yt, Yt+k−1)
for k ≥ 1. So, for stationarity we have γk = αγk−1 for k ≥ 1. That is, γ1 = αγ0, γ2 = αγ1 = α²γ0, and generally γk = αᵏγ0. This gives a γk which depends only on k, not on t, so the process is stationary.
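Under the stationarity condition |α| < 1 these formulas give the entire autocovariance sequence. The sketch below (plain Python; function name is ours) checks both γ0 = σZ²/(1 − α²) and the recursion γk = αγk−1 for an illustrative choice of α:

```python
def ar1_acvf(alpha: float, sigma2: float, max_lag: int):
    """Autocovariances gamma_0, ..., gamma_K of the stationary AR(1)
    process Y_t = alpha * Y_{t-1} + Z_t, valid for |alpha| < 1."""
    gamma0 = sigma2 / (1.0 - alpha**2)
    return [alpha**k * gamma0 for k in range(max_lag + 1)]

g = ar1_acvf(alpha=0.5, sigma2=1.0, max_lag=5)
assert abs(g[0] - 1.0 / 0.75) < 1e-12            # gamma_0 = sigma^2 / (1 - alpha^2)
assert all(abs(g[k] - 0.5 * g[k - 1]) < 1e-12    # gamma_k = alpha * gamma_{k-1}
           for k in range(1, 6))
```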
In fact, this is a ‘Markov’ time series process, an autoregressive process of order 1.

(iv) E[Yt] = E[Zt−1]E[Yt−2] + E[Zt] = 0, so the process is stationary in mean.
We have Yt² = Z²t−1Y²t−2 + Zt² + 2Zt Zt−1Yt−2, and so

E[Yt²] = E[Z²t−1]E[Y²t−2] + E[Zt²] + 2E[Zt]E[Zt−1]E[Yt−2] = E[Y²t−2] + 1,

since the Zt are independent of the past, E[Zt] = 0, and E[Zt²] = 1 (as assumed in the question).
So for stationarity we would have γ0 = γ0 + 1, which is not possible for finite γ0. Hence, the process is not stationary.
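Iterating the recursion E[Yt²] = E[Y²t−2] + 1 makes the non-stationarity concrete: the second moment increases by one with every two time steps, so no constant γ0 can satisfy it. A tiny illustration (assuming the process starts with E[Y0²] = 0):

```python
# Second-moment recursion for Y_t = Z_{t-1} Y_{t-2} + Z_t with E[Z_t^2] = 1:
# E[Y_t^2] = E[Y_{t-2}^2] + 1, so the second moment grows without bound.
m = 0.0                      # E[Y_0^2], assumed zero for illustration
history = []
for step in range(50):       # each iteration advances t by two
    m = m + 1.0
    history.append(m)
assert history[-1] == 50.0   # after 100 time steps the second moment is 50
```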
Solution 4 E[Yt] = E[Ut] + E[Vt] = μU + μV. This does not depend on t, and so {Yt} is stationary in mean.

Cov(Yt, Yt+k) = Cov(Ut + Vt, Ut+k + Vt+k)
= Cov(Ut, Ut+k) + Cov(Ut, Vt+k) + Cov(Vt, Ut+k) + Cov(Vt, Vt+k)
= γU(k) + 0 + 0 + γV(k),

since the U and V processes are independent. So, {Yt} is stationary in second-order moments and hence (weakly) stationary.
Solution 5 From Question 4, {Yt} is stationary with γY(k) = γU(k) + γZ(k). Recall that γZ(k) = 0 for k ≥ 1, since {Zt} is white noise. Hence σY² = γY(0) = γU(0) + γZ(0) = σU² + σZ², and γY(k) = γU(k) for k ≥ 1. Therefore

ρY(k) = γY(k)/γY(0) = γU(k)/(σU² + σZ²) = ρU(k)/(1 + σZ²/σU²) = ρU(k)/(1 + 1/SNR)

for k ≥ 1, where SNR = σU²/σZ² is the signal-to-noise ratio. In practice we see that |ρY(k)| < |ρU(k)|, i.e. the autocorrelation of the observed signal is weaker than that of the pure signal. If the noise variance is small compared to that of the signal, the signal is not disturbed very much: the SNR is large and we have ρY(k) ≈ ρU(k) (the observed signal has much the same autocorrelation structure as the pure signal). If the noise variance is of the same order as that of the signal, then ρY(k) ≈ 0.5ρU(k). If the noise variance is large compared to that of the signal, the signal is seriously disturbed: the SNR is small and ρY(k) is much smaller in magnitude than ρU(k) (the observed signal has a much weaker autocorrelation structure than the pure signal). In general, increasing the variance of the noise imposed on the pure signal weakens the autocorrelation structure of the observed signal.
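The attenuation factor 1/(1 + 1/SNR) can be tabulated directly. The sketch below (plain Python; the function name and the example value ρU(k) = 0.8 are ours) exhibits the three regimes discussed above:

```python
def observed_rho(rho_u: float, snr: float) -> float:
    """Autocorrelation of signal-plus-noise: rho_Y = rho_U / (1 + 1/SNR)."""
    return rho_u / (1.0 + 1.0 / snr)

rho_u = 0.8
assert observed_rho(rho_u, snr=100.0) > 0.79        # large SNR: barely disturbed
assert observed_rho(rho_u, snr=1.0) == 0.5 * rho_u  # equal variances: halved
assert observed_rho(rho_u, snr=0.01) < 0.01         # small SNR: nearly destroyed
assert all(observed_rho(rho_u, s) < rho_u           # noise always weakens the ACF
           for s in (0.1, 1.0, 10.0))
```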