
Autocorrelation problem, regression analysis

Bit stuck on my econometrics course (old exam question), and I'm not big on mathematical statistics. Anyway, this is the problem: Given some model $y_{it}=\beta_0+\beta_1x_{it}+u_{it}$, suppose that the idiosyncratic errors are serially uncorrelated with constant variance, i.e. $var(u)=\sigma^2$, $E(u)=0$ and $corr(u_{it},u_{is})=0\ \forall\ t\neq s$. Show that $corr(u_{it}-u_{it-1},u_{it-1}-u_{it-2})=-0.5$. I've been playing around with the expected-value formulas for variance/covariance, but I keep hitting a dead end. Also note that since this is taught by the econ department, my matrix algebra in this area is poor, so it would be more helpful if it were laid out in simple terms. Any help appreciated! Thanks /I

Expand the covariance term by term, using the fact that covariance is linear in each argument. Then you get:

$$ Cov(u_{it}-u_{it-1}, u_{it-1}-u_{it-2})=Cov(u_{it}, u_{it-1})-Cov(u_{it}, u_{it-2})-Cov(u_{it-1}, u_{it-1})+Cov(u_{it-1}, u_{it-2}) $$ The first term is 0 by assumption, and so are the second and fourth. The third one is $$ -Cov(u_{it-1}, u_{it-1})=-V(u_{it-1})=-\sigma^2. $$
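For the denominator of the correlation you also need the variance of each differenced error. Here is the worked step, using only the stated assumptions (errors uncorrelated across periods with common variance $\sigma^2$):

$$ V(u_{it}-u_{it-1})=V(u_{it})+V(u_{it-1})-2Cov(u_{it},u_{it-1})=\sigma^2+\sigma^2-0=2\sigma^2, $$ and by the same argument $V(u_{it-1}-u_{it-2})=2\sigma^2$.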

So then $$ Corr(u_{it}-u_{it-1}, u_{it-1}-u_{it-2})=\frac{Cov(u_{it}-u_{it-1}, u_{it-1}-u_{it-2})}{\sqrt{V(u_{it}-u_{it-1})\,V(u_{it-1}-u_{it-2})}}=\frac{-\sigma^2}{\sqrt{2\sigma^2\cdot 2\sigma^2}}=-\frac{1}{2}. $$
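If you want to sanity-check the algebra numerically, here is a minimal simulation sketch using NumPy (the variance value and sample size are arbitrary choices): it draws serially uncorrelated errors, forms the first differences, and estimates the correlation between consecutive differences, which should come out close to $-0.5$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Serially uncorrelated errors with constant variance sigma^2
sigma = 1.5        # arbitrary choice for illustration
n_periods = 100_000
u = rng.normal(loc=0.0, scale=sigma, size=n_periods)

# First differences: du_t = u_t - u_{t-1}
du = np.diff(u)

# Sample correlation between consecutive differences (du_t, du_{t-1})
corr = np.corrcoef(du[1:], du[:-1])[0, 1]
print(f"Sample correlation: {corr:.3f}  (theory: -0.5)")
```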
