
Convergence in distribution and OLS in the regression model

Suppose a random variable $X_n$ converges in distribution to $X$ with $E|X| < \infty$. Is the following statement true, or is some additional condition needed?

> $\operatorname{plim}_{n\to\infty} \operatorname{Var}(X_n) = \operatorname{Var}(X)$ (or $\lim_{n\to\infty} \operatorname{Var}(X_n) = \operatorname{Var}(X)$)

Concretely, I would like to know whether the (unconditional) variance of $A = \sqrt{n}(\hat\beta - \beta)$ converges to the variance of the asymptotic distribution of $A$, where $\hat\beta$ is the OLS estimator in a linear regression model with stochastic regressors that are independent of the error term. For example:

$y_i = x_i\beta + e_i$, with $e_i \sim \text{i.i.d.}(0, \sigma^2)$, $\operatorname{plim}_{n\to\infty} \frac{\sum x_i^2}{n} = E[x_i^2] < \infty$, and $x$ independent of $e$.

My reasoning is that $\operatorname{Var}(A) = \sigma^2\, E\!\left[\left(\frac{\sum x_i^2}{n}\right)^{-1}\right]$ and $A \xrightarrow{d} N\!\left(0,\ \sigma^2 \left[E(x_i^2)\right]^{-1}\right)$, and I would like to know whether $\lim_{n\to\infty} \operatorname{Var}(A) = \sigma^2 \left[E(x_i^2)\right]^{-1}$.
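To make the setup concrete, here is a minimal Monte Carlo sketch of the model above. The choice $x_i \sim \text{Uniform}(1, 2)$ is an arbitrary assumption made so that $\left(\frac{\sum x_i^2}{n}\right)^{-1}$ is bounded and $\operatorname{Var}(A)$ exists; with it, $E[x_i^2] = 7/3$, so the asymptotic variance is $\sigma^2 / (7/3)$:

```python
import numpy as np

rng = np.random.default_rng(1)
beta, sigma, n, reps = 2.0, 1.0, 500, 20_000

# x_i ~ Uniform(1, 2) (assumed for illustration), so E[x^2] = 7/3 and
# (sum x_i^2 / n)^{-1} is bounded; e_i ~ N(0, sigma^2), independent of x.
A = np.empty(reps)
for r in range(reps):
    x = rng.uniform(1.0, 2.0, n)
    e = rng.normal(0.0, sigma, n)
    y = x * beta + e
    beta_hat = (x @ y) / (x @ x)          # OLS slope without intercept
    A[r] = np.sqrt(n) * (beta_hat - beta)  # A = sqrt(n)(beta_hat - beta)

print(A.var())             # Monte Carlo estimate of Var(A)
print(sigma**2 / (7 / 3))  # asymptotic variance sigma^2 / E[x_i^2]
```

With a well-behaved regressor like this, the Monte Carlo variance of $A$ sits close to the asymptotic variance, which is the behavior the question asks about.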

No: in general, $X_n$ converging in distribution to $X$ together with $E|X| < \infty$ does not imply that $\operatorname{Var}(X_n)$ converges to $\operatorname{Var}(X)$. You also need uniform integrability of $|X_n|^2$; combined with convergence in distribution, that gives $E[X_n^2] \to E[X^2]$ and $E[X_n] \to E[X]$, hence $\operatorname{Var}(X_n) \to \operatorname{Var}(X)$. Without it, probability mass escaping to infinity can inflate $\operatorname{Var}(X_n)$ even though the limiting distribution is perfectly tame.
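A minimal numerical sketch of the classic counterexample, where $X_n = n$ with probability $1/n$ and $0$ otherwise, so $X_n \to 0$ in distribution while $\operatorname{Var}(X_n) = n - 1 \to \infty$:

```python
import numpy as np

rng = np.random.default_rng(0)

# X_n = n with probability 1/n, else 0.
# P(X_n = 0) = 1 - 1/n -> 1, so X_n -> X = 0 in distribution and E|X| = 0.
# Yet E[X_n] = 1 and E[X_n^2] = n, so Var(X_n) = n - 1 diverges:
# {|X_n|^2} is not uniformly integrable.
variances = {}
for n in (10, 100, 1000):
    draws = rng.choice([0.0, float(n)], size=200_000, p=[1 - 1 / n, 1 / n])
    variances[n] = draws.var()
    print(n, variances[n])  # empirical variance tracks n - 1, not Var(X) = 0
```

The empirical variances grow roughly like $n - 1$ even though the distributions pile up at zero, illustrating why uniform integrability of $|X_n|^2$ is exactly the missing condition.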
