
Simple linear regression of the true values onto the fitted values?

I'm trying to show that if I fit a simple regression, say $\hat{y}_i=\hat{\beta}_0+\hat{\beta}_1 x_i$, and then run another regression of $y_i$ (the true values) onto the fitted values $\hat{y}_i$, the coefficients come out as zero for the constant and one for the slope. I know it's easy enough to calculate the slope coefficient in this one-variable case with
$$\frac{\sum(y_i-\bar{y})(x_i-\bar{x})}{\sum(x_i-\bar{x})^2},$$
and from there the constant is easy to calculate. According to Stata, no matter which data set I use, the slope is always equal to one and the constant is therefore zero. I just can't seem to prove why from the equation (I must be missing something). Example in Stata:

```
webuse auto
regress mpg weight
gen mpghat = _b[_cons] + _b[weight]*weight
regress mpg mpghat
```

This gives me `_b[mpghat] == 1` in the second regression.

When you regress the observed $y$ values onto the fitted $\hat y$ values, you're replacing the original independent variable $x$ with a new independent variable, namely $\hat y$. So if we label these new independent variables $z_i$:
$$ z_i:=\hat y_i := \hat\beta_0 +\hat\beta_1 x_i, $$
then the mean $z$ value is $\bar z=\hat\beta_0 +\hat\beta_1\bar x$ and the deviation of $z_i$ from its mean is $z_i-\bar z=\hat\beta_1(x_i-\bar x)$. Therefore your formula for the slope of the new regression line gives
$$ \frac{\sum(y_i-\bar y)(z_i-\bar z)}{\sum(z_i-\bar z)^2}= \frac{\sum(y_i-\bar y)\,\hat\beta_1(x_i-\bar x)}{\sum\hat\beta_1^2(x_i-\bar x)^2}=\frac1{\hat\beta_1}\frac{\sum(y_i-\bar y)(x_i-\bar x)}{\sum(x_i-\bar x)^2}=\frac1{\hat\beta_1}\hat\beta_1 =1. $$
As for the constant: because OLS residuals sum to zero when an intercept is included, the fitted values have the same mean as the observed values, so $\bar z = \bar{\hat y} = \bar y$. The new intercept is then $\bar y - 1\cdot\bar z = 0$.
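As a quick numerical check, here is a minimal sketch along the lines of the Stata example in the question. It assumes Stata's shipped `auto` dataset (loaded with `sysuse`; the question's `webuse auto` works equally well) and uses the standard `predict` postestimation command in place of the manual `gen` line to build the fitted values. The second regression should report a slope of 1 and a constant of 0 up to rounding:

```
* Fit the original regression, then regress mpg on its own fitted values.
sysuse auto, clear
regress mpg weight
predict mpghat, xb        // fitted values from the first regression
regress mpg mpghat        // _b[mpghat] should be 1, _b[_cons] should be 0
```

The choice of `predict ..., xb` versus `gen mpghat = _b[_cons] + _b[weight]*weight` is immaterial here; both produce the same fitted values, and the derivation above is what guarantees the slope of 1.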
