
Which mean is used for the standard error? When calculating the standard error, we take the standard deviation of the sampling distribution. But to calculate a standard deviation we must choose a 'mean' to take differences about. Wikipedia states that this 'mean' is the population mean, while a lecture I came across said it is the mean of the sampling distribution. So, for the standard error, do we take the population mean or the sampling-distribution mean as the 'mean'?

In general, you are given a dataset with sample size $n$. The population standard deviation is $\sqrt{E(X-\mu)^2}$, but since $\mu$ is unknown, we compute the sample standard deviation $$s=\sqrt{\frac{1}{n-1}\sum_{j=1}^{n}(x_j -\bar{x})^2},$$ which is a good estimator of the population standard deviation.
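
As a concrete illustration, here is a minimal sketch (using a small made-up sample) of computing $s$ with the $n-1$ divisor and the sample mean $\bar{x}$, together with the resulting estimated standard error of the mean, $s/\sqrt{n}$:

```python
import numpy as np

# Hypothetical small sample (made-up values, for illustration only)
x = np.array([4.2, 5.1, 3.8, 4.9, 5.5])
n = len(x)

xbar = x.mean()  # sample mean, used in place of the unknown population mean mu
s = np.sqrt(np.sum((x - xbar) ** 2) / (n - 1))  # sample standard deviation (n-1 divisor)
# Equivalently: s = x.std(ddof=1)

se = s / np.sqrt(n)  # estimated standard error of the sample mean
print(s, se)
```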

You cannot calculate $\sqrt{E(X-\mu)^2}$ directly, since everything in that expression is unknown. But if you do know the population mean $\mu$, then $s$ changes to $$s=\sqrt{\frac{1}{n}\sum_{j=1}^{n}(x_j -\mu)^2}.$$
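
If the population mean really were known (say $\mu = 4.5$, a made-up value), the divisor becomes $n$ because no degree of freedom is spent estimating the mean. A sketch under that assumption:

```python
import numpy as np

x = np.array([4.2, 5.1, 3.8, 4.9, 5.5])  # same made-up sample as above
mu = 4.5                                  # hypothetical known population mean
n = len(x)

s_known_mu = np.sqrt(np.sum((x - mu) ** 2) / n)  # n divisor, since mu is not estimated
print(s_known_mu)
```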

(But the second case rarely arises in practice.)
