
Law of total expectation for a sum

The law of total expectation is $E(X) = E_Y(E_{X|Y}(X\mid Y))$, but how does one expand this for a sum $\mathrm{E}\left(\sum X_i\right)$? Is it

> \begin{align}A)&& \mathrm{E}\left(\sum_{i=1}^N X_i\right)=\mathrm{E}\left(\mathrm{E}\left(\left.\sum_{i=1}^N X_i\right|X_1,\ldots, X_N\right)\right) \end{align}

or

> \begin{align}B)&& \mathrm{E}\left(\sum_{i=1}^N X_i\right)=\mathrm{E}\left(\sum_{i=1}^N\mathrm{E}\left(\left. X_i\right|X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_N\right)\right) \end{align}

or simply

> \begin{align}C)&& \mathrm{E}\left(\sum_{i=1}^N X_i\right)=\mathrm{E}\left(\sum_{i=1}^N\mathrm{E}\left(\left. X_i\right|X_1,\ldots,X_N\right)\right) \end{align}

It should be relatively straightforward, but I just want to be sure. If it's relevant, the $X_i$ are all drawn independently from the same distribution.

Well, sure, they are all valid, if not particularly useful. Just consider the case $N=3$.

$$\begin{aligned}
\mathsf E(X_1+X_2+X_3) &= \mathsf E\bigl(\mathsf E(X_1+X_2+X_3\mid X_1,X_2,X_3)\bigr)\\
&=\mathsf E\bigl(\mathsf E(X_1\mid X_2,X_3)+\mathsf E(X_2\mid X_1,X_3)+\mathsf E(X_3\mid X_1,X_2)\bigr) \\
&= \mathsf E\bigl(\mathsf E(X_1\mid X_1,X_2,X_3)+\mathsf E(X_2\mid X_1,X_2,X_3)+\mathsf E(X_3\mid X_1,X_2,X_3)\bigr)
\end{aligned}$$
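If a concrete check helps, here is a minimal Python sketch that verifies the three forms by exact enumeration; the three fair dice and the helper `cond_exp` are hypothetical choices for illustration, and the conditional expectations are computed directly from the joint outcomes rather than by exploiting independence.

```python
from itertools import product
from fractions import Fraction

faces = range(1, 7)                        # each X_i is a fair six-sided die (hypothetical example)
outcomes = list(product(faces, repeat=3))  # all 6^3 equally likely triples (x1, x2, x3)
prob = Fraction(1, len(outcomes))

# Plain expectation E(X1 + X2 + X3)
plain = sum(prob * sum(x) for x in outcomes)

def cond_exp(i, given):
    """E(X_i | X_j = v for j, v in `given`), by exact enumeration over the joint distribution."""
    matching = [x for x in outcomes if all(x[j] == v for j, v in given.items())]
    return Fraction(sum(x[i] for x in matching), len(matching))

# Forms A and C: conditioning on all of X1, X2, X3 makes the inner expectation
# reproduce the sum (respectively each X_i), so the outer expectation is unchanged.
form_ac = sum(prob * sum(cond_exp(i, {j: x[j] for j in range(3)}) for i in range(3))
              for x in outcomes)

# Form B: condition each X_i on the other two coordinates only.
form_b = sum(prob * sum(cond_exp(i, {j: x[j] for j in range(3) if j != i}) for i in range(3))
             for x in outcomes)

print(plain, form_ac, form_b)  # all three print 21/2, i.e. 3 * 7/2
```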

* * *

When $N$ is itself a random variable, what might be more useful is:

$$\mathsf E(\sum_{k=1}^N X_k) = \mathsf E(\sum_{k=1}^N\mathsf E(X_k\mid N))$$

In particular, when the $X_k$ are independent and identically distributed (and independent of $N$ as well), this simplifies to:

$$\mathsf E(\sum_{k=1}^N X_k) = \mathsf E(N)\mathsf E(X_1)$$
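As a quick numerical check of that last identity, here is a minimal Monte Carlo sketch in Python, assuming the hypothetical choices $N\sim\text{Poisson}(4)$ and $X_k\sim\text{Exponential}$ with mean $2$, all independent:

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 100_000

# Hypothetical parameters: N ~ Poisson(4), X_k ~ Exponential(mean 2), all independent.
n = rng.poisson(lam=4.0, size=trials)                                      # random number of summands per trial
totals = np.array([rng.exponential(scale=2.0, size=k).sum() for k in n])   # sum_{k=1}^N X_k for each trial

print(totals.mean())  # Monte Carlo estimate of E(sum X_k); should be close to...
print(4.0 * 2.0)      # ...E(N) * E(X_1) = 8
```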
