
Mutual information conditioned on Z = z

I understand that the mutual information $I(X;Y)$ can be calculated as

$$I(X;Y)=\sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)},$$

while the conditional mutual information $I(X;Y\mid Z)$ can be calculated as

$$I(X;Y\mid Z)=\sum_{z} p(z)\sum_{x,y} p(x,y\mid z)\log\frac{p(x,y\mid z)}{p(x\mid z)\,p(y\mid z)}.$$

If I want to calculate the mutual information between $X$ and $Y$ conditioned on a particular value $Z=z$, which can be denoted $I(X;Y\mid Z=z)$, how can I do that? How can it be derived from the equations above? Is there any reference for that?

To compute the mutual information (or any other information-theoretic quantity, such as an entropy) conditioned on a particular value of $Z$, you simply replace every probability function by the corresponding probability conditioned on $Z=z$. The conditional mutual information is then the average of these pointwise quantities over $Z$, i.e. $I(X;Y\mid Z)=E_Z\big[I(X;Y\mid Z=z)\big]$; this is how conditional mutual information is defined in standard texts, e.g. Cover & Thomas, *Elements of Information Theory*.

Say, you know (among other equivalent formulas) that

$$I(X;Y)=H(X) - H(X\mid Y)=E_{X,Y}\left[\log\frac{p(X,Y)}{p(X)\,p(Y)}\right]$$

Then

$$I(X;Y\mid Z=z)=H(X\mid Z=z) - H(X\mid Y, Z=z)=E_{X,Y\mid Z=z}\left[\log\frac{p(X,Y \mid Z=z)}{p(X\mid Z=z)\,p(Y\mid Z=z)}\right]$$
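As a quick sketch of this recipe for discrete variables (the distribution below is a made-up illustration, not from the question): condition the joint table on $Z=z$, renormalize, and apply the ordinary mutual-information formula to the conditional probabilities.

```python
import numpy as np

def mi_given_z(p_xyz, z, base=2.0):
    """I(X; Y | Z = z) for a discrete joint distribution.

    p_xyz: array of shape (|X|, |Y|, |Z|) holding the joint p(x, y, z).
    Every probability in the MI formula is replaced by its version
    conditioned on Z = z, as in the equation above.
    """
    p_xy = p_xyz[:, :, z].astype(float)
    p_xy = p_xy / p_xy.sum()                  # p(x, y | Z = z)
    p_x = p_xy.sum(axis=1, keepdims=True)     # p(x | Z = z)
    p_y = p_xy.sum(axis=0, keepdims=True)     # p(y | Z = z)
    mask = p_xy > 0                           # convention: 0 log 0 = 0
    ratio = p_xy[mask] / (p_x * p_y)[mask]
    return float(np.sum(p_xy[mask] * np.log(ratio)) / np.log(base))

# Hypothetical example: Z is a fair coin; given Z = 0, X and Y are
# perfectly correlated uniform bits; given Z = 1 they are independent.
p_xyz = np.array([[[0.25, 0.125], [0.0, 0.125]],
                  [[0.0, 0.125], [0.25, 0.125]]])

i_z0 = mi_given_z(p_xyz, 0)   # 1 bit: X determines Y when Z = 0
i_z1 = mi_given_z(p_xyz, 1)   # 0 bits: independent when Z = 1

# Averaging over z recovers the usual conditional MI I(X; Y | Z):
p_z = p_xyz.sum(axis=(0, 1))
i_cond = sum(p_z[z] * mi_given_z(p_xyz, z) for z in range(p_xyz.shape[2]))
```

Note that `i_cond` here equals $\sum_z p(z)\,I(X;Y\mid Z=z)$, which matches the expectation-over-$Z$ definition of conditional mutual information.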
