To compute the mutual information (or joint entropy, or any similar quantity) conditioned on a particular value of $Z$, you simply replace all the probability functions by the corresponding conditional probabilities.
Say you know (among other equivalent formulas) that
$$I(X;Y)=H(X) - H(X\mid Y)= E_{X,Y}\left[\log\frac{p(X,Y)}{p(X)\,p(Y)}\right]$$
Then
$$I(X;Y\mid Z=z)=H(X\mid Z=z) - H(X\mid Y, Z=z)= E_{X,Y\mid Z=z}\left[\log\frac{p(X,Y \mid Z=z)}{p(X\mid Z=z)\,p(Y\mid Z=z)}\right]$$
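To make the substitution concrete, here is a minimal numpy sketch for discrete variables, assuming the joint distribution is given as a table `pxyz[x, y, z]` of probabilities $p(x,y,z)$ (the function name and table layout are my own, not from any particular library):

```python
import numpy as np

def conditional_mutual_information(pxyz, z):
    """I(X; Y | Z = z) in bits, from a joint probability table.

    pxyz: array of shape (|X|, |Y|, |Z|) holding p(x, y, z).
    z: index of the conditioning value of Z.
    """
    # Condition on Z = z: p(x, y | Z = z) = p(x, y, z) / p(z)
    pz = pxyz[:, :, z].sum()
    pxy_z = pxyz[:, :, z] / pz
    # Conditional marginals p(x | Z = z) and p(y | Z = z)
    px_z = pxy_z.sum(axis=1, keepdims=True)
    py_z = pxy_z.sum(axis=0, keepdims=True)
    # E_{X,Y|Z=z}[ log p(x,y|z) / (p(x|z) p(y|z)) ], skipping zero cells
    mask = pxy_z > 0
    return float(np.sum(pxy_z[mask] * np.log2(pxy_z[mask] / (px_z * py_z)[mask])))

# Toy example: given Z = 0, X = Y (so X carries one full bit about Y);
# given Z = 1, X and Y are independent.
pxyz = np.zeros((2, 2, 2))
pxyz[:, :, 0] = [[0.25, 0.0], [0.0, 0.25]]       # X = Y when Z = 0
pxyz[:, :, 1] = [[0.125, 0.125], [0.125, 0.125]] # independent when Z = 1

print(conditional_mutual_information(pxyz, 0))  # → 1.0
print(conditional_mutual_information(pxyz, 1))  # → 0.0
```

Note that the same pattern works for $H(X\mid Z=z)$ or any other entropy-like quantity: condition the table first, then apply the unconditional formula.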