Please forgive me if this is not the right Stack Exchange site (I have also posted this on Cross Validated and Theoretical Computer Science), and please forgive me for inventing terminology.
For discrete random variables $X$ and $Y$, the mutual information of $X$ and $Y$ can be defined as follows: $$I(X;Y) = \sum_{y \in Y} \sum_{x \in X} p(x,y) \log{ \left( \frac{p(x,y)}{p_1(x)\,p_2(y)} \right) }$$
I will define the mutual information of a "cell" $x_0$ to be: $$CI(x_0,Y) = \sum_{y \in Y} p(x_0,y) \log{ \left( \frac{p(x_0,y)}{p_1(x_0)\,p_2(y)} \right) }$$
I'm not sure whether this quantity goes by another name. Essentially I'm restricting attention to a single state $x_0$ of the variable $X$; the full mutual information is then recovered by summing the cell MIs over all states, $I(X;Y) = \sum_{x_0 \in X} CI(x_0,Y)$.
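To make the definitions concrete, here is a minimal Python sketch (the function name `cell_mi` and the example table are my own, purely for illustration) that computes each $CI(x_0,Y)$ from a joint probability table and checks that the cell values sum to $I(X;Y)$:

```python
import numpy as np

def cell_mi(pxy, x0):
    """CI(x0, Y): the contribution of row x0 to the mutual information.

    pxy is a 2-D array of joint probabilities p(x, y), rows indexed by x.
    Terms with p(x0, y) == 0 contribute 0 by the usual 0*log(0) convention.
    """
    px = pxy.sum(axis=1)   # marginal p1(x)
    py = pxy.sum(axis=0)   # marginal p2(y)
    row = pxy[x0]
    mask = row > 0
    return np.sum(row[mask] * np.log(row[mask] / (px[x0] * py[mask])))

# A small example joint distribution (rows: states of X, cols: states of Y).
pxy = np.array([[0.30, 0.10],
                [0.05, 0.55]])

ci = [cell_mi(pxy, x0) for x0 in range(pxy.shape[0])]
print(ci)        # the individual CI(x0, Y) values
print(sum(ci))   # equals I(X;Y), since the cell MIs sum to the full MI
```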
My question: is it guaranteed that $CI(x_0,Y) \ge 0$? We know that $I(X;Y) \ge 0$, and we know that the pointwise mutual information $\log\left(\frac{p(x,y)}{p_1(x)\,p_2(y)}\right)$ can be negative. I feel that $CI$ should be nonnegative and that I might be missing some obvious proof.
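For what it's worth, here is a quick numerical experiment one could run (building on the `cell_mi` sketch above) to probe the claim before hunting for a proof; it draws random joint distributions and reports the smallest cell MI encountered, so a negative printout would be a counterexample:

```python
# Empirical probe: draw random joint distributions and track the smallest
# CI(x0, Y) observed. Uses cell_mi and numpy from the sketch above.
rng = np.random.default_rng(0)
worst = np.inf
for _ in range(10_000):
    pxy = rng.random((3, 4))
    pxy /= pxy.sum()   # normalize to a valid joint distribution
    worst = min(worst, min(cell_mi(pxy, x0) for x0 in range(pxy.shape[0])))
print(worst)  # a negative value here would refute CI(x0, Y) >= 0
```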