
Amount of information a hidden state can convey (HMM) In the paper *Products of Hidden Markov Models*, the authors say that the hidden state of a single HMM can only convey log K bits of information about the recent history, where K is the number of states. I was trying to understand what they mean and came up with the following: imagine I have 4 states: <0-0>, <0-1>, <1-0>, <1-1>. These 4 states convey log(4) = 2 bits of information, A and B, each of which can take the value 0 or 1 (e.g. the state with A=1 and B=0 is <1-0>). However, I'm not sure whether this reasoning is correct.

With $n$ bits you can distinguish up to $2^n$ states (that is the total amount of information available in $n$ bits).
Conversely, to encode $K$ states you need at least $\log_2(K)$ bits. Your example follows exactly this pattern: $K = 4$ states correspond to $\log_2(4) = 2$ bits, which you can label A and B.
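The correspondence can be checked with a short sketch (the `<A-B>` labels are taken from the question's example):

```python
import math
from itertools import product

# To encode K states you need at least log2(K) bits; here K = 4 gives 2 bits.
K = 4
bits = math.log2(K)
print(bits)  # → 2.0

# Enumerate all assignments of the 2 bits (A, B); each pair labels
# one of the 4 states <A-B> from the question.
states = [f"<{a}-{b}>" for a, b in product([0, 1], repeat=2)]
print(states)  # → ['<0-0>', '<0-1>', '<1-0>', '<1-1>']

# Conversely, n bits can distinguish 2**n states.
assert 2 ** int(bits) == K
```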
