
Given a conditional probability P(B|A), is it reasonable to consider B and A independent? Here is a concrete example involving Bayes' theorem and Hidden Markov Models. The following is given: if a random day (day_0) is sunny, the next day (day_1) is sunny with probability 80% and rainy with probability 20%. When it is sunny, Bob is happy with probability 0.8 and grumpy with probability 0.2; when it is rainy, Bob is happy with probability 40% and grumpy with probability 60%. Assume Wednesday is sunny (0.67), Bob is happy (0.8), Thursday is rainy (0.2), and Bob is grumpy (0.6). The video claims that the probability of the whole thing (of the Hidden Markov Model) happening is the product of all of these: $0.8 \cdot 0.67 \cdot 0.2 \cdot 0.6 \approx 0.064$. The probability that Bob is happy is conditional on the event that Wednesday is sunny; given this, is it reasonable to consider all four of these independent?

> The probability that Bob is happy is conditional on the event that Wednesday is sunny; given this, is it reasonable to consider all four of these independent?

No; on the contrary, you have been given _conditional probabilities_, which means the events are explicitly dependent.
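You can verify the dependence with the numbers from the question alone (the only assumption here is that P(rainy) is the complement of the stated prior P(sunny) = 2/3):

```python
# Check that "Bob is happy" and "Wednesday is sunny" are NOT independent,
# using only the probabilities given in the question.
p_sunny = 2 / 3
p_rainy = 1 - p_sunny                # assumed: complement of the prior
p_happy_given_sunny = 0.8
p_happy_given_rainy = 0.4

# Marginal by total probability: P(happy) = sum_w P(w) * P(happy | w)
p_happy = p_sunny * p_happy_given_sunny + p_rainy * p_happy_given_rainy

# Independence would require P(happy | sunny) == P(happy),
# but 0.8 != 2/3, so happiness depends on the weather.
print(p_happy)
```

Since the marginal P(happy) = 2/3 differs from the conditional P(happy | sunny) = 0.8, the two events cannot be independent.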

$$\begin{align}
&\mathsf P(W{=}s,\, B_W{=}h,\, T{=}r,\, B_T{=}g)\\[1ex]
=~&\mathsf P(W{=}s)\,\mathsf P(B_W{=}h\mid W{=}s)\,\mathsf P(T{=}r\mid W{=}s)\,\mathsf P(B_T{=}g\mid T{=}r)\\[1ex]
=~&\tfrac 23\cdot 0.8\cdot 0.2\cdot 0.6\\[1ex]
=~&0.064
\end{align}$$
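The factorisation above is a straight chain-rule product; a minimal sketch (variable names are mine, and the prior 2/3 is taken from the answer rather than the rounded 0.67 in the question):

```python
# Joint probability of the two-day HMM path via the chain-rule factorisation:
# P(W=s) * P(B_W=h | W=s) * P(T=r | W=s) * P(B_T=g | T=r)
p_wed_sunny = 2 / 3          # prior: P(W = sunny)
p_happy_given_sunny = 0.8    # emission: P(B_W = happy | W = sunny)
p_rainy_given_sunny = 0.2    # transition: P(T = rainy | W = sunny)
p_grumpy_given_rainy = 0.6   # emission: P(B_T = grumpy | T = rainy)

joint = (p_wed_sunny
         * p_happy_given_sunny
         * p_rainy_given_sunny
         * p_grumpy_given_rainy)

print(round(joint, 4))  # 0.064
```

Note that each factor conditions only on the state's parent in the graph, which is exactly the conditional-independence structure of an HMM, not full independence of the four events.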

This factorisation can be described by the following Directed Acyclic Graph:
$$\require{enclose}
\begin{array}{ccc}
\enclose{circle}{~W~} & \to & \enclose{circle}{B_W}\\
\downarrow & &\\
\enclose{circle}{~T~} & \to & \enclose{circle}{B_T}
\end{array}$$
