
What are the differences and the relationship between Shannon entropy and Fisher information? When I first got into information theory, information was measured with Shannon entropy; in other words, most books I read talked about Shannon entropy. Today someone told me there is another kind of information called Fisher information, and I got quite confused. I tried to google both terms. Why do two kinds of information exist? Currently, my impression is that Fisher information takes a statistical view while Shannon entropy takes a probabilistic view. Any comments or answers are welcome. Thanks.

Fisher information measures the curvature of the log-likelihood in the parameter and governs the asymptotic variance of a maximum likelihood estimator: by the Cramér–Rao bound, higher Fisher information means lower achievable estimation error.
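A minimal sketch of that connection, assuming a Bernoulli(p) model (my choice for illustration), where the Fisher information is I(p) = 1/(p(1-p)) and the MLE is the sample mean. The simulated variance of the MLE should land close to the Cramér–Rao bound 1/(n·I(p)):

```python
import random

random.seed(0)

# Fisher information of a Bernoulli(p) model: I(p) = 1 / (p * (1 - p)).
# The Cramér–Rao bound says Var(MLE) >= 1 / (n * I(p)); for the sample
# mean (the MLE here) the bound is attained asymptotically.
p, n, trials = 0.3, 1000, 2000
fisher = 1.0 / (p * (1 - p))
bound = 1.0 / (n * fisher)  # equals p * (1 - p) / n

estimates = []
for _ in range(trials):
    sample = [1 if random.random() < p else 0 for _ in range(n)]
    estimates.append(sum(sample) / n)  # MLE of p for this sample

mean_est = sum(estimates) / trials
var_est = sum((e - mean_est) ** 2 for e in estimates) / trials

print(f"Cramér–Rao bound:       {bound:.6f}")
print(f"Empirical MLE variance: {var_est:.6f}")  # close to the bound
```

Doubling I(p) (e.g. by moving p toward 0 or 1) halves the bound, which is the sense in which more Fisher information means a more precisely estimable parameter.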

Shannon information is quite different: it refers to the content of a message or distribution, not to estimation error. Higher-entropy distributions are more uncertain, so they need more bits per symbol to transmit on average; equivalently, observing an outcome from a high-entropy distribution conveys more information.
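To make the bits interpretation concrete, here is a small sketch (distributions chosen for illustration) computing H = -Σ pᵢ log₂ pᵢ. A uniform distribution over four outcomes needs a full 2 bits per symbol, while a nearly deterministic one needs far less on average:

```python
from math import log2

def entropy(dist):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * log2(p) for p in dist if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]     # maximum uncertainty over 4 outcomes
skewed = [0.97, 0.01, 0.01, 0.01]      # nearly deterministic

print(entropy(uniform))  # 2.0 bits: a fair 4-sided die needs 2 bits/symbol
print(entropy(skewed))   # much less than 1 bit on average
```

An optimal code assigns short codewords to likely outcomes, which is why the skewed source compresses so well.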

However, there is a relationship between Fisher information and relative entropy/KL divergence, as discussed on Wikipedia: within a parametric family, the Fisher information is the curvature (second derivative in the parameter) of the KL divergence between nearby distributions, so D(p_θ ‖ p_{θ+δ}) ≈ (δ²/2)·I(θ) for small δ.
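That curvature relationship can be checked numerically. A sketch, again using a Bernoulli family as the illustrative example: the KL divergence between Bern(p) and Bern(p+δ) should match (δ²/2)·I(p) for small δ:

```python
from math import log

def kl_bernoulli(p, q):
    """KL divergence D(Bern(p) || Bern(q)) in nats."""
    return p * log(p / q) + (1 - p) * log((1 - p) / (1 - q))

p, delta = 0.3, 1e-4
fisher = 1.0 / (p * (1 - p))  # Fisher information I(p) for Bernoulli

# Second-order Taylor expansion: D(p || p + delta) ~ (delta**2 / 2) * I(p)
kl = kl_bernoulli(p, p + delta)
approx = 0.5 * delta ** 2 * fisher

print(kl, approx)  # nearly identical for small delta
```

This is why Fisher information acts as a local metric on the family of distributions (the idea behind information geometry), while entropy and KL divergence are global quantities.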
