
Algebraic Proof of the Double-Dabble Method/Algorithm: Binary Numbers to Decimal Numbers

I have found a great book: Binary Arithmetics and Boolean Algebra by Angelo G. Gille. He uses a double-dabble method to convert binary numbers to decimal numbers. Here is how it is done (see the images below). I have tried to understand/prove why it works. Maybe someone sees it?

[images of the worked example]

This is the same as Horner's method for evaluating polynomials. It works because

$$a_n2^n+a_{n-1}2^{n-1}+\dots+a_12+a_0=2(\cdots(2(2(2a_n+a_{n-1})+a_{n-2})+a_{n-3})\cdots+a_1)+a_0$$

If the formula in general looks obscure, you can try it for small values of $n$. For example, $n=3$: $$2^3a_3+2^2a_2+2a_1+a_0=2(2(2a_3+a_2)+a_1)+a_0$$
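To see the same recursion in code, here is a minimal sketch in Python (the function name `binary_to_decimal` and the digit-list representation are my own illustrative choices, not taken from Gille's book): start with the leading bit, then repeatedly double the running total and add the next bit.

```python
def binary_to_decimal(bits):
    """Evaluate a binary number given as a list of 0/1 digits,
    most significant bit first, using Horner's scheme:
    repeatedly double the running total and add the next digit."""
    value = 0
    for b in bits:
        value = 2 * value + b  # one "double, then add the next digit" step
    return value

# Example: 1011 in binary -> 2*(2*(2*1 + 0) + 1) + 1 = 11
print(binary_to_decimal([1, 0, 1, 1]))  # prints 11
```

Each loop iteration performs exactly one layer of the nested expression above, which is why the loop and the formula produce the same value.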
