
How do you derive the equations shown at 3:19-3:30 in the MIT OpenCourseWare lecture on the least squares method? Link: < Thanks in advance!

Suppose that you have a collection of data points $(x_i,y_i)$ and you are trying to optimize the parameters of a line $y=ax+b$ to get the best fit, where we're defining best fit to mean least squares. Then we want to minimize

$$\sum_i (ax_i+b-y_i)^2 $$

This will happen when the partial derivatives with respect to both $a$ and $b$ are zero. If we take the partial derivative with respect to $a$, applying the chain rule to each squared term, we obtain

$$2\sum_i (ax_i+b-y_i)x_i=0$$

and if we take the partial derivative with respect to $b$, we obtain

$$2\sum_i (ax_i+b-y_i)=0.$$

The equations on the board are just these rewritten.
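
For reference, dividing each equation by 2 and splitting the sums term by term yields the so-called normal equations, which is presumably the form written on the board:

$$a\sum_i x_i^2 + b\sum_i x_i = \sum_i x_i y_i$$

$$a\sum_i x_i + bn = \sum_i y_i$$

where $n$ is the number of data points. These are two linear equations in the two unknowns $a$ and $b$.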
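
If you want to check the algebra numerically, here is a minimal Python sketch (the data points are made up for illustration) that solves the two normal equations directly and compares the result against NumPy's built-in least-squares fit:

```python
import numpy as np

# Hypothetical sample data; any collection of (x_i, y_i) pairs would do.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
n = len(x)

# Normal equations obtained by setting both partial derivatives to zero:
#   a*sum(x_i^2) + b*sum(x_i) = sum(x_i * y_i)
#   a*sum(x_i)   + b*n        = sum(y_i)
A = np.array([[np.sum(x**2), np.sum(x)],
              [np.sum(x),    n]])
rhs = np.array([np.sum(x * y), np.sum(y)])

a, b = np.linalg.solve(A, rhs)
print(f"a = {a:.4f}, b = {b:.4f}")

# Cross-check against NumPy's polynomial least-squares fit (degree 1).
a_ref, b_ref = np.polyfit(x, y, deg=1)
assert np.allclose([a, b], [a_ref, b_ref])
```

Both routes give the same slope and intercept, since `np.polyfit` solves the same least-squares problem internally.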
