
Polynomial fitting on intervals, how does one do it? Recently I investigated a function whose Taylor expansions turned out to be quite poor, in the sense that they fit well only in a very narrow neighbourhood of the expansion point, even at higher orders. That got me thinking about how one might build integral-based fitters instead. One possibility that came to mind: $$p_o = \arg\min_{p\in \mathbb P^k}\int_{x_0}^{x_1} (p(x)-f(x))^2 \,dx$$ But I am unsure how to attack this analytically / algebraically. For example, $p(x)^2$ appears, which contains a self-convolution of the coefficients and is therefore non-linear in them. If one allows oneself to sample $f$ and treat it as a vector of constants on the interval, the problem reduces to the famous least-squares problem $${\bf c_o} = \arg\min_{\bf c}\|{\bf \Phi c - f}\|_2$$ But assuming we can't do that, what other options do we have?

You can try gradient descent on the coefficients of $p$. If, say, $p(x) = \sum_i a_i x^i$, then since $\partial p(x)/\partial a_k = x^k$, $$\frac{\partial}{\partial a_k}\int\left(p(x) - f(x)\right)^2\,dx = \int 2 \left(p(x) - f(x)\right)x^k\,dx$$

If you can calculate the last integral, you can directly apply gradient descent.
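For concreteness, here is a minimal sketch of that descent in Python, assuming $f$ can at least be evaluated pointwise so the gradient integrals can be done with numerical quadrature (`scipy.integrate.quad`); if the moments $\int f(x)\,x^k\,dx$ are available in closed form, substitute those instead. The target function, interval, degree, and step size below are all illustrative choices, not part of the original problem:

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical target function; any f that can be evaluated pointwise works.
def f(x):
    return np.exp(-x) * np.sin(5 * x)

x0, x1 = 0.0, 1.0   # fitting interval
deg = 3             # p(x) = a[0] + a[1] x + ... + a[deg] x^deg

def p(x, a):
    return sum(ai * x**i for i, ai in enumerate(a))

def grad(a):
    # Component k of the gradient: int_{x0}^{x1} 2 (p(x) - f(x)) x^k dx
    return np.array([
        quad(lambda x: 2.0 * (p(x, a) - f(x)) * x**k, x0, x1)[0]
        for k in range(len(a))
    ])

a = np.zeros(deg + 1)
lr = 0.3            # step size; must be below 2 / (largest Gram eigenvalue)
for _ in range(5000):
    a -= lr * grad(a)

sq_err = quad(lambda x: (p(x, a) - f(x))**2, x0, x1)[0]
print("coefficients:", a, " integrated squared error:", sq_err)
```

Note that the objective is quadratic in the coefficients, so the gradient is affine in $\bf a$ and setting it to zero gives a linear system; gradient descent is just one iterative way to solve it. The monomial Gram matrix $\left(\int_{x_0}^{x_1} x^{j+k}\,dx\right)_{jk}$ is Hilbert-like and badly conditioned, so plain descent converges slowly as the degree grows.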
