
Partial derivative involving ln()

I'm looking at a paper involving a constrained optimisation problem using Lagrange multipliers, in which the following Lagrangian function appears: $$\gamma = \max \sum_{i=1}^n r_i\left(1-e^{-a_ix_i}\right)+\lambda\left(1-\sum_{i=1}^{n}x_i\right)$$ For the partial derivative of $\gamma$ with respect to $x_1$, the authors write: $$\frac{\partial\gamma}{\partial x_1} = a_1x_1+ \ln(\lambda)-\ln(r_1a_1)$$ As a thoroughgoing newbie to all this, I thought that the partial derivative with respect to $x_1$ would be: $$\frac{\partial\gamma}{\partial x_1} = -a_1r_1e^{-a_1x_1} +\lambda$$ ...because I thought the partial derivative with respect to $x$ of $e^{kx}$ is $k\cdot e^{kx}$. Clearly, I'm missing something pretty fundamental (!), so I'd be really grateful if anyone could explain how the partial derivative above appears! Thanks in advance!

My first reaction was: "_one more typo_". And I was wrong!

But let $$F=\sum _{i=1}^n r_i \left(1-e^{-a_i x_i}\right)+\lambda \left(1-\sum _{i=1}^n x_i\right)$$ which gives $$\frac{\partial F}{\partial x_i}=a_i r_i e^{-a_i x_i}-\lambda,$$ which, up to an overall sign, is what you found. But they now set this derivative equal to $0$. Then, and only then, $$a_i r_i e^{-a_i x_i}=\lambda.$$ Now take logarithms of both sides: $$\ln(a_i r_i)-a_i x_i=\ln(\lambda)\qquad\Longleftrightarrow\qquad a_i x_i+\ln(\lambda)-\ln(a_i r_i)=0,$$ which is exactly the authors' expression. But it does not represent the derivative of anything; it is the first-order condition set to zero.
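To see the condition in action, here is a minimal numerical sketch (the data $r_i$, $a_i$ are made up, not from the paper): invert the stationarity condition $a_i r_i e^{-a_i x_i}=\lambda$ for each $x_i$, bisect on $\lambda$ so that the constraint $\sum_i x_i=1$ holds, and check that the true partial derivatives of $F$ vanish at the result.

```python
import math

# Made-up example data (NOT from the paper): two terms in the sum
r = [2.0, 3.0]   # r_i coefficients
a = [1.0, 1.5]   # a_i exponents

def x_of_lam(lam):
    """Invert the stationarity condition a_i r_i e^{-a_i x_i} = lambda for x_i."""
    return [(math.log(a[i] * r[i]) - math.log(lam)) / a[i] for i in range(len(r))]

# Bisect on lambda so that the constraint sum(x_i) = 1 is satisfied.
# sum(x_i) decreases as lambda grows, and for this data the sum is already
# below 1 at lambda = min(a_i r_i), so the root is bracketed.
lo, hi = 1e-9, min(a[i] * r[i] for i in range(len(r)))
for _ in range(200):
    mid = (lo + hi) / 2
    if sum(x_of_lam(mid)) > 1:
        lo = mid      # sum too large -> lambda must grow
    else:
        hi = mid
lam = (lo + hi) / 2
x = x_of_lam(lam)

# The true partials dF/dx_i = a_i r_i e^{-a_i x_i} - lambda vanish here, and
# the constraint holds: the paper's "derivative" is this condition after
# taking logs, not the derivative of F itself.
grads = [a[i] * r[i] * math.exp(-a[i] * x[i]) - lam for i in range(len(r))]
print("lambda =", round(lam, 4), " x =", [round(v, 4) for v in x])
```

Note that `x_of_lam` only exists because the stationarity equation can be solved for $x_i$ in closed form; the outer bisection is just one simple way to pin down the multiplier $\lambda$ that makes the constraint hold.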
