
Transition using integral of partial derivatives unclear

I hope I am not overreaching my boundaries on this one, but I have been trying to wrap my head around a claim I read in a paper and would greatly appreciate any help figuring it out. In the paper Axiomatic Attribution for Deep Networks, the authors state the following: $ IG_i := (x_i-x'_i)\int^1_0\frac{\partial F(x'+\alpha(x-x'))}{\partial x_i}d\alpha $ And then: $\sum^n_{i=1}IG_i(x)=F(x)-F(x')$ I can't seem to prove this transition (from the first equation to the second), and no proof is supplied in the paper (which makes me think there's something trivial I am missing...). I would be grateful for any help with this. Thanks!

Just for the sake of completion, here are the full steps (thanks to David Ullrich):

Define:

$g(\alpha) = F(x'+\alpha(x-x'))$

$y=x'+\alpha(x-x')$

where $F(\cdot)$ is scalar-valued and $x$ and $x'$ are vectors (and therefore $y$ is too).

Then, by the multivariate chain rule: $g'(\alpha) = \sum_i \frac{\partial F}{\partial y_i}\frac{\partial y_i}{\partial \alpha} = \sum_i \frac{\partial F}{\partial y_i}(x_i - x'_{i})$, since $\frac{\partial y_i}{\partial \alpha} = x_i - x'_i$.
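A quick numerical sanity check of this chain-rule identity: compare a finite-difference estimate of $g'(\alpha)$ against $\sum_i \frac{\partial F}{\partial y_i}(x_i - x'_i)$. The function $F$ and the points $x$, $x'$ below are toy choices of mine, not from the paper.

```python
import numpy as np

# Toy scalar-valued function (my choice, for illustration only)
def F(y):
    return np.sin(y[0]) + y[0] * y[1] ** 2

def grad_F(y):
    # Analytic gradient of the toy F above
    return np.array([np.cos(y[0]) + y[1] ** 2, 2.0 * y[0] * y[1]])

x_base = np.array([0.0, 0.0])  # baseline x'
x = np.array([1.0, 2.0])       # input x

def g(alpha):
    # g(alpha) = F(x' + alpha (x - x'))
    return F(x_base + alpha * (x - x_base))

alpha = 0.3
eps = 1e-6

# Central finite difference for g'(alpha)
g_prime_fd = (g(alpha + eps) - g(alpha - eps)) / (2.0 * eps)

# Chain rule: gradient at y(alpha), dotted with (x - x')
y = x_base + alpha * (x - x_base)
g_prime_chain = grad_F(y) @ (x - x_base)

assert abs(g_prime_fd - g_prime_chain) < 1e-6
```

The two numbers agree to within finite-difference error, confirming that differentiating along the straight-line path is the same as summing the per-coordinate partial derivatives weighted by $(x_i - x'_i)$.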

Notice that $g$ is the composition of the scalar-valued $F$ with the vector-valued path $y(\alpha)$, so its derivative is the dot product of the gradient $\nabla F$ with $\frac{\partial y}{\partial \alpha} = x - x'$, which is exactly the sum above.

Now integrate over $\alpha$:

$\int_{\alpha =0}^{1} g'(\alpha)\,d\alpha = \int_{\alpha =0}^{1} \sum_i \frac{\partial F}{\partial y_i}(x_i - x'_{i})\,d\alpha = \sum_i (x_i - x'_{i})\int_{\alpha =0}^{1} \frac{\partial F}{\partial y_i}\,d\alpha = \sum_i IG_i$

Notice that, by the fundamental theorem of calculus (with $g(1) = F(x)$ and $g(0) = F(x')$):

$\int_{\alpha =0}^{1} g'(\alpha)\,d\alpha = g(1) - g(0) = F(x)-F(x')$

Q.E.D :)
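The whole derivation can also be verified numerically: approximate each $IG_i$ with a Riemann sum over $\alpha$ and check that the attributions sum to $F(x)-F(x')$ (the paper's completeness axiom). The function $F$ and the points below are toy choices of mine, not from the paper.

```python
import numpy as np

# Toy scalar-valued function (my choice, for illustration only)
def F(y):
    return np.sin(y[0]) + y[0] * y[1] ** 2

def grad_F(y):
    # Analytic gradient of the toy F above
    return np.array([np.cos(y[0]) + y[1] ** 2, 2.0 * y[0] * y[1]])

x_base = np.array([0.0, 0.0])  # baseline x'
x = np.array([1.0, 2.0])       # input x

# Midpoint-rule Riemann sum over alpha in [0, 1]
m = 10000
alphas = (np.arange(m) + 0.5) / m
grads = np.array([grad_F(x_base + a * (x - x_base)) for a in alphas])

# IG_i = (x_i - x'_i) * mean of the i-th partial derivative along the path
ig = (x - x_base) * grads.mean(axis=0)

# Completeness: the attributions sum to F(x) - F(x')
assert abs(ig.sum() - (F(x) - F(x_base))) < 1e-6
```

With enough steps the gap shrinks to numerical-integration error, which is exactly what the derivation above predicts.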
