
Why is BDF the only $k$-step method of order $k$ with stiff decay? I'm studying for a test and I'd like to know how to justify why the only $k$-step method of order $k$ with stiff decay is BDF. By the definition of stiff decay (Ascher & Petzold), a method has stiff decay if \begin{equation}|y_n-g(t_n)|\rightarrow 0,\qquad \text{as }h_n\,\mathrm{Re}(\lambda)\rightarrow -\infty,\end{equation} where \begin{equation} y'=\lambda(y-g(t)),\end{equation} and $g(t)$ is an arbitrary bounded function. Assuming stiff decay and considering the definition of the general LMM, I don't see why this forces $\beta_j=0$ for $j>0$. Thanks for your time.

Applying a linear multistep method to the equation $y' = \lambda(y-g(t))$ yields $$ \sum_{j=0}^k \alpha_j y_{n-j} = h \lambda \sum_{j=0}^k \beta_j(y_{n-j}-g(t_{n-j})), $$ which, after dividing by $h\lambda$, we can rewrite as $$ \sum_{j=0}^k \beta_j(y_{n-j}-g(t_{n-j})) - \frac1{h\lambda} \sum_{j=0}^k \alpha_j y_{n-j} = 0. $$ In the limit $h\lambda \to -\infty$, the second term goes to zero (the $y_{n-j}$ remain bounded), so this becomes $$ \sum_{j=0}^k \beta_j(y_{n-j}-g(t_{n-j})) = 0. $$ Note that $\beta_0 \neq 0$: an explicit method has a bounded region of absolute stability and so cannot have stiff decay. We may therefore divide by $\beta_0$ and rearrange as $$ y_n - g(t_n) = -\frac{1}{\beta_0} \sum_{j=1}^k \beta_j(y_{n-j}-g(t_{n-j})). $$ Stiff decay requires the left-hand side to vanish for an *arbitrary* bounded $g$, hence for arbitrary values of the past errors $y_{n-j} - g(t_{n-j})$, and the only way the right-hand side can vanish identically is to have $\beta_j = 0$ for $j=1,\dots,k$. The method thus has the form $\sum_{j=0}^k \alpha_j y_{n-j} = h\beta_0 f_n$, and imposing order $k$ determines the remaining coefficients (up to normalization) uniquely: these are exactly the coefficients of the $k$-step BDF method.
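As a quick numerical sanity check (a sketch of my own, not part of the argument above: the choices of BDF2 versus the trapezoidal rule, $g(t)=\cos t$, $\lambda=-10^6$, and $h=0.1$ are all arbitrary), one can compare a method with $\beta_j=0$ for $j>0$ against an A-stable method with $\beta_1 \neq 0$ on the model problem $y'=\lambda(y-g(t))$:

```python
import math

lam = -1e6          # stiff eigenvalue: h*lam = -1e5 plays the role of "-> -infinity"
h = 0.1
g = math.cos        # an arbitrary bounded function g(t)
z = h * lam
y0 = 10.0           # start far from g(0) = 1

# BDF2: (3/2) y_{n} - 2 y_{n-1} + (1/2) y_{n-2} = h*lam*(y_n - g(t_n)),
# i.e. beta_0 != 0 and beta_1 = beta_2 = 0.  Bootstrap with backward Euler.
t = h
y_prev, y = y0, (y0 - z * g(t)) / (1.0 - z)
for _ in range(50):
    t += h
    y_prev, y = y, (2.0 * y - 0.5 * y_prev - z * g(t)) / (1.5 - z)
err_bdf2 = abs(y - g(t))

# Trapezoidal rule: A-stable and order 2, but beta_1 != 0, so no stiff decay.
t, y = 0.0, y0
for _ in range(51):
    y = ((1.0 + z / 2.0) * y - (z / 2.0) * (g(t) + g(t + h))) / (1.0 - z / 2.0)
    t += h
err_trap = abs(y - g(t))

print(f"BDF2 error        : {err_bdf2:.2e}")  # tiny: decays like O(1/|h*lam|)
print(f"trapezoidal error : {err_trap:.2e}")  # stays O(1): the error flips sign each step
```

For the trapezoidal rule the amplification factor of the error tends to $-1$ as $h\lambda \to -\infty$, so the initial error $|y_0 - g(0)|$ just oscillates instead of decaying, exactly the failure mode the derivation above rules out by forcing $\beta_j = 0$ for $j>0$.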
