
Trouble understanding the proof of linear independence of a basis for a linear transformation

I am reading *Matrix Analysis and Applied Linear Algebra* by Meyer, and the following statement and proof are given (image of the book's statement and proof omitted). What I am having trouble understanding is how the author shows that $\beta_{\mathcal L}$ is a linearly independent set. Essentially, he states that the set of transformations $\mathbf B_{ji}$, acting on an arbitrary but fixed element of $\beta$, is linearly independent. How does it follow from this premise that the entire set $\beta_{\mathcal L}$ is linearly independent?

The author is proving that for $\sum_{j,i}\eta_{j,i}\mathbf B_{j,i}=0$ to hold, all $nm$ coefficients $\eta_{j,i}$ have to be zero, since that is the definition of linear independence. This is shown in groups of $m$ coefficients at a time: for fixed $k\in\{1,\ldots,n\}$ it is shown that $\eta_{k,1},\ldots,\eta_{k,m}$ must all be zero. The latter is achieved by applying both sides of the given identity to $\mathbf u_k$: the right-hand side is of course zero, and the left-hand side gives a linear combination of $\mathbf v_1,\ldots,\mathbf v_m$ with $\eta_{k,1},\ldots,\eta_{k,m}$ as coefficients; by the linear independence of $\mathbf v_1,\ldots,\mathbf v_m$ this is only possible if all those $m$ coefficients are zero, as desired.
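To make the key step explicit, here is a sketch of the computation under the indexing convention used above (this assumes $\mathbf B_{j,i}$ is the basic transformation sending $\mathbf u_j$ to $\mathbf v_i$ and every other basis vector $\mathbf u_l$, $l\neq j$, to $\mathbf 0$; adjust the roles of the indices if the book orders them the other way):
$$\mathbf 0=\Bigl(\sum_{j,i}\eta_{j,i}\mathbf B_{j,i}\Bigr)(\mathbf u_k)=\sum_{j,i}\eta_{j,i}\,\mathbf B_{j,i}(\mathbf u_k)=\sum_{i=1}^{m}\eta_{k,i}\,\mathbf v_i,$$
since every term with $j\neq k$ vanishes. Linear independence of $\mathbf v_1,\ldots,\mathbf v_m$ then forces $\eta_{k,1}=\cdots=\eta_{k,m}=0$, and letting $k$ run through $1,\ldots,n$ covers all $nm$ coefficients, so $\beta_{\mathcal L}$ is linearly independent.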
