
Two matrices having the same norm Let $V$ be a finite-dimensional real inner product space. A set of $k$ vectors $y_1,\dots,y_k\in V$ defines the linear operator $A: V\rightarrow V$ by $$Ax=\sum_{j=1}^k\langle y_j,x\rangle y_j.$$ Define the $k\times k$ matrix $M$ by $M_{ij}=\langle y_i,y_j\rangle$. The norm of $A$ is the operator norm induced by the vector norm $\|x\|^2=\langle x,x\rangle$; the norm of $M$ is the matrix norm induced by the Euclidean norm on $\mathbb{R}^k$, $\|z\|^2=\sum_{j=1}^k z_j^2$. Show that $\|A\|=\|M\|$. How should one start? A direct computation seems difficult and unwieldy.

For convenience define $Y:V\to\mathbb R^k$ by $$ Y(x) = \begin{bmatrix} \langle y_1, x \rangle \\ \vdots \\ \langle y_k, x \rangle \end{bmatrix}. $$ Then the adjoint ("transpose") $Y^*:\mathbb R^k\to V$ of $Y$ is given by $$ Y^*(u) = \sum_{i=1}^k u_i y_i. $$ Thus, we have $A = Y^*Y$ and $M=YY^*$.
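To make the factorization concrete, here is a small numerical sketch of my own (not part of the argument above), taking $V=\mathbb{R}^n$ with the standard inner product: $Y$ becomes the $k\times n$ matrix whose rows are the $y_j$, and $Y^*$ is its transpose.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3                      # dim V = 5, three vectors y_1, ..., y_k
Y = rng.standard_normal((k, n))  # row j of Y is y_j, so Y(x) = Y @ x

A = Y.T @ Y   # A = Y* Y : V -> V
M = Y @ Y.T   # M = Y Y* : R^k -> R^k, the Gram matrix M_ij = <y_i, y_j>

# Sanity check: A x agrees with the defining sum  Ax = sum_j <y_j, x> y_j.
x = rng.standard_normal(n)
Ax_sum = sum(np.dot(Y[j], x) * Y[j] for j in range(k))
print(np.allclose(A @ x, Ax_sum))  # True
```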

It is easy to check that $A$ is self-adjoint (that is, $\langle Ax, z \rangle = \langle x, Az \rangle$) and $M$ is symmetric. Thus, each operator norm is given by the largest eigenvalue in modulus. If $\lambda$ is an eigenvalue of $A$ with eigenvector $x$, then $$ M Yx = YY^* Y x = Y Ax = \lambda Yx. $$ Thus, if $\lambda\ne 0$, then $\lambda$ is also an eigenvalue of $M$ with eigenvector $Yx$. (Why is $Yx\ne 0$?) In the same way, a nonzero eigenvalue of $M$ is also an eigenvalue of $A$. Thus, the nonzero eigenvalues of $A$ and $M$ agree, and so do their operator norms.
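One can verify the conclusion numerically; this is a sketch of my own with $V=\mathbb{R}^n$, random $y_j$, and the spectral norm computed by NumPy. Since $A$ and $M$ are both positive semidefinite, their nonzero eigenvalues (and hence their norms) should coincide.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 4
Y = rng.standard_normal((k, n))  # rows are the vectors y_j in V = R^n
A = Y.T @ Y                      # A = Y* Y  (n x n)
M = Y @ Y.T                      # M = Y Y*  (k x k Gram matrix)

# Operator (spectral) norms agree.
print(np.linalg.norm(A, 2), np.linalg.norm(M, 2))

# A has n - k extra zero eigenvalues; its top k eigenvalues match those of M.
top_k_of_A = np.sort(np.linalg.eigvalsh(A))[-k:]
eigs_of_M = np.sort(np.linalg.eigvalsh(M))
print(np.allclose(top_k_of_A, eigs_of_M))  # True
```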
