
Finding approximate eigenvalues of a perturbed matrix

Assume I have some constant matrix $A$ to which I add a perturbation, giving the perturbed matrix $M(\epsilon) = A + \epsilon B$ (with $B$ constant as well), and that I can easily find the eigenvalues and corresponding eigenvectors of $A = M(0)$. I'm looking for a way to approximate the smallest eigenvalue and corresponding eigenvector of $M(\epsilon)$, accurate up to $O(\epsilon^2)$.
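
For reference, this is the kind of estimate being asked for: assuming $A$ is real symmetric with orthonormal eigenpairs $(\lambda_k, v_k)$ and that the smallest eigenvalue $\lambda_0 = \lambda_{\min}(A)$ is simple (assumptions not stated above), standard first-order perturbation theory gives

$$\lambda_{\min}(\epsilon) = \lambda_0 + \epsilon\, v_0^{\top} B\, v_0 + O(\epsilon^2),
\qquad
v_{\min}(\epsilon) = v_0 + \epsilon \sum_{k \neq 0} \frac{v_k^{\top} B\, v_0}{\lambda_0 - \lambda_k}\, v_k + O(\epsilon^2),$$

where the sum runs over the remaining eigenpairs of $A$.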

Have you heard of Gershgorin’s theorem for estimating eigenvalues?

* If you have "Matrix Analysis" by Horn and Johnson, it has an excellent chapter on the matter (Ch. 6, "Location and Perturbation of Eigenvalues").

* If you have "Matrix Computations" by Golub and Van Loan, it has a nice summary of the matter (in the section called "Eigenvalue Sensitivity").

* You might also find numerical examples in books on numerical analysis; "Numerical Analysis" by Burden and Faires, for example, has some basic results.

What the Gershgorin discs (which reduce to intervals on the real line for symmetric matrices) allow you to do is locate regions that are guaranteed to contain all of the eigenvalues.
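
As a rough numerical illustration, here is a short sketch (the matrices below are made up for the example, not taken from the question) of computing the Gershgorin discs of a perturbed matrix with NumPy:

```python
import numpy as np

# Made-up example matrices (not taken from the question).
A = np.diag([1.0, 4.0, 9.0])           # eigenvalues of A are just its diagonal
B = np.array([[0.0, 1.0, 0.5],
              [1.0, 0.0, 2.0],
              [0.5, 2.0, 0.0]])
eps = 0.05
M = A + eps * B

# Gershgorin: every eigenvalue of M lies in at least one disc centered at
# M[i, i] with radius equal to the sum of the off-diagonal magnitudes in row i.
centers = np.diag(M)
radii = np.abs(M).sum(axis=1) - np.abs(centers)
for c, r in zip(centers, radii):
    print(f"disc: center {c:.3f}, radius {r:.3f}")

print("eigenvalues of M:", np.sort(np.linalg.eigvals(M)))
```

Since $A$ is diagonal in this sketch, the radii are just $\epsilon$ times the row sums of $|B|$, so the discs already localize the spectrum of $M(\epsilon)$ tightly for small $\epsilon$.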

There is a classical result (Weyl's inequality) showing that if a symmetric matrix $A$ is perturbed by a symmetric matrix $E$, then none of its eigenvalues moves by more than $\|E\|_2$.
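
A minimal numerical check of that bound, using randomly generated symmetric matrices (illustrative values only):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random symmetric A and a small symmetric perturbation E (illustrative only).
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
E = 0.1 * rng.standard_normal((n, n))
E = (E + E.T) / 2

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order,
# so matching them index by index pairs each eigenvalue with its perturbed one.
shift = np.abs(np.linalg.eigvalsh(A + E) - np.linalg.eigvalsh(A))
print("largest eigenvalue shift:", shift.max())
print("spectral norm ||E||_2:   ", np.linalg.norm(E, 2))  # Weyl: shift <= ||E||_2
```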

There are other useful results, such as the Bauer-Fike theorem, that give bounds on how far the eigenvalues of the perturbed matrix can lie from those of the original.
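
For instance, Bauer-Fike says that if $A = V \Lambda V^{-1}$ is diagonalizable, then every eigenvalue of $A + E$ lies within $\kappa(V)\,\|E\|$ of some eigenvalue of $A$. A small sketch of that bound, again with made-up matrices:

```python
import numpy as np

# Made-up non-symmetric example (diagonalizable, distinct eigenvalues 2, 3, 5).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])
E = 0.01 * np.ones((3, 3))

lam, V = np.linalg.eig(A)                              # A = V diag(lam) V^{-1}
bound = np.linalg.cond(V, 2) * np.linalg.norm(E, 2)    # Bauer-Fike bound

mu = np.linalg.eigvals(A + E)
# Distance from each perturbed eigenvalue to the nearest eigenvalue of A.
dist = np.min(np.abs(mu[:, None] - lam[None, :]), axis=1)
print("max distance to nearest eigenvalue of A:", dist.max())
print("Bauer-Fike bound kappa_2(V) * ||E||_2:  ", bound)
```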

Regards
