Linear algebra proof
Can someone point me to a proof that if the geometric multiplicity of each eigenvalue equals the corresponding algebraic multiplicity, then the matrix is diagonalizable?
Thanks in advance.
Could you please define "geometric multiplicity?" (It sounds like something relating to the eigenspace, but I want to be sure.)
[QUOTE=ewmayer;97844]Could you please define "geometric multiplicity?" (It sounds like something relating to the eigenspace, but I want to be sure.)[/QUOTE]
The geometric multiplicity is the dimension of the eigenspace.
Then it's quite simple: for a repeated eigenvalue (the only case one need worry about with respect to possible nondiagonalizability), if the geometric multiplicity of the corresponding eigenspace equals the algebraic multiplicity of the eigenvalue (call that K), that means one can find K linearly independent eigenvectors for that eigenvalue. Since eigenvectors belonging to distinct eigenvalues are also linearly independent, collecting these across all eigenvalues gives a full basis of eigenvectors, hence the matrix is diagonalizable.
Put another way, one only winds up with a Jordan form (nondiagonalizability) if the eigenspace is rank-deficient. In that case the best one can do is find a set of pseudo-eigenvectors (usually called generalized eigenvectors: real eigenvectors plus some non-eigenvectors to "fill in" the rank-deficient part of the eigenspace for the problematic repeated eigenvalues) which "nearly" diagonalize the matrix.
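As a concrete numerical check of the multiplicity criterion (a sketch using numpy; the two matrices below are my own illustrative examples, not anything from the thread):

```python
import numpy as np

def geometric_multiplicity(A, lam, tol=1e-9):
    """Dimension of the eigenspace of A for eigenvalue lam:
    n minus the rank of (A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

# Diagonalizable case: eigenvalue 2 has algebraic multiplicity 2
# and geometric multiplicity 2 (two independent eigenvectors).
A = np.array([[2.0, 0.0], [0.0, 2.0]])
print(geometric_multiplicity(A, 2.0))  # 2

# Jordan block: eigenvalue 2 has algebraic multiplicity 2
# but geometric multiplicity only 1 -> not diagonalizable.
J = np.array([[2.0, 1.0], [0.0, 2.0]])
print(geometric_multiplicity(J, 2.0))  # 1
```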
Thanks, and other question
Thank you very much. I've got a new question: is there a proof that the eigenvectors corresponding to distinct eigenvalues are linearly independent which does not use mathematical induction?
Thanks in advance, Damian.
[QUOTE=Damian;97996]Thank you very much. I've got a new question: is there a proof that the eigenvectors corresponding to distinct eigenvalues are linearly independent which does not use mathematical induction?[/QUOTE]
This would appear to follow directly from the definition of an eigenvector. Try this: assume that for 3 distinct eigenvalues l1, l2, l3 with corresponding eigenvectors x, y, z, one of the eigenvectors is a linear combination of the other two, e.g. z = a*x + b*y. Multiply by the matrix, and you should pretty quickly get a contradiction.
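Written out, that contradiction argument runs as follows (taking x and y already known to be independent, which is the two-eigenvector base case):

```latex
Suppose $z = a x + b y$ with $z \neq 0$. Applying $A$ two ways:
\[
  A z = \lambda_3 z = a \lambda_3 x + b \lambda_3 y,
  \qquad
  A z = a A x + b A y = a \lambda_1 x + b \lambda_2 y.
\]
Subtracting the two expressions,
\[
  a(\lambda_1 - \lambda_3)\,x + b(\lambda_2 - \lambda_3)\,y = 0.
\]
Since $x, y$ are independent and $\lambda_1 \neq \lambda_3$, $\lambda_2 \neq \lambda_3$,
this forces $a = b = 0$, hence $z = 0$: a contradiction.
```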
[QUOTE=ewmayer;98001]This would appear to follow directly from the definition of an eigenvector. Try this: assume that for 3 distinct eigenvalues l1, l2, l3 with corresponding eigenvectors x, y, z, one of the eigenvectors is a linear combination of the other two, e.g. z = a*x + b*y. Multiply by the matrix, and you should pretty quickly get a contradiction.[/QUOTE]
OK, I verified that this does lead to a simple proof, but one still needs to show that, given a starting point of one {eigenvalue, eigenvector} pair, the eigenvector for the *second* distinct eigenvalue must be linearly independent of the first, which in this case reduces to "not a multiple of" the first. Again easy to show, but in the end it does amount to proof by induction.
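As a quick numerical sanity check of the distinct-eigenvalues claim (my own example matrix, using numpy; its eigenvalues happen to be 3 - sqrt(3), 3, and 3 + sqrt(3)):

```python
import numpy as np

# An arbitrary symmetric example matrix with three distinct eigenvalues.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
vals, vecs = np.linalg.eigh(A)

# Distinct eigenvalues imply the eigenvectors are linearly independent,
# i.e. the matrix whose columns are the eigenvectors has full rank.
print(len(set(np.round(vals, 6))))  # 3 distinct eigenvalues
print(np.linalg.matrix_rank(vecs))  # rank 3: columns independent
```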
[quote]Way Out in Hilbert Space[/quote]
An infinite-dimensional exit? If so, where does it lead to?

Paul
[QUOTE=xilman;98306]An infinite-dimensional exit? If so, where does it lead to?[/QUOTE]
Like the sign (you know, the one that pops up almost everywhere) says: In Hilbert Space, all roads converge (if they converge) to a li'l place called Norm's Functional Rest Stop. I'm hoping to get there at some point so I can start to unload some of my collection of old vinyl L[sup]p[/sup]'s, but failing that, simply to achieve closure.

p.s.: Norm's is best-known for its "eat a burger, drink a beer and smoke adjoint" special, but interestingly, they also offer a nice lineup of Cauchy foods.