mersenneforum.org  

mersenneforum.org > Great Internet Mersenne Prime Search > Math

Old 2007-02-06, 02:35   #1
Damian
 
May 2005
Argentina
Linear algebra proof

Can someone point me to a proof that if the geometric multiplicity of each eigenvalue is equal to its algebraic multiplicity, then the matrix is diagonalizable?
Thanks in advance.
Old 2007-02-06, 17:03   #2
ewmayer
2ω=0
 
Sep 2002
República de California

Could you please define "geometric multiplicity?" (It sounds like something relating to the eigenspace, but I want to be sure.)
Old 2007-02-06, 22:04   #3
Damian
 
May 2005
Argentina

Quote:
Originally Posted by ewmayer View Post
Could you please define "geometric multiplicity?" (It sounds like something relating to the eigenspace, but I want to be sure.)
The geometric multiplicity is the dimension of the eigenspace associated with the eigenvalue.
Old 2007-02-06, 22:18   #4
ewmayer
2ω=0
 
Sep 2002
República de California

Then it's quite simple: for a repeated eigenvalue (the only case one need worry about with respect to possible nondiagonalizability), if the geometric multiplicity of the corresponding eigenspace equals the algebraic multiplicity of the eigenvalue (call it K), that means one can find K linearly independent eigenvectors for it. Doing this for every eigenvalue yields a full set of n linearly independent eigenvectors, hence the matrix is diagonalizable.

Put another way, one only winds up with a Jordan form (nondiagonalizability) if the eigenspace is rank-deficient. In that case the best one can do is to find a set of generalized eigenvectors (true eigenvectors plus some non-eigenvectors to "fill in" the rank-deficient part of the eigenspace corresponding to the problematic repeated eigenvalue), which "nearly" diagonalize the matrix, i.e. bring it to Jordan normal form.
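As a quick numerical illustration (my own toy matrices, not from the original post): the geometric multiplicity of an eigenvalue λ is n minus the rank of A − λI, and comparing it to the algebraic multiplicity distinguishes the diagonalizable case from the Jordan-block case.

```python
import numpy as np

def geometric_multiplicity(M, lam, tol=1e-9):
    """dim ker(M - lam*I) = n - rank(M - lam*I)."""
    n = M.shape[0]
    return n - np.linalg.matrix_rank(M - lam * np.eye(n), tol=tol)

# Both matrices have the repeated eigenvalue 2 (algebraic multiplicity 2):
A = np.array([[2.0, 0.0],
              [0.0, 2.0]])   # geometric multiplicity 2 -> diagonalizable
B = np.array([[2.0, 1.0],
              [0.0, 2.0]])   # geometric multiplicity 1 -> defective (Jordan block)

print(geometric_multiplicity(A, 2.0))  # 2
print(geometric_multiplicity(B, 2.0))  # 1
```

For B there is only one independent eigenvector for λ = 2, so no basis of eigenvectors exists and B cannot be diagonalized.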
Old 2007-02-08, 16:28   #5
Damian
 
May 2005
Argentina
Thanks, and another question

Thank you very much. I've got a new question: is there a proof that eigenvectors corresponding to distinct eigenvalues are linearly independent which does not use mathematical induction?
Thanks in advance,
Damian.
Old 2007-02-08, 17:13   #6
ewmayer
2ω=0
 
Sep 2002
República de California

Quote:
Originally Posted by Damian View Post
Thank you very much. I've got a new question: is there a proof that eigenvectors corresponding to distinct eigenvalues are linearly independent which does not use mathematical induction?
This would appear to follow directly from the definition of an eigenvector. Try this: suppose that for 3 distinct eigenvalues λ1, λ2, λ3 with corresponding eigenvectors x, y, z, one of the eigenvectors is a linear combination of the other two, e.g. z = a*x + b*y. Multiply by the matrix, and you should pretty quickly get a contradiction.
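Spelling out the contradiction (a sketch, assuming x and y are already known to be linearly independent):

```latex
Az = \lambda_3 z = a\lambda_3 x + b\lambda_3 y,
\qquad
Az = aAx + bAy = a\lambda_1 x + b\lambda_2 y.
```

Subtracting gives a(λ1 − λ3)x + b(λ2 − λ3)y = 0. Since the eigenvalues are distinct, λ1 − λ3 ≠ 0 and λ2 − λ3 ≠ 0, so independence of x and y forces a = b = 0, making z = 0, which contradicts z being an eigenvector.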
Old 2007-02-12, 19:31   #7
ewmayer
2ω=0
 
Sep 2002
República de California

Quote:
Originally Posted by ewmayer View Post
This would appear to follow directly from the definition of an eigenvector. Try this: suppose that for 3 distinct eigenvalues λ1, λ2, λ3 with corresponding eigenvectors x, y, z, one of the eigenvectors is a linear combination of the other two, e.g. z = a*x + b*y. Multiply by the matrix, and you should pretty quickly get a contradiction.
OK, I verified that this does lead to a simple proof, but one still needs to show that, starting from one {eigenvalue, eigenvector} pair, the eigenvector for the *second* distinct eigenvalue must be linearly independent of the first, which in this case reduces to "not a scalar multiple of" the first. Again easy to show, but in the end it does amount to a proof by induction.
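That two-eigenvector base case is the same one-line argument: suppose y = c·x with c ≠ 0; then

```latex
\lambda_2 y = Ay = cAx = c\lambda_1 x = \lambda_1 y
\;\Longrightarrow\; (\lambda_1 - \lambda_2)\, y = 0,
```

and since λ1 ≠ λ2 this forces y = 0, contradicting y being an eigenvector.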
Old 2007-02-12, 20:27   #8
xilman
Bamboozled!
 
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across

Quote:
Way Out in Hilbert Space
An infinite-dimensional exit? If so, where does it lead to?


Paul

Last fiddled with by xilman on 2007-02-12 at 20:28 Reason: Fix tag
Old 2007-02-12, 22:25   #9
ewmayer
2ω=0
 
Sep 2002
República de California

Quote:
Originally Posted by xilman View Post
An infinite-dimensional exit? If so, where does it lead to?
Like the sign (you know, the one that pops up almost everywhere) says: In Hilbert Space, all roads converge (if they converge) to a li'l place called Norm's Functional Rest Stop. I'm hoping to get there at some point so I can start to unload some of my collection of old vinyl LPs, but failing that, simply to achieve closure.

p.s.: Norm's is best-known for its "eat a burger, drink a beer and smoke adjoint" special, but interestingly, they also offer a nice lineup of Cauchy foods.