#1
Sep 2004
13·41 Posts
I'm taking linear algebra and I'm not really understanding how putting a matrix into a function works. We learned that for f(x) = x^2 + 2x + 1, f(A) means (square matrix A)^2 + 2*(matrix A) + (the identity matrix). Our teacher said you can basically plug anything into a function. But does the function still obey the laws of algebra?
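For concreteness, here is a minimal NumPy sketch of what "plugging a matrix into a function" does (the 2x2 matrix A below is just an arbitrary example of mine, not one from the course):

```python
import numpy as np

# Evaluate f(A) = A^2 + 2A + I for an arbitrary square matrix A.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
I = np.eye(2)

f_of_A = A @ A + 2 * A + I      # "plug the matrix into" f(x) = x^2 + 2x + 1
factored = (A + I) @ (A + I)    # the factored form (x + 1)^2, evaluated at A

print(f_of_A)
print(np.allclose(f_of_A, factored))  # True: A commutes with I, so the factorization survives
```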
For example, can you factor a function? I have a question that says show p1(A) = p2(A)p3(A) for any square matrix A, where p1(x) = x^2 + 9, p2(x) = x + 3, and p3(x) = x - 3. I verified it for a specific 2x2 matrix A, but I am not sure how to generalize. I would like to generalize it even further than they are asking, so that I can factor anything like I can with regular algebra.

I have a similar question later: show that if a square matrix A satisfies A^2 - 3A + I = 0, then A^-1 = 3I - A. It is trivial to show it works for a specific matrix A, but I would like to understand how they came up with the 2nd equation, and not merely that it is true, so I can understand what operations are valid.

I don't really know the name of what I'm having trouble with, so I wasn't really able to find anything useful on the internet. I'm taking it at a community college, so the people in the math lab can't really help me; in fact I work in our math lab myself. Thanks so much for your help!

Last fiddled with by Joshua2 on 2009-01-19 at 21:23
#2
"William"
May 2003
New Haven
23·5·59 Posts
Do you remember back when mathematics was still called arithmetic, and you talked about associative laws, distributive laws, and commutative laws? The associative and distributive laws still work for square matrices. The commutative law doesn't work except in some special cases.
If you find that confusing, remember that the Associative law says (a + b) * c = a*c + b*c, but it doesn't tell you what multiplication and addition are. It just tells you that whatever "+" and "*" mean, this rearrangement is valid. When switching to matrices, you have new definitions of "+" and "*".

To show the inverse is 3I - A, just multiply by 3I - A and apply the reorganization rules. Don't forget to show it for both right-multiplication and left-multiplication - you don't have the Commutative law, so you can't be sure they give the same result.

Most things about factoring polynomials of a single variable depend only on the associative and distributive laws, so most of those things are still valid. Be careful with two variables, though. The associative and distributive laws are enough to show

(X + Y) * (X + Y) = X^2 + XY + YX + Y^2

but you need a commutative law to join the middle terms, so this is not X^2 + 2XY + Y^2 unless you have special knowledge that in this particular case XY = YX.

William
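A quick numerical illustration of that last point (a minimal sketch; X and Y below are arbitrary matrices chosen because they do not commute):

```python
import numpy as np

X = np.array([[0.0, 1.0],
              [0.0, 0.0]])
Y = np.array([[0.0, 0.0],
              [1.0, 0.0]])

lhs = (X + Y) @ (X + Y)
no_commute = X @ X + X @ Y + Y @ X + Y @ Y   # valid for any square X, Y
with_commute = X @ X + 2 * (X @ Y) + Y @ Y   # only valid when XY == YX

print(np.allclose(lhs, no_commute))    # True
print(np.allclose(lhs, with_commute))  # False, because XY != YX here
```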
#3
Sep 2004
13×41 Posts
(a + b) * c = a*c + b*c is the distributive law, I believe.
So I have x^2 + 9, which I guess is (x + 3) * (x - 3). To check it: it equals x(x - 3) + 3(x - 3) = x^2 - 3x + 3x - 9 = x^2 + 9. Since any matrix follows the distributive law, we can use any matrix A for x. Does that prove it? I suppose I could do the same logic with (x - 3) * (x + 3) = x(x + 3) - 3(x + 3) = x^2 + 9. By generalizing this logic we could show that we can factor functions when we use matrices, just like we would in arithmetic.

For the 2nd question I was thinking of using the fact that A(3I - A) = (3I - A)A = I if A inverse = 3I - A. Both products come out to 3A - A^2, so both must equal I. So if I can show that if A^2 - 3A + I = 0 then 3A - A^2 = I, I have succeeded. So I subtract I from both sides of the original A^2 - 3A + I = 0 and multiply through by -1 (or -I), and that gives me 3A - A^2 = I. Does that work?
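For a concrete check of that inverse argument, a small sketch (the matrix below is one picked because it happens to satisfy A^2 - 3A + I = 0; it is not the one from the textbook):

```python
import numpy as np

# Companion matrix of x^2 - 3x + 1, so A^2 - 3A + I = 0 by construction.
A = np.array([[0.0, -1.0],
              [1.0,  3.0]])
I = np.eye(2)

print(np.allclose(A @ A - 3 * A + I, 0))   # True: A satisfies the polynomial
print(np.allclose((3 * I - A) @ A, I))     # True: 3I - A is a left inverse
print(np.allclose(A @ (3 * I - A), I))     # True: and a right inverse
```

Any square matrix annihilated by x^2 - 3x + 1 would behave the same way.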
#4
∂2ω=0
Sep 2002
República de California
2D4216 Posts

Quote:
When you get to things like matrix exponentials, that's when the fun really begins. :) [Seriously - it is fun, if you like that sort of thing.]
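For the curious, a tiny sketch of a matrix exponential (this assumes SciPy's expm is available; the 2x2 matrix is just the standard rotation generator, picked for illustration):

```python
import numpy as np
from scipy.linalg import expm

# Exponentiating theta * J turns the "rotation generator" J into an actual rotation.
theta = np.pi / 2
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

R = expm(theta * J)     # equals [[cos(theta), -sin(theta)], [sin(theta), cos(theta)]]
print(np.round(R, 6))   # approximately a 90-degree rotation matrix
```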
#5
Sep 2006
Brussels, Belgium
33·61 Posts

Quote:
Quote:
#6
Cranksta Rap Ayatollah
Jul 2003
64110 Posts

Quote:
I just wanted to point this out, because at community colleges, it's usually taught as "Manipulating Matrices For No Apparent Reason". Also, if you plan to go on to higher math, this will probably be one of the most important classes you ever take, but no one will tell you this until you've forgotten almost everything from it. |
#7
Jun 2003
7×167 Posts
That's correct. The associative law for multiplication states that (a * b) * c = a * (b * c). An associative law also applies to addition.
Last fiddled with by ewmayer on 2009-01-20 at 19:08 Reason: Your honor, the user of these weapons of math instruction pleads "guilt by association". |
#8
Sep 2004
13×41 Posts
Yes, I did get my negative and positive mixed up there :)
I didn't realize that the class was so important, or that it was just regular algebra using functions. Thanks guys!

I had a question in class today that my teacher didn't know the answer to. I have 3 equations and 3 unknowns, so I have the matrix equation Ax = B. So I multiply both sides by A^-1 to get the x matrix values. He said I have to do A^-1*B with the A inverse on the left, since if I do A*x*A^-1 it won't equal x on the left. I said, couldn't you do A*A^-1*x = x, and he said there is no middle to plug in. If we let CD = B, then could we do A*A^-1*x = C*A^-1*D, which is x = something we can compute?

Last fiddled with by Joshua2 on 2009-01-21 at 01:36
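A small numerical sketch of the left-multiplication step (the 3x3 system below is an arbitrary example, not the one from class):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([[1.0],
              [2.0],
              [3.0]])                 # column vector, shape (3, 1)

x = np.linalg.inv(A) @ b              # left-multiply both sides of Ax = b by A^-1
print(np.allclose(A @ x, b))          # True: x really solves the system

# Putting the inverse on the right doesn't even make sense dimensionally here:
# b @ np.linalg.inv(A) raises an error, since (3,1) times (3,3) is undefined.
```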
#9
"Richard B. Woods"
Aug 2002
Wisconsin USA
769210 Posts
Do you think that factoring B will be easier than simply putting the inverse on the left, instead of on the right, in the multiplication?
Quote:
Last fiddled with by cheesehead on 2009-01-21 at 02:27 |
#10
Sep 2004
13·41 Posts
Right. It is more of an intellectual exercise than a way to make life easier. It can be done the simple way, but I don't really understand the left vs. right vs. middle. At least I kind of do, but I want to know if what I said is possible, so I know I am understanding the concepts.

In Gaussian elimination, can you add row 1 + row 2 + row 3 and put that somewhere like row 3 if it makes a convenient zero? I know that isn't a row operation, so it might cause trouble, but it should be the same as something that is legal. Like row1+2 store in row2; row2+3 store in row2; row2-1 store in row2.
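A minimal sketch of one legal decomposition (not the exact sequence above): doing row3 <- row1 + row2 + row3 in one shot gives the same matrix as chaining two ordinary "add one row to another" operations. The 3x3 matrix is an arbitrary example.

```python
import numpy as np

M = np.array([[1.0,  2.0,  3.0],
              [4.0,  5.0,  6.0],
              [7.0,  8.0, 10.0]])

# Composite operation, done in one step: row3 <- row1 + row2 + row3.
direct = M.copy()
direct[2] = M[0] + M[1] + M[2]

# The same result via two legal elementary row operations.
steps = M.copy()
steps[2] += steps[0]   # row3 <- row3 + row1
steps[2] += steps[1]   # row3 <- row3 + row2

print(np.allclose(direct, steps))   # True
```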
#11
"William"
May 2003
New Haven
23×5×59 Posts

Quote:
Ax = b

You can left-multiply because A^-1(Ax) = A^-1(Ax) is an identity; then A^-1(Ax) = A^-1 b by substitution of likes for likes; then (A^-1 A)x = A^-1 b by associativity, and since A^-1 A = I, that gives x = A^-1 b.

There is not a set of basic operations that creates (A A^-1)x on the left. You don't get to just write anything that strikes your fancy. You must be able to justify every step.

William