mersenneforum.org  

Old 2009-01-19, 21:23   #1
Joshua2

Plugging Matrices into Functions

I'm taking linear algebra and I'm not really understanding how plugging a matrix into a function works. We learned that evaluating f(x) = x^2 + 2x + 1 at a square matrix A gives f(A) = A^2 + 2A + I, that is, the matrix squared, plus 2 times the matrix, plus the identity matrix. Our teacher said you can basically plug anything into a function. But does the function still obey the laws of algebra?
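For concreteness, here is a quick numerical sketch of that definition (my own illustration in Python with NumPy; the particular matrix A is arbitrary):

```python
import numpy as np

def f(A):
    """Evaluate p(x) = x^2 + 2x + 1 at a square matrix A.

    The constant term 1 becomes the identity matrix, so
    f(A) = A @ A + 2*A + I.
    """
    I = np.eye(A.shape[0])
    return A @ A + 2 * A + I

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Since x^2 + 2x + 1 = (x + 1)^2, f(A) should equal (A + I)^2.
check = (A + np.eye(2)) @ (A + np.eye(2))
print(np.allclose(f(A), check))  # True
```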

For example, can you factor such a function? I have a question that says to show p1(A) = p2(A)p3(A) for any square matrix A, where p1(x) = x^2 + 9, p2(x) = x + 3, and p3(x) = x - 3. I verified it for a specific 2x2 matrix A, but I'm not sure how to generalize. I'd like to generalize even further than they are asking, so I can factor anything the way I can in regular algebra.
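A numerical sanity check is easy to run (an illustrative NumPy sketch, my addition; note that the polynomial that actually factors as (x+3)(x-3) is x^2 - 9, a sign that gets sorted out later in the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (2, 3, 4):
    A = rng.standard_normal((n, n))   # an arbitrary square matrix
    I = np.eye(n)
    p1 = A @ A - 9 * I                # p1(A) = A^2 - 9I
    p2p3 = (A + 3 * I) @ (A - 3 * I)  # p2(A) p3(A)
    assert np.allclose(p1, p2p3)
print("p1(A) = p2(A) p3(A) held for every matrix tested")
```

The factorization works for every square A because expanding (A+3I)(A-3I) uses only the distributive law; the cross terms -3A and +3A cancel without ever needing AB = BA for general matrices.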

I have a similar question later: show that if a square matrix A satisfies A^2 - 3A + I = 0, then A^-1 = 3I - A. It is trivial to show that it works for a specific matrix A, but I would like to understand how they came up with the second equation, not merely that it is true, so I can understand which operations are valid.
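The claim can at least be checked numerically; in this NumPy sketch the particular matrix is my own example of one satisfying the hypothesis, not one from the problem set:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0]])
I = np.eye(2)

# This A satisfies the hypothesis A^2 - 3A + I = 0:
assert np.allclose(A @ A - 3 * A + I, 0)

# Rearranging gives A(3I - A) = (3I - A)A = I, so 3I - A
# should be the inverse from both sides:
assert np.allclose(A @ (3 * I - A), I)
assert np.allclose((3 * I - A) @ A, I)
assert np.allclose(3 * I - A, np.linalg.inv(A))
print("A^-1 = 3I - A confirmed for this A")
```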

I don't really know the name of what I'm having trouble with, so I wasn't able to find anything useful on the internet. I'm taking the class at a community college, so the people in the math lab can't really help me; in fact, I work in our math lab myself.
Thanks so much for your help!

Old 2009-01-19, 23:17   #2
wblipp
"William"

Do you remember, back when mathematics was still called arithmetic, talking about the associative, distributive, and commutative laws? The associative and distributive laws still work for square matrices. The commutative law doesn't, except in some special cases.

If you find that confusing, remember that the Associative law says

(a + b) * c = a*c + b*c

but it doesn't tell you what multiplication and addition are. It just tells you that, whatever "+" and "*" mean, this rearrangement is valid. When you switch to matrices, you have new definitions of "+" and "*".

To show the inverse is 3I-A, just multiply by 3I-A and apply the reorganization rules. Don't forget to show it for both right-multiplication and left-multiplication - you don't have the Commutative law, so you can't be sure they give the same result.

Most things about factoring polynomials of a single variable depend only on the associative and distributive laws, so most of those things are still valid. Be careful with two variables, though. The associative and distributive laws are enough to show

(X+Y)*(X+Y) = X^2 + XY + YX + Y^2

But you need a commutative law to join the middle terms, so this is not (X^2 + 2XY + Y^2) unless you have special knowledge that in this particular case XY=YX.
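A concrete pair of matrices makes the point (illustrative NumPy sketch; X and Y are my own choice):

```python
import numpy as np

X = np.array([[0.0, 1.0],
              [0.0, 0.0]])
Y = np.array([[0.0, 0.0],
              [1.0, 0.0]])

lhs = (X + Y) @ (X + Y)
expanded = X @ X + X @ Y + Y @ X + Y @ Y  # distributive law only
shortcut = X @ X + 2 * (X @ Y) + Y @ Y    # needs XY = YX

print(np.allclose(lhs, expanded))  # True: always valid
print(np.allclose(lhs, shortcut))  # False: here XY != YX
```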

William
Old 2009-01-19, 23:37   #3
Joshua2

(a + b) * c = a*c + b*c is the distributive law, I believe.
So I have x^2 + 9 which I guess is (x+3) * (x-3). To check it, it equals x(x-3)+3(x-3) = x^2-3x + 3x - 9 = x^2 + 9. Since any matrix obeys the distributive law, we can use any matrix A for x. Does that prove it? I suppose I could do the same logic with (x-3) * (x+3) = x(x+3) - 3(x+3) = x^2 + 9. By generalizing this logic we could show that we can factor polynomials in a matrix just like we would in ordinary algebra.

For the 2nd question I was thinking of using the fact that A(3I - A) = (3I - A)A = I if A inverse = 3I - A. So 3A - A^2 = 3A - A^2 = I
So if I can show that if A^2 - 3A + I = 0 then 3A - A^2 = I, I have succeeded.
So I subtract I from both sides of the original A^2 - 3A + I and multiply through by -1 or -I, and that gives me 3A - A^2 = I. Does that work?
Old 2009-01-19, 23:44   #4
ewmayer

Quote: Originally Posted by Joshua2
For the 2nd question I was thinking of using the fact that A(3I - A) = (3I - A)A = I if A inverse = 3I - A. So 3A - A^2 = 3A - A^2 = I
So if I can show that if A^2 - 3A + I = 0 then 3A - A^2 = I, I have succeeded.
So I subtract I from both sides of the original A^2 - 3A + I and multiply through by -1 or -I, and that gives me 3A - A^2 = I. Does that work?
Yes - and you don't need to worry about commutativity in this case, because A commutes with its own powers, and multiplication by the identity matrix commutes with everything.

When you get to things like matrix exponentials, that's when the fun really begins. :) [Seriously - it is fun, if you like that sort of thing.]
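As a small taste of that fun (my own toy sketch, a truncated power series rather than a library routine): without commutativity, even the familiar rule e^X e^Y = e^(X+Y) breaks down.

```python
import numpy as np

def expm(A, terms=30):
    """Matrix exponential via its power series, sum over k of A^k / k!."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

X = np.array([[0.0, 1.0], [0.0, 0.0]])
Y = np.array([[0.0, 0.0], [1.0, 0.0]])

# XY != YX, so exp(X) exp(Y) need not equal exp(X + Y):
print(np.allclose(expm(X) @ expm(Y), expm(X + Y)))  # False
```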
Old 2009-01-20, 06:28   #5
S485122

Quote: Originally Posted by Joshua2
So I have x^2 + 9 which I guess is (x+3) * (x-3). To check it, it equals x(x-3)+3(x-3) = x^2-3x + 3x - 9 = x^2 + 9. I suppose I could do the same logic with (x-3) * (x+3) = x(x+3) - 3(x+3) = x^2 + 9.
Don't you mean:
Quote:
So I have x^2 - 9 which I guess is (x+3) * (x-3). To check it, it equals x(x-3)+3(x-3) = x^2-3x + 3x - 9 = x^2 - 9. I suppose I could do the same logic with (x-3) * (x+3) = x(x+3) - 3(x+3) = x^2 - 9
Jacob
Old 2009-01-20, 15:35   #6
Orgasmic Troll

Quote: Originally Posted by Joshua2
I'm taking linear algebra and I'm not really understanding how plugging a matrix into a function works. We learned that evaluating f(x) = x^2 + 2x + 1 at a square matrix A gives f(A) = A^2 + 2A + I, that is, the matrix squared, plus 2 times the matrix, plus the identity matrix. Our teacher said you can basically plug anything into a function. But does the function still obey the laws of algebra?
This is the point of the class; that's why they call it linear algebra. You're learning how to do algebra with matrices. (The "linear" part comes in because every linear transformation has an associated matrix, and every matrix defines a linear transformation.)

I just wanted to point this out, because at community colleges, it's usually taught as "Manipulating Matrices For No Apparent Reason".

Also, if you plan to go on to higher math, this will probably be one of the most important classes you ever take, but no one will tell you this until you've forgotten almost everything from it.
Old 2009-01-20, 18:47   #7
Mr. P-1

Quote: Originally Posted by Joshua2
(a + b) * c = a*c + b*c is the distributive law, I believe.
That's correct. The associative law for multiplication states that (a * b) * c = a * (b * c). An associative law also applies to addition.

Last fiddled with by ewmayer on 2009-01-20 at 19:08 Reason: Your honor, the user of these weapons of math instruction pleads "guilt by association".
Old 2009-01-21, 01:26   #8
Joshua2

Yes, I did get my negative and positive mixed up there :)
I didn't realize that the class was so important, or that it was just regular algebra using functions. Thanks guys!

I had a question in class today that my teacher didn't know the answer to. I have 3 equations and 3 unknowns, so in matrix form I have Ax = B. So I multiply both sides by A^-1 to get the values in x. He said I have to compute A^-1 B with the inverse on the left, since if I do A x A^-1 it won't equal x on the left. I said couldn't you do A * A^-1 * x = x, and he said there is no middle to plug into. If we let CD = B, then could we do A * A^-1 * x = C * A^-1 * D, which is x = something we can compute?
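The left-multiplication route is easy to see numerically (a NumPy sketch; the random system is my own example):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))  # 3 equations, 3 unknowns
B = rng.standard_normal(3)

x = np.linalg.inv(A) @ B         # left-multiply: x = A^-1 B
assert np.allclose(A @ x, B)     # x really solves Ax = B

# Right-multiplying, A x A^-1, would not isolate x, because
# matrix multiplication is not commutative.  In practice one
# uses solve(), which avoids forming the inverse explicitly:
assert np.allclose(np.linalg.solve(A, B), x)
print("x = A^-1 B solves the system")
```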

Old 2009-01-21, 02:15   #9
cheesehead
"Richard B. Woods"

Quote: Originally Posted by Joshua2
If we let CD = B
Do you think that factoring B will be easier than simply putting the inverse on the left, instead of on the right, in the multiplication?

Quote:
x = something we can compute.
The product A^-1 B is something we can compute, and it doesn't require factoring B.

Old 2009-01-21, 03:46   #10
Joshua2

Right. It is more of an intellectual exercise than a way to make life easier. It can be done the simple way, but I don't really understand left vs. right vs. middle multiplication. At least I kind of do, but I want to know whether what I said is possible, so I know I'm understanding the concepts.

In Gaussian elimination, can you add row 1 + row 2 + row 3 and put the result somewhere like row 3 if it makes a convenient zero? I know that isn't a single elementary row operation, so it might cause trouble, but it should be the same as something that is legal. Like row1 + row2, store in row2; row2 + row3, store in row2; row2 - row1, store in row2.
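The combined move can indeed be decomposed into legal steps; a NumPy sketch with a made-up matrix, writing r3 <- r1 + r2 + r3 as two elementary row operations:

```python
import numpy as np

M = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])

target = M.copy()
target[2] = M[0] + M[1] + M[2]  # the combined move on row 3

step = M.copy()
step[2] = step[2] + step[0]     # r3 <- r3 + r1  (elementary)
step[2] = step[2] + step[1]     # r3 <- r3 + r2  (elementary)
assert np.allclose(step, target)
print("r3 <- r1 + r2 + r3 is a composition of legal row operations")
```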
Old 2009-01-21, 04:58   #11
wblipp
"William"

Quote: Originally Posted by Joshua2
He said I have to compute A^-1 B with the inverse on the left, since if I do A x A^-1 it won't equal x on the left. I said couldn't you do A * A^-1 * x = x, and he said there is no middle to plug into. If we let CD = B, then could we do A * A^-1 * x = C * A^-1 * D, which is x = something we can compute?
Back to basics. From the equation

Ax = b

you can left-multiply because

A^-1(Ax) = A^-1(Ax) is an identity, then
A^-1(Ax) = A^-1 b by substitution of likes for likes, then
(A^-1 A)x = A^-1 b by associativity.

There is no sequence of basic operations that creates (A A^-1)x on the left.

You don't get to just write anything that strikes your fancy. You must be able to justify every step.

William