Math Is Fun Forum
  Discussion about math, puzzles, games and fun.


#1 2013-11-18 09:50:07

Bezoux
Novice

Offline

Polynomials and matrixes

Hi everyone,
I've encountered a problem while studying matrices.
A={{2,1},{3,-1}}, B={{4,-2},{3,-1}}
Prove that there does not exist a polynomial with real coefficients such that p(A)=B or p(B)=A.
I've read up on eigenvalues, eigenvectors, characteristic polynomials and diagonalization, but nothing seems to make sense; the whole thing gets way too complicated for a high school problem.
I think there's a way to do this without using any of the aforementioned.
Can you help me out, please?

Last edited by Bezoux (2013-11-18 09:51:04)

#2 2013-11-19 00:48:05

bob bundy
Moderator

Offline

Re: Polynomials and matrixes

hi Bezoux

Welcome to the forum.

I have to admit straight away that I cannot do this question.  I held off making a post in the hope someone else would, but it doesn't look like that is going to happen, so I'll jump in with what I have.  Maybe someone will notice who can put us both straight. :)

Firstly, I'm assuming those are 2 by 2 matrices.  In Latex they display as

\[A=\begin{pmatrix}2&1\\3&-1\end{pmatrix}\]

and

\[B=\begin{pmatrix}4&-2\\3&-1\end{pmatrix}\]
Now, what makes you think that this is a question about eigenvalues?  Here's how to get a characteristic polynomial for a square matrix:

\[\begin{pmatrix}2&1\\3&-1\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix}=\lambda\begin{pmatrix}x\\y\end{pmatrix}\]

Multiplying and equating the first and second entries:

\[2x+y=\lambda x\qquad\qquad 3x-y=\lambda y\]

Solving for lambda:

\[\lambda^2-\lambda-5=0\]

And by a similar method for B:

\[\lambda^2-3\lambda+2=0\]

These are called the characteristic polynomials for A and for B.
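For a 2 by 2 matrix the characteristic polynomial is just λ² − (trace)λ + (determinant), so the two quadratics above can be double-checked with a few lines of code. (An illustrative sketch, not from the thread; the matrices are the ones in the original problem, and `char_poly_2x2` is a name of my own choosing.)

```python
def char_poly_2x2(m):
    """Return (p, q) so the characteristic polynomial is x^2 + p*x + q."""
    (a, b), (c, d) = m
    trace = a + d
    det = a * d - b * c
    return (-trace, det)

A = [[2, 1], [3, -1]]
B = [[4, -2], [3, -1]]

print(char_poly_2x2(A))  # (-1, -5)  ->  lambda^2 - lambda - 5
print(char_poly_2x2(B))  # (-3, 2)   ->  lambda^2 - 3*lambda + 2
```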

But what has that got to do with

p(A)=B or p(B)=A.

If p(A) means 'the characteristic polynomial' then it cannot be equal to a matrix.  They just aren't the same thing.

So what does p(A) mean?  Do you have anything in your notes / textbook that tells us, because I don't recognise the notation.

The only thing I can think of is this:

\[p(A)=a_nA^n+a_{n-1}A^{n-1}+\dots+a_1A\]

where the small letters are the coefficients of a normal polynomial and there are a number of powers of A.

[note:  There can be no 'constant' matrix at the end, because it would be easy to make any sum of matrices equal to B by a suitable choice of constant term.]

If that is correct, then we have to show that no combination of powers of A, multiplied by coefficients, can ever sum to give B (and similarly the other way round).  At the moment I cannot think how to do that; but I'm working on it.

RSVP

Bob


You cannot teach a man anything;  you can only help him find it within himself..........Galileo Galilei

#3 2013-11-19 01:54:59

Nehushtan
Power Member

Offline

Re: Polynomials and matrixes

Bezoux is referring to a matrix polynomial:

http://en.wikipedia.org/wiki/Matrix_polynomial

(Not to be confused with “polynomial matrix”, which is a matrix whose entries are polynomials.)


134 books currently added on Goodreads

#4 2013-11-19 05:49:44

bob bundy
Moderator

Offline

Re: Polynomials and matrixes

hi Nehushtan

Many thanks for that info.  I'd not met that before.  Any idea about how to do the problem?

Bob



#5 2013-11-19 10:35:43

Nehushtan
Power Member

Offline

Re: Polynomials and matrixes

Not a clue, unfortunately. :(

The Wikipedia article is very badly written. Also, Wolfram MathWorld has a different definition of matrix polynomial (http://mathworld.wolfram.com/MatrixPolynomial.html), defining it as a polynomial with matrix coefficients rather than a matrix variable – but I think the Wikipedia definition makes more sense. In other words, if p(x) is a polynomial, p(A) is the matrix polynomial obtained by replacing the variable x by the matrix A and the constant term a_0 by a_0 I_2, where I_2 (= A^0) is the 2×2 identity matrix.
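Under that Wikipedia-style definition, p(A) can be evaluated by Horner's rule, with the constant term contributing a_0 I_2. A minimal pure-Python sketch (the helper names are my own, not from the thread); as a sanity check it evaluates A's own characteristic polynomial x² − x − 5 at A, which gives the zero matrix by the Cayley-Hamilton theorem:

```python
def mat_mul(x, y):
    # 2x2 matrix product
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(x, y):
    return [[x[i][j] + y[i][j] for j in range(2)] for i in range(2)]

def scalar_mul(c, x):
    return [[c * x[i][j] for j in range(2)] for i in range(2)]

def mat_poly(coeffs, a):
    """Evaluate p(A) by Horner's rule; coeffs = [a_n, ..., a_1, a_0]."""
    identity = [[1, 0], [0, 1]]
    result = [[0, 0], [0, 0]]
    for c in coeffs:
        # result := result*A + c*I  (the constant term becomes a_0 * I)
        result = mat_add(mat_mul(result, a), scalar_mul(c, identity))
    return result

A = [[2, 1], [3, -1]]
# p(x) = x^2 - x - 5 evaluated at A: zero matrix, by Cayley-Hamilton
print(mat_poly([1, -1, -5], A))  # [[0, 0], [0, 0]]
```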

I’ve Google-searched but found very little in the way of help on tackling problems of this sort.

Last edited by Nehushtan (2013-11-19 10:41:54)



#6 2013-11-21 09:56:40

Bezoux
Novice

Offline

Re: Polynomials and matrixes

Essentially, my first idea was to assume the opposite, i.e. that there is a polynomial such that p(A)=B, for example (I'd do the p(B)=A one separately).
Then

\[a_nA^n+a_{n-1}A^{n-1}+\dots+a_1A+a_0I=B,\]

so if I find A^n for any natural number n (which can be done via diagonalization, if I'm not mistaken), I could get a contradiction out of it, but the process is a bit too complicated for such ugly numbers (the roots of the first quadratic equation have square roots of 21 in them).
I found a very useful link on finding the nth power of a matrix (I can't link it unfortunately, but it's the first result that comes up when you google "finding the nth power of a matrix").

Still, something makes me think there's a much more elegant solution that I'm not seeing.
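One route that avoids both diagonalization and the √21 roots: by the Cayley-Hamilton theorem (a standard fact, not invoked in the thread), A satisfies its own characteristic polynomial, so A² = A + 5I, and therefore every power A^n is c_n·A + d_n·I for integers given by a simple recurrence. A sketch, with a hypothetical helper name:

```python
def power_coeffs(n):
    """Return (c, d) with A^n = c*A + d*I, for n >= 1,
    using A^2 = A + 5I (Cayley-Hamilton for A = [[2,1],[3,-1]])."""
    c, d = 1, 0  # A^1 = 1*A + 0*I
    for _ in range(n - 1):
        # A^(k+1) = c*A^2 + d*A = c*(A + 5I) + d*A = (c+d)*A + 5c*I
        c, d = c + d, 5 * c
    return c, d

print(power_coeffs(2))  # (1, 5): A^2 = A + 5I
print(power_coeffs(3))  # (6, 5): A^3 = 6A + 5I
```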

#7 2013-11-21 10:08:28

anonimnystefy
Real Member

Online

Re: Polynomials and matrixes

Hi

The matrix B has nicer eigenvalues.


The limit operator is just an excuse for doing something you know you can't.
“It's the subject that nobody knows anything about that we can all talk about!” ― Richard Feynman
“Taking a new step, uttering a new word, is what people fear most.” ― Fyodor Dostoyevsky, Crime and Punishment

#8 2013-11-21 18:42:10

bob bundy
Moderator

Offline

Re: Polynomials and matrixes

Bezoux wrote:

my first idea was to assume the opposite, i.e. that there is a polynomial such that p(A)=B, for example (I'd do the p(B)=A one separately).

I wondered if that would work too.  I was hoping that the powers of A would have some common feature that would be incompatible with the equivalent in B, but nothing obvious occurs.

I'm also trying to construct a series of geometric transformations equivalent to A and also for B, in the hope that something will show the required result. 

Stefy wrote:

The matrix B has nicer eigenvalues.

Yes, but how do eigenvalues enter into this problem anyway?

Bob



#9 2013-11-21 21:20:55

anonimnystefy
Real Member

Online

Re: Polynomials and matrixes

You need eigenvalues to diagonalize a matrix.



#10 2013-11-22 02:01:22

bob bundy
Moderator

Offline

Re: Polynomials and matrixes

But why would I want to do that to solve this problem?

B



#11 2013-11-22 05:54:09

anonimnystefy
Real Member

Online

Re: Polynomials and matrixes

Because you can then get the general form of A^n or B^n.
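For instance, since B's characteristic polynomial λ² − 3λ + 2 gives eigenvalues 1 and 2, diagonalizing B = P·diag(1, 2)·P⁻¹ (with eigenvector columns such as (2,3) and (1,1)) works out to a closed form for B^n. A sketch that verifies the formula against repeated multiplication (function names are my own):

```python
def b_power(n):
    """Closed form for B^n, n >= 0, B = [[4,-2],[3,-1]],
    obtained from B = P * diag(1, 2) * P^(-1) with P = [[2,1],[3,1]]."""
    t = 2 ** n
    return [[3 * t - 2, 2 - 2 * t],
            [3 * t - 3, 3 - 2 * t]]

def mat_mul(x, y):
    # 2x2 matrix product
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B = [[4, -2], [3, -1]]
m = [[1, 0], [0, 1]]
for n in range(5):
    assert m == b_power(n)  # closed form matches B*B*...*B
    m = mat_mul(m, B)
print(b_power(3))  # [[22, -14], [21, -13]]
```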


