Math Is Fun Forum





Topic review (newest first)

2013-11-22 05:54:09

Because you can then get the general form of A^n or B^n.
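For instance, B's eigenvalues work out to 1 and 2 (with eigenvectors (2,3) and (1,1) — my own working, worth rechecking by hand), which gives a closed form for B^n via B^n = P·diag(1, 2^n)·P⁻¹. A rough Python sanity check of that formula:

```python
# Sketch: closed form for B^n via diagonalization of B = [[4,-2],[3,-1]].
# Assumes eigenvalues 1, 2 with eigenvectors (2,3), (1,1), i.e. P = [[2,1],[3,1]].

def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def B_power(n):
    """B^n = P * diag(1, 2^n) * P^{-1}, multiplied out by hand."""
    return [[-2 + 3 * 2**n, 2 - 2 * 2**n],
            [-3 + 3 * 2**n, 3 - 2 * 2**n]]

# Compare against repeated multiplication.
B = [[4, -2], [3, -1]]
M = [[1, 0], [0, 1]]          # B^0
for n in range(6):
    assert M == B_power(n), n
    M = mat_mul(M, B)
print("closed form agrees with B^0 .. B^5")
```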

bob bundy
2013-11-22 02:01:22

But why would I want to do that to solve this problem?


2013-11-21 21:20:55

You need eigenvalues to diagonalize a matrix.

bob bundy
2013-11-21 18:42:10

Bezoux wrote:

my first idea was to assume the opposite, i.e. that there is a polynomial such that p(A)=B, for example (I'd do the p(B)=A one separately).

I wondered if that would work too.  I was hoping that the powers of A would have some common feature that would be incompatible with the equivalent in B, but nothing obvious occurs.

I'm also trying to construct a sequence of geometric transformations equivalent to A, and likewise for B, in the hope that something will show the required result.

Stefy wrote:

The matrix B has nicer eigenvalues.

Yes, but how do eigenvalues enter into this problem anyway?


2013-11-21 10:08:28


The matrix B has nicer eigenvalues.

2013-11-21 09:56:40

Essentially, my first idea was to assume the opposite, i.e. that there is a polynomial such that p(A)=B, for example (I'd do the p(B)=A one separately).

So if I find A^n for any natural number n (which can be done via diagonalization, if I'm not mistaken), I could get a contradiction out of it, but the process is a bit too complicated for such ugly numbers (the roots of the first quadratic equation have square roots of 21 in them).
I found a very useful link on finding the nth power of a matrix (I can't link it unfortunately, but it's the first result that comes up when you google "finding the nth power of a matrix").

Still, something makes me think there's a much more elegant solution that I'm not seeing.

2013-11-19 10:35:43

Not a clue, unfortunately.

The Wikipedia article is very badly written. Also, Wolfram MathWorld has a different definition of "matrix polynomial", defining it as a polynomial with matrix coefficients rather than matrix variables – but I think the Wikipedia definition makes more sense. In other words, if p(x) is a polynomial, p(A) is the matrix polynomial obtained by replacing the variable x by the matrix A and the constant term a₀ by a₀I₂, where I₂ (= A⁰) is the 2×2 identity matrix.
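Under that definition, evaluating p(A) is mechanical. A small Python sketch (the helper names are mine); as a check it uses the Cayley–Hamilton fact that a matrix satisfies its own characteristic polynomial, with the B from this thread:

```python
def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def eval_matrix_poly(coeffs, A):
    """p(A) for p(x) = a0 + a1*x + ... + an*x^n, with coeffs = [a0, a1, ..., an].
    The constant term a0 is replaced by a0 * I2, per the definition above."""
    result = [[0, 0], [0, 0]]
    power = [[1, 0], [0, 1]]          # A^0 = I2
    for a in coeffs:
        result = [[result[i][j] + a * power[i][j] for j in range(2)]
                  for i in range(2)]
        power = mat_mul(power, A)
    return result

B = [[4, -2], [3, -1]]
# Cayley-Hamilton: B satisfies its own characteristic polynomial x^2 - 3x + 2.
print(eval_matrix_poly([2, -3, 1], B))   # [[0, 0], [0, 0]]
```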

I’ve Google-searched but found very little in the way of help on tackling problems of this sort.

bob bundy
2013-11-19 05:49:44

hi Nehushtan

Many thanks for that info.  I'd not met that before.  Any idea about how to do the problem?


2013-11-19 01:54:59

Bezoux is referring to a matrix polynomial:

(Not to be confused with “polynomial matrix”, which is a matrix whose entries are polynomials.)

bob bundy
2013-11-19 00:48:05

hi Bezoux

Welcome to the forum.

I have to admit straight away that I cannot do this question.  I held off making a post in the hope someone else would, but it doesn't look like that is going to happen, so I'll jump in with what I have.  Maybe someone will notice who can put us both straight.

Firstly, I'm assuming those are 2 by 2 matrices.  This is how to display them; click the matrix and you will see the underlying LaTeX code.


Now, what makes you think that this is a question about eigenvalues?  Here's how to get a characteristic polynomial for a square matrix:

Multiplying and equating the first and second entries:

Solving for lambda:

And by a similar method for B

These are called the characteristic polynomials for A and for B.
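Since the worked equations don't render here, a quick way to recover those polynomials: for a 2×2 matrix M, the characteristic polynomial is λ² − tr(M)·λ + det(M). A Python sketch (my own check, not the original working above):

```python
import math

def char_poly_2x2(M):
    """Coefficients (1, -trace, det) of lambda^2 - tr(M)*lambda + det(M)."""
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return (1, -tr, det)

def roots(a, b, c):
    """Real roots of a quadratic via the usual formula (assumes real roots)."""
    d = math.sqrt(b * b - 4 * a * c)
    return ((-b - d) / (2 * a), (-b + d) / (2 * a))

A = [[2, 1], [3, -1]]
B = [[4, -2], [3, -1]]
print(char_poly_2x2(A), roots(*char_poly_2x2(A)))  # lambda^2 - lambda - 5, roots (1 +/- sqrt(21))/2
print(char_poly_2x2(B), roots(*char_poly_2x2(B)))  # lambda^2 - 3*lambda + 2, roots 1 and 2
```

This matches both earlier remarks: A's eigenvalues involve √21 ("ugly numbers"), while B's are simply 1 and 2 ("nicer eigenvalues").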

But what has that got to do with

p(A)=B or p(B)=A.

If p(A) means 'the characteristic polynomial' then it cannot be equal to a matrix.  They just aren't the same thing.

So what does p(A) mean?  Do you have anything in your notes / textbook that tells us, because I don't recognise the notation.

The only thing I can think of is this:

p(A) = a_1 A + a_2 A^2 + ... + a_n A^n

where the small letters are the coefficients of a normal polynomial and there are a number of powers of A.

[note:  There can be no 'constant' matrix at the end, because it would be easy to make any sum of matrices equal to B by a suitable choice of constant term.]

If that is correct, then we have to show that no combination of powers of A, multiplied by coefficients, can ever sum to give B (and similarly the other way round).  At the moment I cannot think how to do that; but I'm working on it.
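One candidate for such a common feature (my own observation, not from the thread): by Cayley–Hamilton, A² = A + 5I for this A (trace 1, determinant −5), so every power of A — and hence any sum of powers with coefficients — has the form aA + bI. Its off-diagonal entries are then a and 3a, in the ratio 1:3, whereas B's off-diagonal entries are −2 and 3. A quick Python check of the first few powers:

```python
def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [3, -1]]
M = A
for n in range(1, 10):
    # If M = a*A + b*I, then a = M[0][1] (because A[0][1] = 1)
    # and b = M[0][0] - 2*a.  Verify the remaining entries agree.
    a, b = M[0][1], M[0][0] - 2 * M[0][1]
    assert M == [[2 * a + b, a], [3 * a, -a + b]], n
    M = mat_mul(M, A)
print("A^1 .. A^9 are all of the form a*A + b*I")
```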



2013-11-18 09:50:07

Hi everyone,
I've encountered a problem while studying matrices.
A={{2,1},{3,-1}}, B={{4,-2},{3,-1}}
Prove that there does not exist a polynomial with real coefficients such that p(A)=B or p(B)=A.
I've read up on eigenvalues, eigenvectors, characteristic polynomials and diagonalization, but nothing seems to be making sense, as the whole thing gets way too complicated for a high school problem.
I think there's a way to do this without using any of the aforementioned.
Can you help me out, please?
