Topic review (newest first)
Because you can then get the general form of A^n or B^n.
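As a sketch of why that helps (the matrix here is made up for illustration; the thread's A and B are never stated): diagonalizing A = P D P⁻¹ gives Aⁿ = P Dⁿ P⁻¹, which yields a closed form for the nth power.

```python
import sympy as sp

# Illustrative 2x2 matrix (not the A from this thread)
A = sp.Matrix([[2, 1], [0, 3]])

# Diagonalize: A = P * D * P^{-1}, so A**n = P * D**n * P^{-1}
P, D = A.diagonalize()
n = sp.symbols('n', integer=True, nonnegative=True)
A_n = P * sp.diag(*[d**n for d in D.diagonal()]) * P.inv()

print(sp.simplify(A_n))
```

Substituting a concrete n into `A_n` reproduces the ordinary matrix power, which is the "general form of A^n" being referred to.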
But why would I want to do that to solve this problem?
You need eigenvalues to diagonalize a matrix.
I wondered if that would work too. I was hoping that the powers of A would have some common feature that would be incompatible with the equivalent in B, but nothing obvious occurs.
Yes, but how do eigenvalues enter into this problem anyway?
Essentially, my first idea was to assume the opposite, i.e. that there is a polynomial such that p(A)=B, for example (I'd do the p(B)=A one separately).
I found a very useful link on finding the nth power of a matrix (I can't link it unfortunately, but it's the first result that comes up when you google "finding the nth power of a matrix").
Still, something makes me think there's a much more elegant solution that I'm not seeing.
Not a clue, unfortunately.
Bezoux is referring to a matrix polynomial, i.e. an expression of the form p(A) = a_n A^n + ... + a_1 A + a_0 I.
Now, what makes you think that this is a question about eigenvalues? Here's how to get a characteristic polynomial for a square matrix:
Multiplying and equating the first and second entries:
Solving for lambda:
And by a similar method for B
These are called the characteristic polynomials for A and for B.
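As a concrete check of the method described above (the 2x2 matrix is again illustrative, since the thread's A and B aren't shown), the characteristic polynomial is det(A - lambda*I), and its roots are the eigenvalues:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1], [0, 3]])  # illustrative matrix only

# Characteristic polynomial: det(A - lambda*I)
char_poly = (A - lam * sp.eye(2)).det()
print(sp.expand(char_poly))

# The roots of the characteristic polynomial are the eigenvalues
print(sorted(sp.solve(char_poly, lam)))  # [2, 3]
```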
But what has that got to do with p(A) = B?
If p(A) means 'the characteristic polynomial' then it cannot be equal to a matrix; a polynomial and a matrix just aren't the same thing. I take it the intended meaning is

p(A) = a_n A^n + a_(n-1) A^(n-1) + ... + a_2 A^2 + a_1 A = B

where the small letters are the coefficients of an ordinary polynomial, applied to various powers of A.
[note: There can be no 'constant' matrix at the end, because it would be easy to make any sum of matrices equal to B by a suitable choice of constant term.]
If that is correct, then we have to show that no combination of powers of A, multiplied by coefficients, can ever sum to give B (and similarly the other way round). At the moment I cannot think how to do that; but I'm working on it.
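One possible line of attack, sketched here with an illustrative matrix rather than the thread's A: by the Cayley-Hamilton theorem a square matrix satisfies its own characteristic polynomial, so for a 2x2 matrix every power A^k (k >= 2) reduces to a linear combination of A and I, and hence so does any polynomial in A.

```python
import sympy as sp

A = sp.Matrix([[2, 1], [0, 3]])  # illustrative matrix only
I = sp.eye(2)

# Cayley-Hamilton: A satisfies its characteristic polynomial,
# here lambda**2 - 5*lambda + 6, so A**2 = 5*A - 6*I.
assert A**2 - 5*A + 6*I == sp.zeros(2, 2)

# Hence a matrix polynomial p(A) = a3*A**3 + a2*A**2 + a1*A
# collapses to alpha*A + beta*I for some scalars alpha, beta.
a1, a2, a3 = 4, -1, 2  # arbitrary example coefficients
p_A = a3*A**3 + a2*A**2 + a1*A

# Reduce using A**2 = 5*A - 6*I:
# A**3 = 5*A**2 - 6*A = 5*(5*A - 6*I) - 6*A = 19*A - 30*I
alpha = a3*19 + a2*5 + a1
beta = a3*(-30) + a2*(-6)
assert p_A == alpha*A + beta*I
```

If something like this applies, then for 2x2 matrices the question of whether some p(A) equals B reduces to whether B can be written as alpha*A + beta*I, which is a much smaller space to rule out.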