Prove that if A∈M_nxn(F) has n distinct eigenvalues, then A is diagonalizable.
My idea: we know the characteristic polynomial factors as
f(t) = (t - λ1)^k1 (t - λ2)^k2 ∙∙∙ (t - λn)^kn,
where λi ≠ λj whenever i ≠ j.
If we can show k1 = k2 = ... = kn = 1, then A is diagonalizable.
But how?
Answers & Comments
Verified answer
The fact that k1 = k2 = ... = kn = 1 is given by the hypothesis that the eigenvalues are distinct. It only requires knowing that the characteristic polynomial of any n x n matrix has degree n. That is, since there are n distinct λ's,
f(t) = (t - λ1)(t - λ2)∙∙∙(t - λn)
which is a polynomial of degree n.
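As a quick numeric sanity check (not part of the proof; NumPy and the sample matrix below are my own illustration, not from the thread):

```python
import numpy as np

# A made-up 3 x 3 matrix with three distinct eigenvalues (2, 3, 5).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

# np.poly on a square matrix returns the coefficients of its characteristic
# polynomial, leading coefficient first.  Degree n means n + 1 coefficients.
coeffs = np.poly(A)
assert len(coeffs) == A.shape[0] + 1

# The roots of f(t) are the eigenvalues; since they are distinct,
# each one is a simple root (multiplicity 1).
roots = np.sort(np.roots(coeffs))
```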
The significant thing is that the combined eigenspace is n dimensional. This requires proof, but the proof consists of showing that there are n linearly independent eigenvectors.
To show this, consider any two eigenvalues λi and λj for i ≠ j. It is given that λi ≠ λj. Let u and v be corresponding eigenvectors (u for λi and v for λj). We have to show that they are linearly independent. To that end, consider the equation
c1 u + c2 v = 0.
Next, multiply through by A
c1 Au + c2 Av = A0 = 0 ==> c1 λi u + c2 λj v = 0.
From the first equation, we can write c1 u = -c2 v and substitute this into the second equation to get
- c2 λi v + c2 λj v = 0 ==> (λj - λi) c2 v = 0.
Now, v is an eigenvector and hence is nonzero, and λj - λi ≠ 0 because the eigenvalues are distinct. So c2 = 0 necessarily. Plugging this back into the first equation gives c1 = 0. So the only way for
c1 u + c2 v = 0 to hold is for c1 = c2 = 0.
Thus u and v are linearly independent. You can extend this argument by induction to show that any set of eigenvectors corresponding to distinct eigenvalues is linearly independent. Since A has n distinct eigenvalues, A has n linearly independent eigenvectors x1, x2, ..., xn.
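The independence claim can also be checked numerically (again an illustrative sketch with NumPy; the matrix is invented for the example):

```python
import numpy as np

# A made-up 2 x 2 matrix with two distinct eigenvalues (2 and 3).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# The columns of `vecs` are eigenvectors for the eigenvalues in `vals`.
vals, vecs = np.linalg.eig(A)

# The eigenvalues are distinct ...
assert len(set(np.round(vals, 8))) == A.shape[0]

# ... so the eigenvectors are linearly independent: the matrix whose
# columns are the eigenvectors has full rank.
assert np.linalg.matrix_rank(vecs) == A.shape[0]
```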
If you form the n x n matrix P whose columns are the eigenvectors x1, ..., xn, this matrix is invertible; it is the change-of-basis matrix for which
P^(-1)AP = D,
where D is the diagonal matrix with the eigenvalues on the diagonal.
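A small NumPy demonstration of this last step (the matrix is again a hypothetical example, not from the thread):

```python
import numpy as np

# Made-up matrix with distinct eigenvalues 4, 2, 7.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 7.0]])

# The columns of P are the eigenvectors x1, ..., xn.
vals, P = np.linalg.eig(A)

# With n distinct eigenvalues, P is invertible and P^(-1) A P is the
# diagonal matrix D with the eigenvalues on the diagonal.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(vals))
```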
I'm not sure how far you have to go with your proof. There is a theorem that states that A is diagonalizable if and only if the dimension of its combined eigenspace is n. If you have that theorem, then the above argument that the eigenvectors for distinct eigenvalues are linearly independent is all that is needed.