Similarity of matrices based on polynomials












For $A,B\in \mathbb{R}^{n,n}$ we know that the characteristic polynomials are $p_A(x)=p_B(x)=(x-\lambda_1)(x-\lambda_2)\cdots(x-\lambda_n)$, where $\lambda_i\neq\lambda_j$ for $i\neq j$. Prove that $A$ and $B$ are similar matrices.

matrices eigenvalues-eigenvectors

asked 1 hour ago by avan1235 (edited 44 mins ago)
  • Note that $A,B$ are similar to the same diagonal matrix. – Song, 1 hour ago

  • Do you mean the diagonal matrix with the eigenvalues? – avan1235, 1 hour ago

  • Yes, with the eigenvalues of $A$ and $B$. – Song, 1 hour ago

  • But how to prove this formally? – avan1235, 1 hour ago

  • The question should specify what $p_A$ and $p_B$ mean. I suppose these are characteristic polynomials of the respective matrices, but this is not mentioned. – Marc van Leeuwen, 54 mins ago
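For concreteness, here is a small numerical illustration of the hypothesis in the question (a sketch only, with arbitrary illustrative matrices; it demonstrates the setup, not the proof): two matrices built to share the characteristic polynomial $(x-1)(x-2)(x-3)$, whose coefficients NumPy's `np.poly` computes.

    # Sketch: two matrices sharing a characteristic polynomial
    # with distinct roots (matrices and seed are arbitrary).
    import numpy as np

    rng = np.random.default_rng(0)
    D = np.diag([1.0, 2.0, 3.0])                 # distinct eigenvalues
    X = rng.normal(size=(3, 3))                  # generic, hence invertible
    Y = rng.normal(size=(3, 3))                  # with probability 1
    A = X @ D @ np.linalg.inv(X)
    B = Y @ D @ np.linalg.inv(Y)

    # np.poly returns the coefficients of the characteristic polynomial.
    print(np.allclose(np.poly(A), np.poly(B)))   # True: p_A = p_B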


















4 Answers
The criterion that

$p_A(x) = p_B(x) = \displaystyle \prod_1^n (x - x_i) \tag 1$

with

$i \ne j \Longrightarrow x_i \ne x_j \tag 2$

implies that the eigenvalues of $A$ and $B$ are distinct; hence each matrix is similar to the diagonal matrix

$D = [\delta_{kl} x_l]; \tag 3$

thus there exist invertible matrices $P$ and $Q$ such that

$PAP^{-1} = D = QBQ^{-1}; \tag 4$

but this implies

$A = P^{-1}QBQ^{-1}P = (P^{-1}Q)B(P^{-1}Q)^{-1}, \tag 5$

which shows that $A$ and $B$ are similar. $OE\Delta$.

Nota Bene: In a comment to this answer, our OP avan1235 asks why $A$ and $B$ are similar to $D$; this may be seen as follows. Considering first the matrix $A$, we see that since the $x_i$ are distinct, each corresponds to a distinct eigenvector $\vec e_i$:

$A \vec e_i = x_i \vec e_i; \tag 6$

now we may form the matrix $E$ whose columns are the $\vec e_i$:

$E = [\vec e_1 \; \vec e_2 \; \ldots \; \vec e_n]; \tag 7$

it is easy to see that

$AE = [A\vec e_1 \; A\vec e_2 \; \ldots \; A\vec e_n] = [x_1 \vec e_1 \; x_2 \vec e_2 \; \ldots \; x_n \vec e_n]; \tag 8$

likewise,

$ED = [x_1 \vec e_1 \; x_2 \vec e_2 \; \ldots \; x_n \vec e_n]; \tag 9$

thus

$AE = ED; \tag{10}$

now since the $x_i$ are distinct, the $\vec e_i$ are linearly independent, whence $E$ is an invertible matrix; therefore we have

$E^{-1}AE = D; \tag{11}$

the same logic of course applies to $B$. End of Note.

answered 1 hour ago by Robert Lewis (edited 58 mins ago)






  • Why is each matrix similar to $D$? – avan1235, 1 hour ago

  • @avan1235: check out the edits to my answer, under the heading Nota Bene! – Robert Lewis, 57 mins ago
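As a numerical sanity check of equations (4), (5) and (11) above (a hedged sketch; the names `EA`, `EB`, `S` are illustrative, with `S` playing the role of $P^{-1}Q$):

    # Build A and B with the same distinct eigenvalues, then recover an
    # explicit similarity S with A = S B S^{-1}, as in (5).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 4
    D = np.diag([1.0, 2.0, 3.0, 4.0])            # distinct eigenvalues
    X = rng.normal(size=(n, n))
    Y = rng.normal(size=(n, n))
    A = X @ D @ np.linalg.inv(X)
    B = Y @ D @ np.linalg.inv(Y)

    wA, EA = np.linalg.eig(A)                    # columns of EA: eigenvectors of A
    wB, EB = np.linalg.eig(B)
    EA = EA[:, np.argsort(wA.real)]              # align the eigenvalue order
    EB = EB[:, np.argsort(wB.real)]

    S = EA @ np.linalg.inv(EB)                   # the matrix P^{-1}Q of (5)
    print(np.allclose(A, S @ B @ np.linalg.inv(S)))   # True: A = S B S^{-1}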






















We can prove that if $p_A(t)=(t-\lambda_1)(t-\lambda_2)\cdots(t-\lambda_n)$ where $\lambda_i\ne \lambda_j$ for $i\ne j$, then the eigenvectors $x_i\in\ker(A-\lambda_iI)$, $i\le n$, are linearly independent, hence form a basis.

We proceed by induction. Assume $x_i$, $i<k$, are linearly independent. Consider
$$
\sum_{i=1}^k \alpha_i x_i = 0.\tag{*}
$$
By left-multiplying by $A$, we get
$$
\sum_{i=1}^k \alpha_i \lambda_i x_i = 0.
$$
Subtracting $\lambda_k$ times $\text{(*)}$, this leads to
$$
\sum_{i=1}^{k-1} \alpha_i (\lambda_i-\lambda_k) x_i = 0.
$$
By the assumption that $x_i$, $i<k$, are linearly independent, it follows that $\alpha_i(\lambda_i-\lambda_k)=0$ for all $i<k$. Since $\lambda_i\ne \lambda_k$, we have $\alpha_i=0$ for all $i<k$, hence $\alpha_k=0$. This shows that $x_i$, $i\le k$, are linearly independent, and by induction the $x_i$, $i\le n$, are linearly independent.

Now, since there exists a basis $\{x_i,\ i\le n\}$ consisting of eigenvectors of $A$, it follows that $A$ is diagonalizable: $A$ is similar to $D=\operatorname{diag}(\lambda_1,\dots,\lambda_n)$. The same argument applies to $B$, so $A$ and $B$ are both similar to $D$ and hence to each other.

answered 1 hour ago by Song
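As a small illustration of the conclusion (a sketch, not part of the induction itself): for a matrix with distinct eigenvalues, the matrix whose columns are the eigenvectors $x_i$ has full rank, so it diagonalizes $A$.

    # The eigenvector matrix X of a matrix with distinct eigenvalues
    # is invertible, and X^{-1} A X is diagonal.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5
    lam = np.arange(1, n + 1, dtype=float)       # distinct eigenvalues 1..5
    T = rng.normal(size=(n, n))
    A = T @ np.diag(lam) @ np.linalg.inv(T)      # A has eigenvalues lam

    w, X = np.linalg.eig(A)                      # columns of X: eigenvectors x_i
    print(np.linalg.matrix_rank(X) == n)         # True: the x_i form a basis
    print(np.allclose(np.linalg.inv(X) @ A @ X, np.diag(w)))   # True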







Matrices are similar whenever they are related by a change of basis, and similarity is an equivalence relation, so it suffices to find a separate change of basis for each matrix that results in the same matrix in both cases. A matrix is diagonalisable if and only if some change of basis (namely one to a basis of eigenvectors) results in a diagonal matrix, and the diagonal entries of that diagonal matrix then are the eigenvalues associated to the respective eigenvectors of the basis used.

You appear to know the result that whenever the characteristic polynomial of a matrix splits into distinct factors $x-\lambda_i$ as given in the question, the matrix is diagonalisable with those $\lambda_i$ as eigenvalues. Since you are given that this holds for $A$ and $B$, both are similar to the same diagonal matrix, and you are done.

answered 42 mins ago by Marc van Leeuwen
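In symbols, the composition step this answer relies on is a one-line derivation (with $P$ and $Q$ the two change-of-basis matrices taking $A$ and $B$ to the same diagonal matrix $D$):

    % P^{-1} A P = D = Q^{-1} B Q, so eliminating D:
    % A = P D P^{-1} = P Q^{-1} B Q P^{-1} = (P Q^{-1}) B (P Q^{-1})^{-1}.
    \[
      P^{-1}AP = D = Q^{-1}BQ
      \;\Longrightarrow\;
      A = (PQ^{-1})\,B\,(PQ^{-1})^{-1},
    \]

so $A$ and $B$ are similar.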







As noted in the comments, $A$ and $B$ are similar to the same diagonal matrix, the matrix of eigenvalues. This is true whenever there is a basis consisting of eigenvectors.

It's easy to see that $P^{-1}AP=D$, where the columns of $P$ are the eigenvectors and $D$ is diagonal with the eigenvalues on the diagonal.

answered 1 hour ago by Chris Custer






  • Why are they similar to this diagonal matrix? – avan1235, 1 hour ago

  • To see it, multiply on the left by $P$ on both sides. Then note that if $\{v_1,\dots,v_n\}$ is the basis of eigenvectors, you just get $Av_i=\lambda_i v_i$, as you go column by column, doing the matrix multiplication. – Chris Custer, 57 mins ago
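A minimal numerical sketch of this comment (illustrative names): with the eigenvectors as the columns of $P$, the product $AP$ agrees with $PD$ column by column, since each column reads $Av_i=\lambda_i v_i$; hence $P^{-1}AP=D$.

    # Column-by-column check that AP = PD, i.e. A v_i = lambda_i v_i.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 4
    T = rng.normal(size=(n, n))
    A = T @ np.diag([1.0, 2.0, 3.0, 4.0]) @ np.linalg.inv(T)

    w, P = np.linalg.eig(A)                      # columns of P: eigenvectors v_i
    D = np.diag(w)
    for i in range(n):                           # go column by column
        assert np.allclose(A @ P[:, i], w[i] * P[:, i])
    print(np.allclose(np.linalg.inv(P) @ A @ P, D))   # True: P^{-1} A P = D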










