Applications of basic linear algebra concepts to computer science?












3














I'm trying to explain linear algebra to some programmers with computer science backgrounds. They took a course on it long ago but don't seem to remember much. They can follow basic formalism, but none of it seems to stick when they don't see anything in "real life" to relate the concepts to. Does anyone know good examples of applications in computer science that use basic linear algebra concepts? Some things I was thinking of include:




  • Eigenvalues and eigenspaces

  • Image and kernel

  • Determinants

  • Dual space

  • Transpose

  • Inner products

  • Singular value decomposition










linear-algebra computer-science mathematics-education

asked 8 hours ago by Kim

  • (+5) It seems to me that this is a question you should ask computer scientists. – Najib Idrissi, 8 hours ago

  • pi.math.cornell.edu/~mec/Winter2009/RalucaRemus/Lecture3/… – Liviu Nicolaescu, 8 hours ago

  • math.boisestate.edu/~wright/courses/m297/google_talk.pdf – Liviu Nicolaescu, 8 hours ago
















5 Answers

4













For eigenvalues and eigenvectors, Google's original PageRank algorithm is a good example. The idea is to build a matrix $A$ where $A_{ij}$ is the probability that you would end up at page $i$ by following a randomly chosen hyperlink on page $j$. An eigenvector with eigenvalue $1$ of this matrix is a stable probability distribution over web pages. Generally a page which is linked to by many sites which are themselves linked to by many sites will have a high page rank.
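As a minimal sketch of the idea (the 3-page link matrix below is made up, and numpy is assumed), power iteration converges to the eigenvalue-$1$ eigenvector:

```python
import numpy as np

# Column-stochastic link matrix of a hypothetical 3-page web:
# A[i, j] = probability of moving from page j to page i.
A = np.array([[0.0, 0.5, 0.3],
              [0.5, 0.0, 0.7],
              [0.5, 0.5, 0.0]])

# Power iteration: repeatedly applying A converges to the
# eigenvector with eigenvalue 1, the stationary distribution.
r = np.ones(3) / 3
for _ in range(100):
    r = A @ r
print(r)  # the PageRank scores, summing to 1
```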



Singular value decomposition is widely used for image processing. You can represent a black-and-white image by the matrix of values for its pixels. The $k$ largest singular values represent the most significant contributions to the image. You can then compress an image by replacing $A = UDV^T$ with $U'D'(V')^T$, where $D'$ is the $k \times k$ submatrix of $D$ containing the largest singular values, $U'$ is the $n \times k$ submatrix of $U$ consisting of the corresponding $k$ columns of $U$, and $(V')^T$ is the submatrix of $V^T$ consisting of the corresponding $k$ rows of $V^T$.
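For instance, a minimal numpy sketch of this compression (the random matrix stands in for real pixel data):

```python
import numpy as np

def compress(A, k):
    """Rank-k approximation of an image matrix via truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Keep the k largest singular values, the corresponding k
    # columns of U, and the corresponding k rows of V^T.
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

img = np.random.rand(64, 64)          # stand-in for a grayscale image
approx = compress(img, 10)
print(np.linalg.norm(img - approx))   # reconstruction error
```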



Computer graphics is another area where linear algebra plays a huge role. Consider the problem of finding the shadow of a triangle on some plane, cast by a light source infinitely far away. This is just a projection map.






answered 8 hours ago by James

  • Yes, and more generally the linear algebra that arises from graph theory, via the adjacency matrix, graph Laplacian, etc. – Somatic Custard, 42 mins ago



















1













The whole field of linear codes is a huge application of linear algebra (albeit over finite fields rather than $\mathbb R$ or $\mathbb C$).
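As a small illustration (assuming numpy; the generator matrix below is one standard choice for the Hamming(7,4) code), encoding is just matrix multiplication modulo 2:

```python
import numpy as np

# A generator matrix of the Hamming(7,4) code over GF(2).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

msg = np.array([1, 0, 1, 1])   # 4 data bits
codeword = msg @ G % 2         # 7-bit codeword with parity bits
print(codeword)
```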






answered 8 hours ago by Robert Israel

1













It's a standard prerequisite in machine learning. For example, dimensionality reduction techniques such as PCA and Johnson-Lindenstrauss require a basic understanding of projections (and the former relies on the spectral theorem).
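A minimal sketch of PCA as a projection via the SVD, on synthetic data (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))   # synthetic data: 200 points in R^5

# PCA: center the data, take the top-k right singular vectors,
# and project onto the subspace they span.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = Xc @ Vt[:k].T               # coordinates in the top-2 principal subspace
print(Z.shape)                  # (200, 2)
```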






answered 2 hours ago by Aryeh Kontorovich

1













Here are a few further suggestions:



(1) The intersection of planes and lines (relevant in computer graphics) leads to linear equations and thus to kernels of linear maps.



(2) The area of parallelograms (and thus of triangles) can be computed in terms of determinants. That's maybe a bit too simple because you only need very small matrices, but on the other hand many surfaces in computer graphics are constructed from triangles.



If you compute the area of triangles in three-dimensional space, you'll also need to multiply $3 \times 2$ matrices with their transposes.
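For example, a minimal numpy sketch with hypothetical vertex coordinates:

```python
import numpy as np

# A hypothetical triangle in R^3.
p0 = np.array([0.0, 0.0, 0.0])
p1 = np.array([1.0, 0.0, 0.0])
p2 = np.array([0.0, 2.0, 1.0])

# Edge vectors as the columns of a 3x2 matrix M; the Gram
# determinant det(M^T M) is the squared parallelogram area.
M = np.column_stack([p1 - p0, p2 - p0])
area = 0.5 * np.sqrt(np.linalg.det(M.T @ M))
print(area)
```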



(3) A simple way to represent a directed graph on a computer is to store its adjacency matrix $A$ as a two-dimensional array. Transposing the matrix reverses the direction of all edges.



Moreover, the entries of the $k$-th power $A^k$ tell you how many paths of length $k$ exist between any two vertices of the graph.
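A small numpy illustration on a made-up 3-vertex digraph:

```python
import numpy as np

# Adjacency matrix of a hypothetical directed graph:
# A[i, j] = 1 iff there is an edge from vertex i to vertex j.
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]])

print(A.T)                       # every edge reversed
A3 = np.linalg.matrix_power(A, 3)
print(A3[0, 2])                  # number of length-3 paths from vertex 0 to 2
```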



(4) The pseudo-inverse of a matrix (which is closely related to the singular value decomposition) is very important for linear regression and least-squares fitting.
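A minimal sketch on synthetic data, assuming numpy:

```python
import numpy as np

# Noisy synthetic data for the line y = 2x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=50)

# Design matrix [x, 1]; the pseudo-inverse yields the
# least-squares estimates of slope and intercept.
X = np.column_stack([x, np.ones_like(x)])
a, b = np.linalg.pinv(X) @ y
print(a, b)   # close to 2.0 and 1.0
```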



(5) Consider a least-squares problem where you have, in addition, a linear restriction on the fitting parameters (it would probably help to present this situation with a concrete example). This means you restrict your parameter space to the kernel of a linear map. Now use, for instance, Gaussian elimination to represent that kernel as the image of another linear map. This yields a coordinate transform that expresses your fitting problem without any restriction, so you can simply solve it with the pseudo-inverse approach.



(6) Again an application from computer graphics: have your students build some nice three-dimensional geometric object (for instance, consisting of many triangles) and let them make it "visible" by projecting the object to the screen. (This is going to be a nice exercise in computing intersections of lines and planes because they need to determine which triangles are in front of the others.)



Now, you want to rotate the object, so your students will need to compute rotation matrices and apply them to the coordinates of the triangles. This is a nice illustration of orthogonal matrices and of the action of matrices as linear maps.
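A minimal numpy sketch of the rotation step (the triangle coordinates are hypothetical):

```python
import numpy as np

def rotation_z(theta):
    """Orthogonal matrix rotating R^3 about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Triangle with one vertex per column; rotating the whole object
# is a single matrix product applied to all coordinates at once.
T = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
R = rotation_z(np.pi / 4)
print(R @ T)      # rotated vertices
print(R.T @ R)    # identity matrix: R is orthogonal
```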



(7) An illustration (though, admittedly, not an application) of eigenvalues and eigenspaces: have your students draw an ellipse with axes parallel to the coordinate axes in $\mathbb{R}^2$, then have them rotate the ellipse by applying suitable orthogonal matrices.



Afterwards, give them a quadratic equation in two variables, and make them plot the solution set (which should be chosen by you to agree with the rotated ellipse from above). Then have them diagonalize the symmetric matrix representing the corresponding quadratic form and compare the result with the rotation matrix above.
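A minimal numpy sketch of the diagonalization step, with a made-up quadratic form:

```python
import numpy as np

# Symmetric matrix of the quadratic form q(x, y) = 3x^2 + 2xy + 3y^2,
# whose level sets are rotated ellipses.
S = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# eigh returns the eigenvalues and an orthogonal eigenvector
# matrix: the rotation aligning the ellipse with the axes.
eigenvalues, Q = np.linalg.eigh(S)
print(eigenvalues)     # 2 and 4: the coefficients in the rotated frame
print(Q.T @ S @ Q)     # diagonal, up to rounding
```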



(8) Another example from image processing (in addition to James' example of image compression): many image "improvement" techniques are based on applying filters to the image, which is simply an application of a linear map.



(9) The discrete Fourier transform, which can be represented by a unitary matrix, is frequently used in image processing.
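A quick numpy check that the normalized DFT matrix is unitary:

```python
import numpy as np

n = 8
# Build the DFT matrix by transforming the identity, then
# normalize by 1/sqrt(n) to make it unitary.
F = np.fft.fft(np.eye(n)) / np.sqrt(n)
print(np.allclose(F.conj().T @ F, np.eye(n)))  # True
```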



(10) In case you wish to go into numerical analysis: discretization methods for linear partial differential equations usually yield finite-dimensional linear systems which have to be solved numerically.
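For instance, a minimal sketch for the one-dimensional Poisson problem $-u'' = f$ with zero boundary values (numpy assumed):

```python
import numpy as np

# Finite-difference discretization of -u''(x) = f(x) on (0, 1)
# with u(0) = u(1) = 0 gives a tridiagonal linear system.
n = 99
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

f = np.pi**2 * np.sin(np.pi * x)   # exact solution: u(x) = sin(pi x)
u = np.linalg.solve(A, f)
print(np.max(np.abs(u - np.sin(np.pi * x))))   # small discretization error
```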






edited 2 hours ago; answered 2 hours ago by Jochen Glueck

0













Numerical linear algebra is a field of computer science.






answered 7 hours ago by Ben McKay