"how numbers are stored and used in computers"
Linear algebra is the study of structure and change through linear transformations. One of its earliest techniques appeared in ancient Chinese mathematics and is known today as Gaussian elimination - a step-by-step method for solving systems of linear equations by reducing them to a simpler triangular (row echelon) form.
Try changing the system of equations below to see the result of the elimination process.
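If you prefer to read the algorithm as code, here is a minimal sketch of Gaussian elimination in Python with NumPy. The function name, the partial-pivoting choice, and the example system are illustrative assumptions; the interactive demo is not necessarily implemented this way.

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve Ax = b by forward elimination followed by back substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)

    # Forward elimination: zero out the entries below the pivot in each column.
    for i in range(n):
        # Partial pivoting: bring up the row with the largest pivot for stability.
        p = i + np.argmax(np.abs(A[i:, i]))
        A[[i, p]], b[[i, p]] = A[[p, i]], b[[p, i]]
        for j in range(i + 1, n):
            factor = A[j, i] / A[i, i]
            A[j, i:] -= factor * A[i, i:]
            b[j] -= factor * b[i]

    # Back substitution: solve the triangular system from the last row up.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Example system: 2x + y = 5, x + 3y = 10  ->  x = 1, y = 3
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])
print(gaussian_elimination(A, b))  # [1. 3.]
```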
These guides to linear algebra use interactive visualizations, similar to the one above, to provide a more intuitive explanation of the underlying concepts.
Vectors are sequences of numbers that often represent a magnitude and direction in a coordinate system.
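As a concrete (and purely illustrative) example, the magnitude and direction of a two-dimensional vector can be read off with NumPy:

```python
import numpy as np

v = np.array([3.0, 4.0])                         # a 2-D vector

magnitude = np.linalg.norm(v)                    # its length: 5.0
direction = np.degrees(np.arctan2(v[1], v[0]))   # angle from the x-axis: about 53.13 degrees
unit = v / magnitude                             # unit vector in the same direction: [0.6, 0.8]

print(magnitude, direction, unit)
```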
Matrices are rectangular arrays of numbers that are often used to represent linear transformations.
Linear transformations are functions that map vectors to other vectors while preserving vector addition and scalar multiplication; they are often represented as matrices.
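For instance, a 2x2 rotation matrix is a linear transformation, and multiplying it with a vector carries out that transformation. The 90-degree rotation below is just one illustrative choice:

```python
import numpy as np

# A rotation by 90 degrees counter-clockwise, written as a matrix.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])   # a vector pointing along the x-axis
w = R @ v                  # applying the transformation is a matrix-vector product

print(w)  # approximately [0. 1.] -- the vector now points along the y-axis
```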
Eigenvalues and eigenvectors are important concepts in linear algebra. An eigenvector of a linear transformation is a nonzero vector whose direction is unchanged by the transformation: applying the transformation only scales it. The factor by which it is scaled is the corresponding eigenvalue, so that Av = λv.
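A quick NumPy check of the defining property Av = λv; the symmetric matrix below is an arbitrary example, not one used elsewhere in these guides:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # an example matrix with eigenvalues 3 and 1

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` is an eigenvector; verify A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ v, lam * v))   # prints each eigenvalue and True
```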