Eigenvector

In Linear Algebra, an Eigenvector is a nonzero Coordinate Vector for which applying the given Linear Transformation has the same effect as multiplying that vector by a scalar. In other words, eigenvectors are the vectors whose output under the transformation is proportional to the input, i.e. the output lies on the same line through the origin as the input.

  • Let $A$ be a Square Matrix that represents a linear transformation.
  • Let $\vec{v}$ be some Coordinate Vector that we need to solve for.
  • Let $\lambda$ be some Scalar that we need to solve for.
  • By solving the equation $A\vec{v} = \lambda\vec{v}$, we find the eigenstuff (see the sketch after this list)
  • Eigenvalues are the scalars $\lambda$ that solve the equation (they can be zero, or even complex)
  • Eigenvectors are the nonzero coordinate vectors $\vec{v}$ that solve the equation
  • Eigenspaces are the spaces that extend from the eigenvectors (the spans of the eigenvectors)
  • Each eigenvector has a corresponding eigenvalue, i.e. each eigenvector has a specific factor by which the linear transformation scales it
  • The following GIF shows a linear transformation scaling 2D space. The red lines are the eigenspaces of the transformation. Any coordinate vector in those spaces is an eigenvector.
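
A minimal sketch of the defining equation $A\vec{v} = \lambda\vec{v}$ in NumPy; the matrix here is an arbitrary example, not the one from the GIF:

```python
import numpy as np

# An arbitrary example matrix (any square matrix works here).
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose COLUMNS
# are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Applying the transformation has the same effect as scaling by lambda:
    assert np.allclose(A @ v, lam * v)
    print(lam, v)  # each eigenvalue with one of its eigenvectors
```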

Eigenvalue

$\lambda$ is an eigenvalue for the transformation if $A\vec{v} = \lambda\vec{v}$ holds for some nonzero vector $\vec{v}$.

  • Solve for $\lambda$ in $\det(A - \lambda I) = 0$, which yields the Characteristic Equation for this system; e.g. in a 2D system it is $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A) = 0$ (checked numerically, along with a few other facts below, in the sketch after this list).
  • $\lambda > 0$: the eigenvector and its image point in the same direction
  • $\lambda < 0$: the eigenvector and its image point in opposite directions
  • $\lambda$ can be complex even if nothing else in the equation is
  • Eigenvalues cannot be determined from the row-reduced version of a matrix
    • i.e. row reductions change the eigenvalues of a matrix
  • The diagonal elements of a triangular matrix are its eigenvalues.
  • $A$ is invertible iff 0 is not an eigenvalue of $A$.
  • Stochastic matrices have an eigenvalue equal to 1.
  • If $\vec{v}_1, \ldots, \vec{v}_n$ are eigenvectors that correspond to distinct eigenvalues, then $\{\vec{v}_1, \ldots, \vec{v}_n\}$ is linearly independent
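
A short NumPy check of a few of the facts above; all three matrices are arbitrary examples:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# 2D characteristic equation: lambda^2 - tr(A)*lambda + det(A) = 0.
# Its roots are exactly the eigenvalues of A.
roots = np.roots([1.0, -np.trace(A), np.linalg.det(A)])
print(sorted(roots), sorted(np.linalg.eig(A)[0]))  # both give [1.0, 3.0]

# The diagonal elements of a triangular matrix are its eigenvalues:
T = np.array([[4.0, 5.0],
              [0.0, 7.0]])
print(sorted(np.linalg.eig(T)[0]))  # [4.0, 7.0], the diagonal

# A (column-)stochastic matrix has an eigenvalue equal to 1:
S = np.array([[0.9, 0.2],
              [0.1, 0.8]])  # each column sums to 1
print(np.isclose(np.linalg.eig(S)[0], 1.0).any())  # True
```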

Defective

An eigenvalue is defective if and only if it does not have a complete Set of Linearly Independent eigenvectors, i.e. its geometric multiplicity (the dimension of its eigenspace) is less than its algebraic multiplicity (its multiplicity as a root of the characteristic equation). See the sketch below.
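
A sketch of a defective eigenvalue in NumPy, using a shear matrix as the example:

```python
import numpy as np

# A classic defective example: a shear. The eigenvalue 2 is a double
# root of the characteristic equation (algebraic multiplicity 2)...
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
print(np.linalg.eig(A)[0])  # [2. 2.]

# ...but (A - 2I) has rank 1, so its null space (the eigenspace for
# lambda = 2) is only 1-dimensional: one independent eigenvector, not two.
geometric_multiplicity = 2 - np.linalg.matrix_rank(A - 2.0 * np.eye(2))
print(geometric_multiplicity)  # 1 < 2, so the eigenvalue 2 is defective
```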

Neutral Eigenvalue

Eigenspace

  • the span of the eigenvectors that correspond to a particular eigenvalue, i.e. the null space of $A - \lambda I$ (see the sketch below)
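
A sketch of computing an eigenspace basis numerically, using the SVD-based null-space trick; the matrix and eigenvalue are carried over from the first example:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])  # same example matrix as the first sketch
lam = 2.0                   # one of its eigenvalues

# The eigenspace for lambda is the null space of (A - lambda*I).
# The rows of vh from the SVD whose singular values are (near) zero
# form an orthonormal basis for that null space.
_, s, vh = np.linalg.svd(A - lam * np.eye(2))
basis = vh[np.isclose(s, 0.0)]
print(basis)  # one basis vector: this eigenspace is a line through the origin
```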