Eigenvalues and eigenvectors are key concepts in linear algebra. They help us understand how matrices transform vectors and solve systems of linear equations. These tools are crucial for analyzing linear transformations and their effects on vector spaces.

In this part, we'll learn how to find eigenvalues and eigenvectors, and use them to diagonalize matrices. We'll also explore their applications in solving differential equations and analyzing system stability. These ideas connect linear algebra to other math and science fields.

Eigenvalues and eigenvectors of matrices

Definition and properties

  • An eigenvector of a square matrix A is a nonzero vector x such that Ax = λx for some scalar λ
    • The scalar λ is called the eigenvalue corresponding to the eigenvector x
    • Together, the eigenvalue and eigenvector form an eigenpair (λ, x)
  • The set of all eigenvectors corresponding to a particular eigenvalue, together with the zero vector, forms an eigenspace
    • The dimension of the eigenspace is called the geometric multiplicity of the eigenvalue
  • A matrix can have repeated eigenvalues, meaning the same eigenvalue appears more than once as a root of the characteristic polynomial
    • The number of times an eigenvalue appears as a root of the characteristic polynomial is called its algebraic multiplicity
  • Not all square matrices have a full set of linearly independent eigenvectors
    • Matrices that do are called diagonalizable
    • A matrix is diagonalizable if and only if the sum of the dimensions of its eigenspaces equals the size of the matrix (a quick numerical check of the definition follows this list)
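
As promised above, here is a minimal sketch that checks the defining equation Ax = λx numerically (NumPy assumed; the matrix A is just an illustrative example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]
    # Ax and λx should agree up to floating-point error
    print(np.allclose(A @ x, lam * x))  # True for each eigenpair
```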

Geometric interpretation

  • Geometrically, an eigenvector is a vector that, when the linear transformation represented by the matrix is applied, remains parallel to its original direction
    • The eigenvalue represents the factor by which the eigenvector is scaled under the transformation
  • In a 2D plane, the eigenvectors of a matrix are the axes along which the matrix performs a simple scaling operation
    • These axes are not necessarily orthogonal (perpendicular) to each other unless the matrix is symmetric
  • In higher dimensions, eigenvectors can be thought of as defining a new coordinate system in which the matrix takes on a simpler form (diagonalization)
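
The orthogonality remark above can be verified numerically. A minimal sketch (NumPy assumed; both matrices are illustrative): eigenvectors of a symmetric matrix come out perpendicular, while those of a non-symmetric matrix generally do not.

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # not symmetric

for name, A in (("symmetric", S), ("non-symmetric", M)):
    _, V = np.linalg.eig(A)
    # dot product of the two eigenvectors: 0 means perpendicular axes
    print(name, float(V[:, 0] @ V[:, 1]))
# symmetric -> 0.0, non-symmetric -> nonzero (about ±0.707 here)
```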

Computing eigenvalues and eigenvectors

Characteristic equation

  • To find the eigenvalues of an n x n matrix A, solve the characteristic equation det(A - λI) = 0, where I is the n x n identity matrix
    • Expand the determinant to obtain a polynomial in λ, called the characteristic polynomial
    • The degree of the characteristic polynomial equals the size of the matrix
  • The solutions to the characteristic equation (the roots of the characteristic polynomial) are the eigenvalues of the matrix A
    • The eigenvalues can be real or complex numbers
    • The sum of the eigenvalues (counted with algebraic multiplicity) equals the trace of the matrix (the sum of its diagonal elements)
    • The product of the eigenvalues (counted with algebraic multiplicity) equals the determinant of the matrix
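
Both identities can be confirmed numerically. A minimal sketch (NumPy assumed; np.poly returns the coefficients of the characteristic polynomial of a square matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

coeffs = np.poly(A)              # [1, -7, 10], i.e. λ^2 - 7λ + 10
eigenvalues = np.roots(coeffs)   # roots of the characteristic polynomial: 5 and 2

print(np.isclose(eigenvalues.sum(), np.trace(A)))        # sum equals trace
print(np.isclose(eigenvalues.prod(), np.linalg.det(A)))  # product equals determinant
```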

Finding eigenvectors

  • For each eigenvalue λ, solve the equation (A - λI)x = 0 to find the corresponding eigenvectors
    • This equation represents a homogeneous system of linear equations
    • The solution set of this system is the eigenspace corresponding to the eigenvalue λ
  • Eigenvectors are not unique; if x is an eigenvector, then any nonzero scalar multiple of x is also an eigenvector for the same eigenvalue
    • To find a basis for the eigenspace, solve the system (A - λI)x = 0 and express the solution in terms of free variables
  • Normalize eigenvectors by dividing them by their magnitude to obtain unit eigenvectors, which have a magnitude of 1
    • Unit eigenvectors are unique up to a sign change (multiplication by -1)
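
A minimal sketch of this procedure (NumPy and SciPy assumed; scipy.linalg.null_space returns an orthonormal basis, so its columns are already unit eigenvectors):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0  # one eigenvalue of A

# The eigenspace for lam is the null space of (A - lam*I)
E = null_space(A - lam * np.eye(2))

print(E.shape[1])                                # 1: the eigenspace is one-dimensional
print(np.allclose(A @ E[:, 0], lam * E[:, 0]))   # True: each column is an eigenvector
```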

Matrix diagonalization using eigenvalues

Diagonalizability conditions

  • A square matrix A is diagonalizable if and only if it has n linearly independent eigenvectors, where n is the size of the matrix
    • Equivalently, A is diagonalizable if the sum of the dimensions of its eigenspaces equals n
  • If A is diagonalizable, then A = PDP^(-1), where D is a diagonal matrix with the eigenvalues of A on its main diagonal, and P is a matrix whose columns are the corresponding eigenvectors of A
    • The eigenvectors in P must be linearly independent
    • If A is symmetric, its eigenvectors can be chosen orthonormal, and then P is an orthogonal matrix (P^(-1) = P^T)
  • The process of finding matrices P and D is called diagonalization or eigendecomposition

Diagonalization procedure

  1. Find the eigenvalues of the matrix A by solving the characteristic equation det(A - λI) = 0
  2. For each distinct eigenvalue λ_i, find a basis for the corresponding eigenspace by solving (A - λ_iI)x = 0
  3. Construct the matrix P by arranging the eigenvectors (or their multiples) as columns
    • Ensure that the eigenvectors in P are linearly independent
    • If A is symmetric, the eigenvectors can additionally be normalized to make P an orthogonal matrix
  4. Construct the diagonal matrix D by placing the eigenvalues λ_i on the main diagonal, in the same order as their corresponding eigenvectors in P
  5. Verify that A = PDP^(-1) (or A = PDP^T if P is orthogonal)
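
The whole procedure in a few lines, as a sketch (NumPy assumed; A is illustrative). Diagonalization also makes matrix powers cheap, since A^k = PD^kP^(-1):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, P = np.linalg.eig(A)   # steps 1-3: eigenvalues, eigenvector columns
D = np.diag(eigenvalues)            # step 4: eigenvalues on the diagonal, same order

# step 5: verify A = P D P^(-1)
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True

# application: A^10 via the eigendecomposition
A10 = P @ np.diag(eigenvalues**10) @ np.linalg.inv(P)
print(np.allclose(A10, np.linalg.matrix_power(A, 10)))  # True
```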

Repeated eigenvalues

  • If a matrix has repeated eigenvalues, it is diagonalizable if and only if it has a full set of linearly independent eigenvectors (i.e., the geometric multiplicity equals the algebraic multiplicity for each eigenvalue)
    • If the geometric multiplicity is less than the algebraic multiplicity for any eigenvalue, the matrix is not diagonalizable
  • When an eigenvalue is repeated, several of the eigenvectors in P must come from the same eigenspace
    • Choose a basis of that eigenspace so that the columns of P remain linearly independent (a failure case is sketched below)
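
A classic failure case is a Jordan-type block: the eigenvalue 1 has algebraic multiplicity 2 but geometric multiplicity 1, so there is no second independent eigenvector and the matrix is not diagonalizable. A sketch (NumPy and SciPy assumed):

```python
import numpy as np
from scipy.linalg import null_space

J = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # repeated eigenvalue 1, algebraic multiplicity 2

# geometric multiplicity = dimension of the eigenspace for eigenvalue 1
print(null_space(J - 1.0 * np.eye(2)).shape[1])   # 1 -> not diagonalizable

# by contrast, the identity has the same eigenvalues but a 2D eigenspace
print(null_space(np.eye(2) - 1.0 * np.eye(2)).shape[1])   # 2 -> diagonalizable
```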

Eigenvalues and eigenvectors for differential equations

First-order linear systems

  • A system of n linear first-order differential equations can be written in the form x' = Ax, where x is a vector of functions and A is an n x n matrix of constants
    • The vector x represents the state of the system, and the matrix A describes how the state evolves over time
  • If A is diagonalizable, the system can be solved using the eigenvalues and eigenvectors of A
    • The eigenvalues of A determine the stability and long-term behavior of the system
    • The eigenvectors of A determine the principal directions or modes of the system's evolution

Diagonalization method

  1. Diagonalize the matrix A to obtain A = PDP^(-1), where D is a diagonal matrix of eigenvalues and P is a matrix of corresponding eigenvectors
  2. Substitute A = PDP^(-1) into the original equation to get x' = PDP^(-1)x
  3. Define a new vector y = P^(-1)x, so that x = Py
    • The vector y represents the state of the system in the basis of eigenvectors
  4. Rewrite the system in terms of y to obtain y' = Dy, which is a system of n independent linear differential equations
    • Each equation in the system has the form y_i' = λ_i y_i, where λ_i is an eigenvalue from the diagonal of D
  5. Solve each equation in the system y' = Dy separately, using the eigenvalues from the diagonal of D
    • The solution to y_i' = λ_i y_i is y_i(t) = c_i e^(λ_i t), where c_i is a constant determined by initial conditions
  6. Reconstruct the solution to the original system by substituting the solutions for y back into x = Py
    • The final solution will be a linear combination of exponential functions, with coefficients determined by the initial conditions and the eigenvectors
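
A compact sketch of the whole method (NumPy assumed; the matrix and initial condition are illustrative), computing x(t) = P e^(Dt) P^(-1) x(0):

```python
import numpy as np

A = np.array([[-2.0, 1.0],
              [ 1.0, -2.0]])
x0 = np.array([1.0, 0.0])    # initial condition x(0)

eigenvalues, P = np.linalg.eig(A)   # step 1: diagonalize A
c = np.linalg.solve(P, x0)          # steps 3 and 5: constants c = y(0) = P^(-1) x0

def x(t):
    # step 6: each mode evolves independently as c_i e^(λ_i t), then map back via P
    return P @ (c * np.exp(eigenvalues * t))

print(x(0.0))   # ~[1, 0], recovers the initial condition
print(x(5.0))   # decays toward zero, since the eigenvalues are -1 and -3
```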

Stability analysis

  • The stability of the system x' = Ax can be determined by examining the eigenvalues of A
    • If all eigenvalues have negative real parts, the system is asymptotically stable (solutions converge to zero as t → ∞)
    • If any eigenvalue has a positive real part, the system is unstable (solutions grow without bound as t → ∞)
    • If all eigenvalues have non-positive real parts, at least one has a zero real part, and each eigenvalue with zero real part has a full set of eigenvectors, the system is marginally stable (solutions remain bounded but may not converge to zero)
  • The eigenvectors corresponding to the eigenvalues with zero real parts determine the long-term behavior of the system
    • These eigenvectors form a basis for the center subspace, which contains all solutions that remain bounded but do not necessarily converge to zero
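
These rules translate into a short classifier. A sketch (NumPy assumed; the tolerance is an illustrative choice, and the zero-real-part branch is optimistic, since marginal stability also requires a full set of eigenvectors for those eigenvalues):

```python
import numpy as np

def classify_stability(A, tol=1e-10):
    """Classify x' = Ax by the real parts of the eigenvalues of A."""
    re = np.linalg.eigvals(A).real
    if np.all(re < -tol):
        return "asymptotically stable"
    if np.any(re > tol):
        return "unstable"
    return "marginally stable (if zero-real-part eigenvalues are non-defective)"

print(classify_stability(np.array([[-1.0, 0.0], [0.0, -2.0]])))  # asymptotically stable
print(classify_stability(np.array([[ 0.0, 1.0], [-1.0, 0.0]])))  # marginally stable
print(classify_stability(np.array([[ 1.0, 0.0], [0.0, -1.0]])))  # unstable
```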

Key Terms to Review (18)

Algebraic multiplicity: Algebraic multiplicity refers to the number of times a specific eigenvalue appears as a root of the characteristic polynomial of a matrix. This concept is important because it helps to determine the behavior of the corresponding eigenvectors and the overall structure of the matrix in terms of its diagonalizability and stability. Understanding algebraic multiplicity allows for deeper insights into how matrices transform space and their potential applications in various fields.
Characteristic Polynomial: The characteristic polynomial is a polynomial associated with a square matrix that encapsulates the eigenvalues of that matrix. It is derived from the determinant of the matrix subtracted by a scalar multiple of the identity matrix, and its roots are precisely the eigenvalues. This polynomial plays a crucial role in determining the behavior of linear transformations represented by the matrix.
Determinant: The determinant is a scalar value that can be computed from the elements of a square matrix and provides important information about the matrix, such as whether it is invertible and its scaling factor in linear transformations. In the context of linear transformations, the determinant indicates how much a transformation scales area or volume, while in eigenvalues and eigenvectors, it helps determine the nature of solutions to linear equations represented by matrices.
Diagonal matrix: A diagonal matrix is a special type of square matrix where all the entries outside the main diagonal are zero, meaning that only the diagonal elements can be non-zero. This characteristic makes diagonal matrices particularly significant in linear algebra, especially in relation to eigenvalues and eigenvectors, as they simplify many operations and computations.
Diagonalizable: A matrix is called diagonalizable if it can be expressed in the form of a diagonal matrix through a similarity transformation, meaning there exists an invertible matrix such that when it is multiplied by the diagonal matrix and its inverse, it results in the original matrix. This property is closely tied to the concepts of eigenvalues and eigenvectors, as diagonalization simplifies many operations involving matrices, particularly when raising them to powers or solving systems of linear equations.
Eigenspace: An eigenspace is a subspace associated with a particular eigenvalue of a linear transformation or matrix. It consists of all eigenvectors corresponding to that eigenvalue, along with the zero vector. The concept of eigenspaces is crucial as they provide insight into the geometric interpretation of eigenvalues and eigenvectors, revealing how transformations affect different directions in vector spaces.
Eigenvalue: An eigenvalue is a scalar associated with a linear transformation represented by a matrix, indicating how much the corresponding eigenvector is stretched or compressed during that transformation. When a matrix acts on its eigenvector, the output is simply the eigenvector scaled by its eigenvalue. This concept is crucial in understanding various properties of linear transformations and plays a vital role in applications like stability analysis, quantum mechanics, and facial recognition.
Eigenvector: An eigenvector is a non-zero vector that changes at most by a scalar factor when a linear transformation is applied to it. In the context of linear algebra, these vectors are fundamental as they help determine the behavior of a matrix when it's transformed, providing insight into its properties and characteristics, particularly related to eigenvalues.
Geometric multiplicity: Geometric multiplicity refers to the number of linearly independent eigenvectors associated with a given eigenvalue of a matrix. It provides insight into the structure of the eigenspace related to that eigenvalue, indicating how many distinct directions in which a transformation can stretch or compress vectors. This concept is crucial for understanding the behavior of matrices and their transformations, particularly in relation to diagonalization and stability analysis.
Linear transformation: A linear transformation is a function between two vector spaces that preserves the operations of vector addition and scalar multiplication. This means that if you take any two vectors and apply the transformation, the result will be the same as if you transformed each vector separately and then added them together. Linear transformations can be represented using matrices, which helps in understanding their properties and effects on vectors.
Matrix diagonalization: Matrix diagonalization is the process of converting a square matrix into a diagonal form, where all off-diagonal elements are zero. This transformation simplifies many matrix operations, especially when raising matrices to powers or solving systems of differential equations. Diagonalization relies on the existence of eigenvalues and eigenvectors, as these components allow us to represent the original matrix in a more manageable form.
Power method: The power method is an iterative algorithm used to find the dominant eigenvalue and its corresponding eigenvector of a matrix. It works by repeatedly multiplying a random vector by the matrix and normalizing it, leading to convergence toward the eigenvector associated with the largest eigenvalue. This method is particularly useful when dealing with large matrices where calculating eigenvalues directly can be computationally expensive.
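
A bare-bones version of the iteration just described, as a sketch (NumPy assumed; the iteration count and the Rayleigh-quotient eigenvalue estimate are illustrative choices):

```python
import numpy as np

def power_method(A, num_iters=1000, seed=0):
    """Approximate the dominant eigenvalue and a unit eigenvector of A."""
    x = np.random.default_rng(seed).standard_normal(A.shape[0])
    for _ in range(num_iters):
        x = A @ x
        x = x / np.linalg.norm(x)   # normalize each step to avoid overflow
    lam = x @ A @ x                 # Rayleigh quotient of the unit vector x
    return lam, x

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, x = power_method(A)
print(round(float(lam), 6))   # ~5.0, the dominant eigenvalue of A
```
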
Principal Component Analysis: Principal Component Analysis (PCA) is a statistical technique used to reduce the dimensionality of a dataset while preserving as much variance as possible. It transforms the original variables into a new set of uncorrelated variables, known as principal components, which are ordered by the amount of variance they capture. PCA is closely linked to eigenvalues and eigenvectors, as the principal components are derived from the eigenvectors of the covariance matrix of the data.
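
A minimal PCA sketch following this description (NumPy assumed; the synthetic data set is illustrative): center the data, eigendecompose its covariance matrix, and order components by the variance they capture.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0],
                                              [1.0, 0.5]])   # correlated 2D data

Xc = X - X.mean(axis=0)                    # center the data
C = np.cov(Xc, rowvar=False)               # covariance matrix (symmetric)
variances, components = np.linalg.eigh(C)  # eigh, since C is symmetric

order = np.argsort(variances)[::-1]        # sort by variance captured, descending
components = components[:, order]
scores = Xc @ components                   # data expressed along the principal axes
print(variances[order])                    # variance along each principal component
```
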
QR Algorithm: The QR Algorithm is a numerical method used to compute the eigenvalues and eigenvectors of a matrix. By decomposing a matrix into an orthogonal matrix Q and an upper triangular matrix R, the algorithm iteratively improves approximations of eigenvalues, converging to accurate results. This method is particularly significant in linear algebra for its efficiency in handling large matrices and its applications in various fields such as engineering and data analysis.
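
A simplified, unshifted version of the QR iteration, as a sketch (NumPy assumed; production implementations add Hessenberg reduction and shifts for speed and reliability):

```python
import numpy as np

def qr_algorithm(A, num_iters=200):
    """Unshifted QR iteration: Ak stays similar to A, so eigenvalues are
    preserved, and for many matrices Ak converges toward triangular form
    with the eigenvalues on its diagonal."""
    Ak = A.astype(float).copy()
    for _ in range(num_iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
    return np.diag(Ak)

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
print(np.sort(qr_algorithm(A)))             # ~[2, 5]
print(np.sort(np.linalg.eigvals(A).real))   # matches
```
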
Stability analysis: Stability analysis is a mathematical approach used to determine the stability of equilibrium points in dynamical systems. It assesses whether small disturbances from an equilibrium state will grow or diminish over time, helping to predict the long-term behavior of a system. This concept is vital in various fields, especially in understanding how systems respond to changes and the implications of those responses.
Symmetric matrix: A symmetric matrix is a square matrix that is equal to its transpose, meaning that the elements are mirrored across the main diagonal. This property leads to several important characteristics, including real eigenvalues and orthogonal eigenvectors, which are especially relevant when analyzing the behavior of linear transformations and solving systems of equations.
Unit eigenvector: A unit eigenvector is an eigenvector that has been normalized to have a length of one. This means that its magnitude is equal to one, making it easier to work with in various mathematical applications, particularly in linear transformations and systems of equations. Unit eigenvectors are crucial because they maintain the direction of the original eigenvector while simplifying calculations and interpretations related to eigenvalues and linear mappings.
λ for eigenvalue: The symbol λ represents an eigenvalue, a scalar that indicates how much a corresponding eigenvector is stretched or compressed during a linear transformation represented by a matrix. Eigenvalues play a crucial role in understanding the behavior of linear transformations, revealing important properties about the matrix such as stability, oscillation modes, and more.