Last updated on July 5th, 2025
When a square matrix acts on a vector, it generally changes both the vector's direction and its length. However, some special vectors, called eigenvectors, change only in length. The factor by which the length changes is the eigenvalue (λ). In other words, an eigenvalue tells us how much an eigenvector is scaled.
An eigenvalue is the scalar by which the eigenvector is scaled. Mathematically, eigenvalues are defined as:
For a square matrix A, a scalar λ, and a non-zero column vector v satisfying the condition
Av = λv
then:
v is an eigenvector of A, and
λ is the corresponding eigenvalue of A.
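The defining condition Av = λv can be checked numerically. The sketch below, which assumes NumPy is available and uses an arbitrary illustrative matrix, computes the eigenvalues and eigenvectors of A and verifies that each pair satisfies the condition:

```python
import numpy as np

# An illustrative symmetric 2x2 matrix (chosen so the eigenvalues are real)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors as columns
eigenvalues, eigenvectors = np.linalg.eig(A)

# For each pair (lambda, v), A @ v must equal lambda * v
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

For this matrix the trace is 4 and the determinant is 3, so the two eigenvalues are 1 and 3.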
A comprehensive understanding of the properties of eigenvalues is fundamental for accurate interpretation of linear transformations and facilitation of matrix operations.
A square matrix of order n can have at most n eigenvalues.
All n eigenvalues of an identity matrix are equal to 1.
For both triangular and diagonal matrices, the eigenvalues are the elements present on the main diagonal.
The sum of the eigenvalues of a matrix is equal to the sum of its diagonal elements.
The product of the eigenvalues of a matrix is equal to its determinant.
Hermitian and symmetric matrices have real eigenvalues.
The eigenvalues of skew-Hermitian and skew-symmetric matrices are restricted to purely imaginary values or zeroes.
A matrix and its transpose have the same eigenvalues.
Consider two square matrices A and B of the same order. Then AB and BA have exactly the same eigenvalues, including multiplicities. (When A is m × n and B is n × m with m ≠ n, only the non-zero eigenvalues of AB and BA agree.)
An orthogonal matrix’s eigenvalues all have an absolute value of 1. They are either real (1 or −1) or occur in complex conjugate pairs.
For any scalar k, the eigenvalues of the matrix kA are obtained by multiplying each eigenvalue of matrix A by k.
If λ is an eigenvalue of matrix A, then λᵏ is an eigenvalue of Aᵏ for every positive integer k (this follows directly from Av = λv, so no diagonalizability assumption is needed).
For an invertible matrix A, if λ is an eigenvalue of A, then 1/λ is an eigenvalue of the inverse matrix A⁻¹.
If λ is a non-zero eigenvalue of A, then |A| / λ is an eigenvalue of the adjoint of A.
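Several of the properties above are easy to verify numerically. The sketch below, which assumes NumPy and uses an arbitrary illustrative symmetric matrix, checks the trace, determinant, and transpose properties:

```python
import numpy as np

# An illustrative symmetric 3x3 matrix (symmetry guarantees real eigenvalues)
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])

eigs = np.linalg.eigvals(A)

# Sum of eigenvalues equals the trace (sum of diagonal elements)
assert np.isclose(eigs.sum(), np.trace(A))

# Product of eigenvalues equals the determinant
assert np.isclose(eigs.prod(), np.linalg.det(A))

# A and its transpose have the same eigenvalues
assert np.allclose(np.sort(eigs), np.sort(np.linalg.eigvals(A.T)))
```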
It is also important to understand the Cayley-Hamilton theorem, which states that “every square matrix satisfies its own characteristic equation.”
For a characteristic polynomial of A:
p(λ) = det(λI − A) = λⁿ + a₁λⁿ⁻¹ + … + aₙ₋₁λ + aₙ
Then the Cayley-Hamilton theorem states:
p(A) = Aⁿ + a₁Aⁿ⁻¹ + … + aₙ₋₁A + aₙI = 0
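The Cayley-Hamilton theorem can be verified directly for a 2 × 2 matrix, whose characteristic polynomial is λ² − tr(A)λ + det(A). The sketch below assumes NumPy and uses an arbitrary illustrative matrix:

```python
import numpy as np

# An illustrative 2x2 matrix
A = np.array([[3.0, 2.0],
              [2.0, 3.0]])

# For a 2x2 matrix, p(lambda) = lambda^2 - tr(A)*lambda + det(A)
tr = np.trace(A)
det = np.linalg.det(A)

# Cayley-Hamilton: substituting A into p gives the zero matrix
I = np.eye(2)
p_of_A = A @ A - tr * A + det * I

assert np.allclose(p_of_A, np.zeros((2, 2)))
```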
As the properties suggest, if λ is an eigenvalue for given square matrix A, then
Av = λv
If identity matrix I and matrix A are of the same order, then:
Av = λ(Iv)    (since v = Iv)
Av − λ(Iv) = 0
Factoring out v on the right, we get:
(A − λI)v = 0
This is a homogeneous system. The existence of v ≠ 0 implies that det(A - λI) = 0. This is the characteristic equation.
Here, det(A - λI) is known as the characteristic polynomial and λ is the eigenvalue.
To find eigenvalues of a square matrix:
Step 1: Consider a square matrix A.
Step 2: Let I be the identity matrix of the same order as A.
Step 3: Subtract λI from A.
Step 4: Find the determinant.
Step 5: Equate determinant = 0 and find the value of λ.
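The steps above can be carried out symbolically. The sketch below assumes SymPy is available and uses an arbitrary illustrative 2 × 2 matrix:

```python
import sympy as sp

lam = sp.symbols('lambda')

# Step 1: an illustrative square matrix A
A = sp.Matrix([[4, 1],
               [2, 3]])

# Step 2: identity matrix of the same order
I = sp.eye(2)

# Step 3: subtract lambda*I from A
M = A - lam * I

# Step 4: find the determinant (the characteristic polynomial)
char_poly = sp.expand(M.det())   # lambda**2 - 7*lambda + 10

# Step 5: set the determinant equal to 0 and solve for lambda
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)
```

For this matrix the characteristic equation λ² − 7λ + 10 = 0 factors as (λ − 2)(λ − 5) = 0, giving eigenvalues 2 and 5.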
Using the steps mentioned above, let's solve an example:
Let's take the matrix:
A =
[ 3  2 ]
[ 2  3 ]
Let λ represent the eigenvalues.
Identity matrix I:
[ 1  0 ]
[ 0  1 ]
Finding the determinant:
|A − λI| = (3 − λ)(3 − λ) − (2)(2) = (3 − λ)² − 4
= 9 − 6λ + λ² − 4 = λ² − 6λ + 5
Characteristic equation:
λ² − 6λ + 5 = 0
Factoring it, we get:
(λ − 5)(λ − 1) = 0
λ = 5, λ = 1
The eigenvalues for the given matrix are 5 and 1.
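The result can be double-checked numerically. Assuming NumPy is available, the sketch below computes the eigenvalues of the same matrix directly:

```python
import numpy as np

# The matrix from the worked example
A = np.array([[3.0, 2.0],
              [2.0, 3.0]])

eigs = np.sort(np.linalg.eigvals(A))
assert np.allclose(eigs, [1.0, 5.0])  # matches lambda = 1 and lambda = 5
```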
In this section, we will use the steps mentioned in the previous segment to find the eigenvalues of a 3 × 3 matrix. Let's consider a triangular matrix A with diagonal entries 2, 3, and 5.
Characteristic equation: det(A − λI) = 0
Subtracting λ from each diagonal entry, we get A − λI.
Now, the determinant (for a triangular matrix, the product of the diagonal entries):
det(A − λI) = (2 − λ)(3 − λ)(5 − λ)
Solving for λ: (2 − λ)(3 − λ)(5 − λ) = 0
λ = 2, 3, 5
The eigenvalues of matrix A are:
λ = 2, 3, 5
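This can be confirmed numerically. The sketch below assumes NumPy; the off-diagonal entries are arbitrary assumptions, since any triangular matrix with diagonal entries 2, 3, 5 has the same eigenvalues:

```python
import numpy as np

# An illustrative upper triangular matrix with diagonal entries 2, 3, 5
# (the off-diagonal values 1.0 and 4.0 are arbitrary choices)
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 5.0]])

eigs = np.sort(np.linalg.eigvals(A))
assert np.allclose(eigs, [2.0, 3.0, 5.0])
```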
Eigenvalues provide information about the structure and behavior of different systems across various disciplines like engineering, data science, and more. Some real-world applications of eigenvalues are as follows:
Mechanical Vibrations
Eigenvalues determine natural frequencies of systems, crucial in engineering designs.
Principal Component Analysis (PCA)
In data science, eigenvalues help identify principal components for dimensionality reduction.
Quantum Mechanics
Eigenvalues represent observable quantities like energy levels.
Stability Analysis
In control systems, eigenvalues indicate system stability.
Facial Recognition
The eigenfaces technique is a popular method in facial recognition that uses eigenvalues and eigenvectors for image recognition.
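As a small illustration of the PCA application, the sketch below (assuming NumPy, with synthetic toy data) computes the eigenvalues of a covariance matrix and shows that the largest eigenvalue captures almost all of the variance in strongly correlated data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples of 2 strongly correlated features
x = rng.normal(size=200)
data = np.column_stack([x, 2 * x + rng.normal(scale=0.1, size=200)])

# PCA via eigendecomposition of the covariance matrix
cov = np.cov(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Each eigenvalue is the variance along its principal component;
# here one direction dominates, so the data is nearly one-dimensional
explained = eigenvalues / eigenvalues.sum()
assert explained.max() > 0.95
```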
Eigenvalues play a crucial role in linear algebra, so it is important to learn to solve problems involving them. However, students often make mistakes when working with them. This section highlights the most common mistakes to help you avoid them.
Jaskaran Singh Saluja is a math wizard with nearly three years of experience as a math teacher. His expertise is in algebra, so he can make algebra classes interesting by turning tricky equations into simple puzzles.
He loves running algebra quizzes with kids to make them love the subject.