A skew symmetric matrix is a square matrix whose transpose is equal to its negative. In this type of matrix, the diagonal elements are always zero. This article explains skew symmetric matrices in detail.
In linear algebra, a symmetric matrix is defined by its symmetry along the main diagonal. A square matrix A is symmetric if A = A^T (the transpose of A). In other words, the value at row i and column j equals the value at row j and column i, i.e., A[i, j] = A[j, i].
Matrix A is a skew symmetric matrix if A = -A^T, where A^T is the transpose of A. If A = [aij]n is a skew symmetric matrix, then aij = -aji. This means that all the diagonal elements of a skew symmetric matrix are zero.
Now, let’s learn how to represent skew symmetric matrices. Let B = [bij]n be an n × n matrix. Matrix B is skew symmetric if bij = -bji for all 1 ≤ i, j ≤ n, where n is a natural number and bij is the element at the i-th row and j-th column.
All the diagonal elements of a skew symmetric matrix are always zero, because setting i = j in the condition gives bii = -bii.
Adding bii to both sides: bii + bii = 0
2bii = 0
bii = 0
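To make the definition concrete, here is a minimal sketch in Python (using NumPy, which the article itself does not assume) that checks whether a matrix equals the negative of its transpose and confirms that its diagonal entries are zero. The example matrix is chosen only for illustration.

```python
import numpy as np

def is_skew_symmetric(M, tol=1e-9):
    """Return True if M equals the negative of its transpose."""
    M = np.asarray(M, dtype=float)
    if M.shape[0] != M.shape[1]:   # a skew symmetric matrix must be square
        return False
    return np.allclose(M, -M.T, atol=tol)

B = np.array([[ 0,  2, -4],
              [-2,  0,  3],
              [ 4, -3,  0]])

print(is_skew_symmetric(B))   # True
print(np.diag(B))             # [0 0 0] -- the diagonal elements are zero
```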
Skew symmetric matrices follow certain properties that make the concept useful in algebra, physics, and other fields. In this section, we will learn the properties of skew symmetric matrices.
In this section, we will be discussing the theorems on skew symmetric matrices. These theorems are useful in matrix decomposition, transformations, and applications.
Theorem 1: For a square matrix A, A - A^T is skew symmetric
For any square matrix A, we will prove that A - A^T is skew symmetric.
We can take help from the following transpose properties: (X - Y)^T = X^T - Y^T and (X^T)^T = X.
Let B = A - A^T. To show that B is skew symmetric, we need to prove that B^T = -B.
Taking the transpose on both sides:
B^T = (A - A^T)^T
Now, apply the transpose of a difference property:
B^T = A^T - (A^T)^T
We know that (A^T)^T = A
Therefore, B^T = A^T - A
B^T = -(A - A^T)
B^T = -B
We can now say that B = A - A^T is skew symmetric.
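As a quick numerical check of Theorem 1, the sketch below (again assuming NumPy, with an arbitrary example matrix) forms B = A - A^T and verifies that B^T = -B.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(3, 3)).astype(float)  # arbitrary square matrix

B = A - A.T                      # Theorem 1: B = A - A^T
print(np.allclose(B.T, -B))      # True -> B is skew symmetric
print(np.diag(B))                # diagonal entries are all zero
```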
Theorem 2: Decomposing a Square Matrix
Like all matrices, square matrices encode complex relationships. Decomposing a square matrix into simpler components helps us understand its properties and structure. In this theorem, we will prove that any square matrix can be expressed as the sum of a symmetric matrix and a skew symmetric matrix. This decomposition is significant in applications like physics.
In this theorem, we will be using the following properties: (X + Y)^T = X^T + Y^T, (X - Y)^T = X^T - Y^T, and (X^T)^T = X.
If A is a square matrix, it can be written as:
A = ½(A + A^T) + ½(A - A^T)
Let’s consider,
P = ½(A + A^T)
Q = ½(A - A^T)
Taking the transpose of both P and Q
P^T = (½(A + A^T))^T
= ½(A^T + (A^T)^T)
= ½(A^T + A)
= ½(A + A^T) = P
So, P^T = P, which means P is symmetric.
Q^T = (½(A - A^T))^T
= ½(A^T - (A^T)^T)
= ½(A^T - A)
= -½(A - A^T)
= -Q
So, Q^T = -Q, which means Q is skew symmetric.
So, the square matrix A can be written as the sum of a symmetric matrix P and a skew symmetric matrix Q.
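The decomposition in Theorem 2 can be verified numerically. The sketch below (assuming NumPy and an arbitrary example matrix) builds P = ½(A + A^T) and Q = ½(A - A^T) and checks that P is symmetric, Q is skew symmetric, and A = P + Q.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-5, 6, size=(3, 3)).astype(float)  # arbitrary square matrix

P = 0.5 * (A + A.T)   # symmetric part
Q = 0.5 * (A - A.T)   # skew symmetric part

print(np.allclose(P, P.T))    # True -> P is symmetric
print(np.allclose(Q, -Q.T))   # True -> Q is skew symmetric
print(np.allclose(A, P + Q))  # True -> A = P + Q
```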
Theorem 3: For a skew symmetric matrix A and any matrix B of compatible dimensions, the matrix B^T A B is skew symmetric.
Let’s take the help of the following properties: (XY)^T = Y^T X^T and (X^T)^T = X.
In this theorem, we shall prove that (B^T A B)^T = -B^T A B.
Start with:
(B^T A B)^T = (B^T (AB))^T
Now use the transpose of a product rule:
= (AB)^T (B^T)^T
Since (AB)^T = B^T A^T and (B^T)^T = B, we get:
(B^T A B)^T = B^T A^T B
Since A is skew symmetric, A^T = -A, so we can make the substitution.
Therefore, (B^T A B)^T = B^T (-A) B
(B^T A B)^T = -B^T A B
So, if A is a skew symmetric matrix, then B^T A B is a skew symmetric matrix.
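A small numerical check of Theorem 3 is sketched below, assuming NumPy; the skew symmetric matrix A and the rectangular matrix B are chosen only for illustration.

```python
import numpy as np

A = np.array([[ 0,  2, -4],
              [-2,  0,  3],
              [ 4, -3,  0]], dtype=float)           # skew symmetric

rng = np.random.default_rng(2)
B = rng.integers(-3, 4, size=(3, 2)).astype(float)  # any matrix with 3 rows

M = B.T @ A @ B
print(np.allclose(M.T, -M))   # True -> B^T A B is skew symmetric
```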
According to Theorem 2 above, any square matrix can be written as the sum of a symmetric matrix and a skew symmetric matrix.
Determinant of Skew Symmetric Matrix
For any skew symmetric matrix with an odd order, the determinant is always zero. Let’s verify this with an example:
Consider the skew symmetric matrix
A =
[ 0   2  -4 ]
[ -2  0   3 ]
[ 4  -3   0 ]
Finding the determinant of A by expanding along the first row:
|A| = a11 × C11 + a12 × C12 + a13 × C13, where Cij is the cofactor of the element aij.
So, |A| = 0 × C11 + 2 × C12 + (-4) × C13
|A| = 2 × C12 + (-4) × C13
Finding cofactor C12 by eliminating row 1 and column 2, the remaining 2 × 2 minor is:
[ -2  3 ]
[  4  0 ]
det = (-2)(0) - (3)(4) = 0 - 12 = -12
So, C12 = (-1)^(1+2) × (-12)
= -1 × -12 = 12
Finding cofactor C13 by eliminating row 1 and column 3, the remaining 2 × 2 minor is:
[ -2   0 ]
[  4  -3 ]
det = (-2)(-3) - (0)(4) = 6 - 0 = 6
So, C13 = (-1)^(1+3) × 6
= 1 × 6 = 6
So, |A| = 2 × C12 + (-4) × C13
= 2 × 12 + (-4) × 6
= 24 - 24
= 0
So, the determinant of a skew symmetric matrix with an odd order is 0.
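The same result can be checked numerically. The sketch below (assuming NumPy) computes the determinant of the 3 × 3 skew symmetric matrix used above; it comes out as zero up to floating-point rounding.

```python
import numpy as np

A = np.array([[ 0,  2, -4],
              [-2,  0,  3],
              [ 4, -3,  0]], dtype=float)   # odd-order (3 x 3) skew symmetric matrix

print(np.linalg.det(A))   # 0.0 (up to rounding), matching the cofactor expansion above
```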
For any skew symmetric matrix, the eigenvalues are always zero or purely imaginary. For a skew symmetric matrix A, an eigenvalue λ of A with a corresponding eigenvector x satisfies:
Ax = λx
Before we proceed further, let us understand what an eigenvalue and an eigenvector are. Usually, a vector changes direction when a matrix acts on it. But an eigenvector does not change direction; it only gets stretched or shrunk. The factor by which it gets stretched or shrunk is known as the eigenvalue.
Multiplying both sides on the left by the conjugate transpose of x, written x̄^T:
x̄^T A x = λ x̄^T x = λ ||x||^2
Here,
x̄^T A x is a scalar (a 1 × 1 matrix), so it equals its own transpose: x̄^T A x = (x̄^T A x)^T = x^T A^T x̄
As A is skew symmetric, A^T = -A
Substituting A^T = -A:
x^T A^T x̄ = x^T (-A) x̄ = -x^T A x̄
Taking the complex conjugate of Ax = λx (A is real), we get
A x̄ = λ̄ x̄
Now, we have
-x^T A x̄ = -λ̄ x^T x̄ = -λ̄ ||x||^2
Combining the two expressions for x̄^T A x: λ ||x||^2 = -λ̄ ||x||^2
As ||x||^2 ≠ 0, we get
λ = -λ̄
The value of λ is therefore either 0 or a purely imaginary number.
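To see this property numerically, the sketch below (assuming NumPy) computes the eigenvalues of the 3 × 3 skew symmetric matrix used earlier and checks that their real parts are zero.

```python
import numpy as np

A = np.array([[ 0,  2, -4],
              [-2,  0,  3],
              [ 4, -3,  0]], dtype=float)   # skew symmetric

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)                        # 0 and a conjugate pair, roughly ±5.385j
print(np.allclose(eigenvalues.real, 0))   # True -> real parts are (numerically) zero
```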
In real life, skew symmetric matrices are used in fields like physics, mathematics, and computer graphics. In this section, we will learn some real-life applications of skew symmetric matrices.
Students often make errors when working with skew symmetric matrices. Here are some of the common mistakes and ways to avoid them.