
A skew symmetric matrix is a square matrix whose transpose is equal to its negative. In such a matrix, the diagonal elements are always zero. This article explains skew symmetric matrices in detail.
In linear algebra, a symmetric matrix is defined by its symmetry along the main diagonal. If a square matrix satisfies \(A = A^T\) (where \(A^T\) is the transpose of A), then the matrix is symmetric. In other words, the value at row i and column j equals the value at row j and column i, i.e., A[i, j] = A[j, i].
Matrix A is a skew symmetric matrix if \(A = -A^T\), where \(A^T\) is the transpose of A. If \(A = [a_{ij}]_n\) is a skew symmetric matrix, then \(a_{ij} = -a_{ji}\). A direct consequence is that all diagonal elements of a skew symmetric matrix are zero.
Now, let’s learn how to represent skew symmetric matrices. Let \(B = [b_{ij}]_n\) be an n × n matrix. Matrix B is skew symmetric if \(b_{ij} = -b_{ji}\) for all 1 ≤ i, j ≤ n, where n is a natural number and \(b_{ij}\) is the element at the i-th row and j-th column.
All the diagonal elements of a skew symmetric matrix are always zero, because for i = j the condition gives \(b_{ii} = -b_{ii}\).
Adding \(b_{ii}\) to both sides: \(b_{ii} + b_{ii} = 0\)
\(2b_{ii} = 0\)
\(b_{ii} = 0\)
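As a quick numerical illustration of this definition, here is a minimal Python/NumPy sketch (the helper name is_skew_symmetric and the sample matrix are our own, not from the article) that checks the condition \(B^T = -B\) and confirms the diagonal is zero:

```python
import numpy as np

def is_skew_symmetric(M, tol=1e-12):
    """Return True if M is square and equals the negative of its transpose."""
    M = np.asarray(M, dtype=float)
    return M.ndim == 2 and M.shape[0] == M.shape[1] and np.allclose(M.T, -M, atol=tol)

# A sample skew symmetric matrix: b_ij = -b_ji, so every diagonal entry is 0.
B = np.array([[ 0,  2, -4],
              [-2,  0,  3],
              [ 4, -3,  0]])

print(is_skew_symmetric(B))  # True
print(np.diag(B))            # [0 0 0] -- the diagonal is forced to be zero
```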
Skew symmetric matrices follow certain properties that make the concept useful in algebra, physics, and other fields. Some key properties of skew symmetric matrices are:
All diagonal elements of a skew symmetric matrix are zero.
The sum or difference of two skew symmetric matrices of the same order is skew symmetric.
A scalar multiple of a skew symmetric matrix is skew symmetric.
For any square matrix A, \(A - A^T\) is skew symmetric and \(A + A^T\) is symmetric.
The determinant of a skew symmetric matrix of odd order is zero.
The eigenvalues of a real skew symmetric matrix are either zero or purely imaginary.
In this section, we will be discussing the theorems on skew symmetric matrices. These theorems are useful in matrix decomposition, transformations, and applications.
Theorem 1: For a square matrix A, \(A - A^T\) is skew symmetric
For any square matrix A, we need to prove that \(A - A^T\) is skew symmetric.
We will use the following transpose properties: \((X - Y)^T = X^T - Y^T\) and \((A^T)^T = A\).
Let \(B = A - A^T\). To show that B is skew symmetric, we must show that \(B^T = -B\).
Taking the transpose on both sides:
\(B^T = (A - A^T)^T\)
\(B^T = A^T - (A^T)^T\)
We know that \((A^T)^T = A\).
Therefore, \(B^T = A^T - A\)
\(B^T = -(A - A^T)\)
\(B^T = -B\)
We can now say that \(B = A - A^T\) is skew symmetric.
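To sanity-check Theorem 1 numerically, the short sketch below (using an arbitrary sample matrix of our own choosing) forms \(A - A^T\) and verifies that it equals the negative of its transpose:

```python
import numpy as np

# Any square matrix works; this one is an arbitrary example.
A = np.array([[1, 4, 7],
              [2, 5, 8],
              [3, 6, 9]])

B = A - A.T                  # Theorem 1: B should be skew symmetric
print(np.allclose(B.T, -B))  # True
```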
Theorem 2: Decomposing a Square Matrix
Like all matrices, square matrices encode complex relationships. Decomposing a square matrix into simpler components helps us understand its properties and structure. In this theorem, we will prove that any square matrix can be expressed as the sum of a symmetric matrix and a skew symmetric matrix. This decomposition is significant in applications such as physics.
In this theorem, we will be using the following properties: \((X + Y)^T = X^T + Y^T\), \((cX)^T = cX^T\), and \((A^T)^T = A\).
If A is a square matrix, it can be written as:
\(A = \frac{1}{2}(A + A^T) + \frac{1}{2}(A - A^T)\)
Let’s consider
\(P = \frac{1}{2}(A + A^T)\)
\(Q = \frac{1}{2}(A - A^T)\)
Taking the transpose of both P and Q:
\(P^T = \left(\frac{1}{2}(A + A^T)\right)^T\)
\(= \frac{1}{2}(A^T + (A^T)^T)\)
\(= \frac{1}{2}(A^T + A)\)
\(= P\)
So, \(P^T = P\), which means P is symmetric.
\(Q^T = \left(\frac{1}{2}(A - A^T)\right)^T\)
\(= \frac{1}{2}(A^T - (A^T)^T)\)
\(= \frac{1}{2}(A^T - A)\)
\(= -\frac{1}{2}(A - A^T)\)
\(= -Q\)
So, \(Q^T = -Q\), which means Q is skew symmetric.
So, the square matrix A can be written as the sum of a symmetric matrix P and a skew symmetric matrix Q.
Theorem 3: For a skew symmetric matrix A and any matrix B, the matrix \(B^T A B\) is skew symmetric.
We will use the following properties: \((XY)^T = Y^T X^T\), \((B^T)^T = B\), and \(A^T = -A\) (since A is skew symmetric).
In this theorem, we shall prove that \((B^T A B)^T = -B^T A B\).
Start with:
\((B^T A B)^T = (B^T (AB))^T\)
Now use the transpose-of-a-product rule:
\(= (AB)^T (B^T)^T\)
Since \((AB)^T = B^T A^T\) and \((B^T)^T = B\), we get:
\((B^T A B)^T = B^T A^T B\)
Since \(A^T = -A\), we can make the substitution.
Therefore, \((B^T A B)^T = B^T(-A)B\)
\((B^T A B)^T = -B^T A B\)
So, if A is a skew symmetric matrix, then \(B^T A B\) is a skew symmetric matrix.
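A quick numerical check of Theorem 3, assuming NumPy and randomly generated matrices of our own choosing: build a skew symmetric A, pick any B of compatible size, and verify that \(B^T A B\) satisfies the skew symmetry condition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a skew symmetric A from a random matrix R (by Theorem 1, R - R^T is skew symmetric).
R = rng.random((3, 3))
A = R - R.T

# B can be any matrix of compatible size (here 3 x 3 as well).
B = rng.random((3, 3))

M = B.T @ A @ B              # Theorem 3 says M is skew symmetric
print(np.allclose(M.T, -M))  # True
```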
According to the decomposition theorem above, any square matrix can be expressed as the sum of a symmetric matrix and a skew symmetric matrix.
For example, for a square matrix A,
A = \(\begin{bmatrix} 1 & 4 & 7 \\[0.3em] 2 & 5 & 8 \\[0.3em] 3 & 6 &9 \end{bmatrix}\)
The matrix A can be represented as the sum of B (a symmetric matrix) and C (a skew symmetric matrix):
B = ½ (A + AT) = \(\begin{bmatrix} 1 & 3 &5 \\[0.3em] 3& 5 & 7\\[0.3em] 5 & 7&9 \end{bmatrix}\)
C = ½(A - AT) = \(\begin{bmatrix} 0& 1&2\\[0.3em] -1& 0 & 1\\[0.3em] -2 & -1 & 0 \end{bmatrix}\)
Adding B and C
B + C = \(\begin{bmatrix} 1& 4&7\\[0.3em] 2& 5 &8\\[0.3em] 3 & 6& 9 \end{bmatrix}\)
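The same decomposition can be reproduced numerically. The sketch below is a minimal NumPy check using the matrix A from this example, recovering B and C as the symmetric and skew symmetric parts:

```python
import numpy as np

A = np.array([[1, 4, 7],
              [2, 5, 8],
              [3, 6, 9]])

B = 0.5 * (A + A.T)   # symmetric part
C = 0.5 * (A - A.T)   # skew symmetric part

print(np.allclose(B, B.T))    # True: B is symmetric
print(np.allclose(C, -C.T))   # True: C is skew symmetric
print(np.allclose(B + C, A))  # True: B + C reconstructs A
```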
Determinant of Skew Symmetric Matrix
For any skew symmetric matrix with an odd order, the determinant is always zero. Let’s verify this with an example:
A = \(\begin{bmatrix} 0&2&-4\\[0.3em] -2& 0 & 3\\[0.3em] 4 & -3 & 0 \end{bmatrix}\)
Finding the determinant of matrix A by expanding along the first row:
\(|A| = a_{11}C_{11} + a_{12}C_{12} + a_{13}C_{13}\)
So, \(|A| = 0 \times C_{11} + 2 \times C_{12} + (-4) \times C_{13}\)
\(|A| = 2 \times C_{12} + (-4) \times C_{13}\)
Finding cofactor \(C_{12}\), eliminating row 1 and column 2:
\(\begin{bmatrix} -2 & 3 \\[0.3em] 4 & 0 \end{bmatrix}\)
det = (-2)(0) - (4)(3) = 0 - 12 = -12
So, \(C_{12} = (-1)^{1+2} \times (-12)\)
= -1 × -12 = 12
Finding cofactor \(C_{13}\), eliminating row 1 and column 3:
\(\begin{bmatrix} -2 & 0 \\[0.3em] 4 & -3 \end{bmatrix}\)
det = (-2)(-3) - (4)(0) = 6 - 0 = 6
So, \(C_{13} = (-1)^{1+3} \times 6\)
= 1 × 6 = 6
So, \(|A| = 2 \times C_{12} + (-4) \times C_{13}\)
= 2 × 12 + (-4) × 6
= 24 - 24
= 0
So, the determinant of a skew symmetric matrix with an odd order is 0.
In compact form, for \( A = \begin{bmatrix} 0 & 2 & -4 \\ -2 & 0 & 3 \\ 4 & -3 & 0 \end{bmatrix} \):
\(|A| = a_{12}C_{12} + a_{13}C_{13} = 2 \times 12 + (-4) \times 6 = 24 - 24 = 0\)
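As a further check, the determinant of this odd-order example can be computed numerically. A minimal NumPy sketch follows; the comment restates the standard argument for why the determinant must vanish.

```python
import numpy as np

A = np.array([[ 0,  2, -4],
              [-2,  0,  3],
              [ 4, -3,  0]])

# det(A) = det(A^T) = det(-A) = (-1)^n det(A); for odd n this forces det(A) = 0.
print(np.linalg.det(A))  # 0.0 (up to floating-point rounding)
```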
For any real skew symmetric matrix, every eigenvalue is either zero or purely imaginary. For a skew symmetric matrix A, an eigenvalue \(\lambda\) of A with a corresponding eigenvector x satisfies:
\(Ax = \lambda x\)
Before we proceed further, let us understand what an eigenvalue and an eigenvector are. When a matrix acts on a vector, the vector generally changes direction. An eigenvector, however, does not change direction; it only gets stretched or shrunk. The factor by which it is stretched or shrunk is the eigenvalue.
Multiplying both sides on the left by the conjugate transpose of x, written \(\bar{x}^T\):
\(\bar{x}^T A x = \lambda \bar{x}^T x = \lambda \|x\|^2\)
Now evaluate the same quantity a second way. Since \(\bar{x}^T A x\) is a scalar (a 1 × 1 matrix), it equals its own transpose:
\(\bar{x}^T A x = (\bar{x}^T A x)^T = x^T A^T \bar{x}\)
As A is skew symmetric, \(A^T = -A\). Substituting \(A^T = -A\):
\(x^T A^T \bar{x} = -x^T A \bar{x}\)
Taking the complex conjugate of \(Ax = \lambda x\) (A is real), we get \(A\bar{x} = \bar{\lambda}\bar{x}\). Therefore:
\(-x^T A \bar{x} = -\bar{\lambda} x^T \bar{x} = -\bar{\lambda}\|x\|^2\)
Equating the two expressions:
\(-\bar{\lambda}\|x\|^2 = \lambda \|x\|^2\)
As \(\|x\|^2 \neq 0\), we get
\(\lambda = -\bar{\lambda}\)
This means the real part of \(\lambda\) is zero, so \(\lambda\) is either 0 or a purely imaginary number.
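This result is easy to observe numerically. The sketch below (reusing the 3 × 3 example from the determinant section) computes the eigenvalues with NumPy and confirms that their real parts are zero:

```python
import numpy as np

A = np.array([[ 0,  2, -4],
              [-2,  0,  3],
              [ 4, -3,  0]])

eigvals = np.linalg.eigvals(A)
print(eigvals)                       # approximately [0, +5.385j, -5.385j]
print(np.allclose(eigvals.real, 0))  # True: every eigenvalue is 0 or purely imaginary
```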
Skew-symmetric matrices are an important concept in linear algebra, especially useful in physics, computer graphics, and engineering. Here are some simple tips to help students master them with confidence.
Students often make errors when working with skew symmetric matrices. Here are some of the common mistakes and ways to avoid them.
In real life, skew symmetric matrices are used in fields like physics, mathematics, and computer graphics; for example, the cross product with a fixed vector can be written as multiplication by a skew symmetric matrix, which is how angular velocity and rotations are commonly represented. In this section, we will learn some real-life applications of skew symmetric matrices.
Check if the matrix is skew symmetric: \( A = \begin{bmatrix} 0 & 2 \\ -2 & 0 \end{bmatrix} \)
The matrix A is skew symmetric
To verify that a matrix is skew symmetric, we check whether \(A^\top = -A\).
Finding the transpose of A: \( A^\top = \begin{bmatrix} 0 & -2 \\ 2 & 0 \end{bmatrix} \)
Finding -A:
\( -A = \begin{bmatrix} 0 & -2 \\ 2 & 0 \end{bmatrix} \)
Here, \(A^\top = -A\).
So, A is a skew symmetric matrix
Is the matrix skew symmetric: \( B = \begin{bmatrix} 0 & 3 & 1 \\ -3 & 0 & -5 \\ -1 & 5 & 0 \end{bmatrix} \)
The matrix B is skew symmetric.
To check whether the matrix is skew symmetric, we check if \(B^\top = -B\).
Finding \( B^\top = \begin{bmatrix} 0 & -3 & -1 \\ 3 & 0 & 5 \\ 1 & -5 & 0 \end{bmatrix} \)
Finding \( -B = \begin{bmatrix} 0 & -3 & -1 \\ 3 & 0 & 5 \\ 1 & -5 & 0 \end{bmatrix} \)
Here, \(B^\top = -B\), so B is a skew symmetric matrix.
Find the transpose of the matrix \( C = \begin{bmatrix} 0 & -8 & 7 \\ 8 & 0 & 1 \\ -7 & -1 & 0 \end{bmatrix} \)
The transpose of the matrix C is:
\( C^\top = \begin{bmatrix} 0 & 8 & -7 \\ -8 & 0 & -1 \\ 7 & 1 & 0 \end{bmatrix} \)
To find the transpose of a matrix, we turn the rows into columns.
The first row becomes the first column, so here the first column is 0, -8, 7.
The second row becomes the second column, so here the second column is 8, 0, 1.
The third row becomes the third column, so here the third column is -7, -1, 0.
Find the determinant of the skew symmetric matrix: \( M = \begin{bmatrix} 0 & 4 & 7 \\ -4 & 0 & 2 \\ -7 & -2 & 0 \end{bmatrix} \)
The determinant of the skew symmetric matrix M is 0.
A skew symmetric matrix of odd order always has determinant zero, and M is a 3 × 3 skew symmetric matrix.
Check if the matrix D is skew symmetric: \( D = \begin{bmatrix} 0 & 6 \\ 6 & 0 \end{bmatrix} \)
D is not skew symmetric.
To check whether the matrix is skew symmetric, we verify if \(D^\top = -D\).
Here, \( D^\top = \begin{bmatrix} 0 & 6 \\ 6 & 0 \end{bmatrix} \)
\( -D = \begin{bmatrix} 0 & -6 \\ -6 & 0 \end{bmatrix} \)
As \(D^\top\) is not equal to -D, it is not a skew symmetric matrix.






