Orthogonal Matrix

A square matrix is said to be an orthogonal matrix when its product with its transpose gives the identity matrix. This is the general definition of an orthogonal matrix. Before discussing it in detail, let us first recall what a matrix is. A matrix is a rectangular array of numbers, expressions or symbols arranged in rows and columns. Let us see an example of a matrix:

\(\begin{bmatrix} 2 & 3 & 4\\ 4 & 5 & 6 \end{bmatrix}\)

In the above matrix, you can see there are 2 rows and 3 columns. In general, a matrix is written in the following format:

\(\begin{bmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1n}\\ a_{21} & a_{22} & a_{23} & \cdots & a_{2n}\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ a_{m1} & a_{m2} & a_{m3} & \cdots & a_{mn} \end{bmatrix}\)

where m is the number of rows and n is the number of columns, and the \(a_{ij}\) are its elements, with i = 1, 2, 3, …, m and j = 1, 2, 3, …, n.

If m = n, that is, the number of rows equals the number of columns, then the matrix is called a square matrix.

For example, \(\begin{bmatrix} 2 & 4 & 6\\ 1 & 3 & -5\\ -2 & 7 & 9 \end{bmatrix}\)

This is a square matrix, which has 3 rows and 3 columns.

In the same way, there are many concepts related to matrices and many types of matrices, such as the row matrix, column matrix, rectangular matrix, diagonal matrix, scalar matrix, zero or null matrix, unit or identity matrix, and upper and lower triangular matrices, along with their properties, all of which are used in linear algebra. An orthogonal matrix is a special kind of square matrix. In this article, a brief explanation of the orthogonal matrix is given with its definition and properties. Let us begin learning.

Orthogonal Matrix Definition

We know that a square matrix has an equal number of rows and columns. A square matrix with real elements is said to be an orthogonal matrix if its transpose is equal to its inverse. Equivalently, when the product of a square matrix and its transpose gives the identity matrix, the square matrix is called an orthogonal matrix.

Suppose A is an n × n square matrix with real elements and \(A^{T}\) is the transpose of A. Then, according to the definition, if \(A^{T} = A^{-1}\) is satisfied, then

\(AA^{T} = A^{T}A = I\)

where ‘I’ is the n × n identity matrix and \(A^{-1}\) is the inverse of matrix A.
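
For concreteness, here is a minimal Python sketch (assuming NumPy is available; the helper name is_orthogonal is our own, not a library function) that checks the defining property \(AA^{T} = I\) numerically:

import numpy as np

def is_orthogonal(A, tol=1e-10):
    """Return True if A is square and A @ A.T is (numerically) the identity."""
    A = np.asarray(A, dtype=float)
    if A.ndim != 2 or A.shape[0] != A.shape[1]:
        return False  # orthogonality is defined only for square matrices
    return np.allclose(A @ A.T, np.eye(A.shape[0]), atol=tol)

# A permutation matrix is a simple example of an orthogonal matrix.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
print(is_orthogonal(P))  # True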

Orthogonal Matrix Properties

  • This property of orthogonality applies only to square matrices.
  • An orthogonal matrix, as defined here, has all real elements.
  • An orthogonal matrix is not necessarily symmetric; for example, a rotation matrix is orthogonal but, in general, not symmetric.
  • Every identity matrix is an orthogonal matrix.
  • The product of two orthogonal matrices is also an orthogonal matrix (see the sketch after this list).
  • The collection of orthogonal matrices of order n × n forms a group, called the orthogonal group, denoted O(n).
  • The transpose of an orthogonal matrix is also orthogonal. Thus, if matrix A is orthogonal, then \(A^{T}\) is also an orthogonal matrix.
  • In the same way, the inverse of an orthogonal matrix, \(A^{-1}\), is also an orthogonal matrix.
  • The determinant of an orthogonal matrix is ±1.
  • Every eigenvalue of an orthogonal matrix has absolute value 1; its real eigenvalues can only be +1 or −1, and eigenvectors corresponding to distinct eigenvalues are orthogonal to each other.
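
As an illustration, the following short Python sketch (NumPy assumed; rot is a hypothetical helper for 2 × 2 rotation matrices) numerically checks a few of the properties listed above:

import numpy as np

def rot(theta):
    """2 x 2 rotation matrix, which is orthogonal with determinant +1."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

Q = rot(0.7)                                # rotation: det = +1
R = np.array([[1.0, 0.0],
              [0.0, -1.0]])                 # reflection: det = -1
I = np.eye(2)

assert np.allclose(Q @ Q.T, I)              # Q is orthogonal
assert np.allclose((Q @ R) @ (Q @ R).T, I)  # product of orthogonal matrices is orthogonal
assert np.allclose(Q.T @ Q, I)              # the transpose (= inverse) is orthogonal too
print(np.linalg.det(Q), np.linalg.det(R))   # approximately +1.0 and -1.0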

Determinant of Orthogonal Matrix

The determinant of a matrix is a number associated with that matrix. The determinant of a square matrix is written between vertical bars. Let Q be a 2 × 2 square matrix with real elements:

Q = \(\begin{bmatrix} a_{1} & a_{2} \\ b_{1} & b_{2} \end{bmatrix}\)

Then its determinant is |Q| = \(\begin{vmatrix} a_{1} & a_{2} \\ b_{1} & b_{2} \end{vmatrix}\)

|Q| = \(a_{1}b_{2} - a_{2}b_{1}\)

If Q is an orthogonal matrix, then,

|Q| = ±1

Therefore, the determinant of an orthogonal matrix is always either +1 or −1.
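
As a quick numerical illustration (Python with NumPy assumed), the 2 × 2 formula above gives +1 for a rotation matrix, consistent with the ±1 rule:

import numpy as np

theta = 0.5
Q = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

a1, a2 = Q[0]
b1, b2 = Q[1]
det_by_formula = a1 * b2 - a2 * b1
print(det_by_formula)      # 1.0, since cos^2(theta) + sin^2(theta) = 1
print(np.linalg.det(Q))    # matches, and is +1 or -1 for any orthogonal matrix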

Let us see some examples of an orthogonal matrix.

Example: Prove that Q = \(\begin{bmatrix} \cos Z & \sin Z \\ -\sin Z & \cos Z \end{bmatrix}\) is an orthogonal matrix.

Solution: Given, Q = \(\begin{bmatrix} \cos Z & \sin Z \\ -\sin Z & \cos Z \end{bmatrix}\)

So, \(Q^{T} = \begin{bmatrix} \cos Z & -\sin Z \\ \sin Z & \cos Z \end{bmatrix}\)

Now let us find \(Q^{-1}\).

\(Q^{-1} = \frac{\text{Adj}(Q)}{|Q|}\)

\(Q^{-1} = \frac{1}{\cos^{2}Z + \sin^{2}Z}\begin{bmatrix} \cos Z & -\sin Z\\ \sin Z & \cos Z \end{bmatrix}\)

Since \(\cos^{2}Z + \sin^{2}Z = 1\),

\(Q^{-1} = \begin{bmatrix} \cos Z & -\sin Z \\ \sin Z & \cos Z \end{bmatrix}\)

Since \(Q^{T} = Q^{-1}\),

Therefore, Q is an orthogonal matrix.
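
The same conclusion can be checked numerically; here is a small Python sketch (NumPy assumed) that verifies \(Q^{T} = Q^{-1}\) for an arbitrary angle Z:

import numpy as np

Z = 1.2  # any angle works
Q = np.array([[np.cos(Z),  np.sin(Z)],
              [-np.sin(Z), np.cos(Z)]])

print(np.allclose(Q.T, np.linalg.inv(Q)))  # True: Q^T = Q^-1
print(np.allclose(Q @ Q.T, np.eye(2)))     # True: Q Q^T = I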

