Important Matrices and Determinants Formulas for JEE Main and Advanced

Any rectangular arrangement of numbers in m rows and n columns is called a matrix of order m×n. Matrices and Determinants is an important topic for the JEE exam, and students can expect 2-3 questions from it. The formulas below will help students make a quick revision before the exam.

Matrices and Determinants Formulas

Matrices

1. Definition: Any rectangular arrangement of numbers in m rows and n columns is called a matrix of order m×n.

A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1j} & \cdots & a_{1n}\\ a_{21} & a_{22} & a_{23} & \cdots & a_{2j} & \cdots & a_{2n}\\ \vdots & \vdots & \vdots & & \vdots & & \vdots\\ a_{m1} & a_{m2} & a_{m3} & \cdots & a_{mj} & \cdots & a_{mn} \end{bmatrix}

Here aij denotes the element in the ith row and jth column. The above matrix is denoted as [aij]m×n. The elements a11, a22, a33, etc. (of a square matrix) are called diagonal elements, and their sum is called the trace of A, denoted by Tr(A).
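
For instance, with arbitrarily chosen entries, A = \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} is a matrix of order 2×2; its diagonal elements are a11 = 1 and a22 = 4, so Tr(A) = 1 + 4 = 5.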

2. Basic Definitions

(i) Row matrix: A matrix having one row is called a row matrix.

(ii) Column matrix: A matrix having one column is called a column matrix.

(iii) Square matrix: A matrix of order m×n is called square matrix if m = n.

(iv) Zero matrix: A = [aij]m×n is called a zero matrix, if aij = 0 for all i and j.

(v) Upper triangular matrix: A = [aij]m×n is said to be upper triangular, if aij= 0 for i > j.

(vi) Lower triangular matrix: A = [aij]m×n is said to be lower triangular, if aij = 0 for i < j.

(vii) Diagonal matrix: A square matrix [aij]n is said to be diagonal, if aij = 0 for i ≠ j.

(viii) Scalar matrix: A diagonal matrix A = [aij]n is said to be scalar, if aij = k (a fixed constant) for i = j.

(ix) Unit matrix (Identity matrix): A diagonal matrix A = [aij]n is a unit matrix, if aij = 1 for i = j.

(x) Comparable matrices: Two matrices A and B are comparable, if they have the same order.

3. Equality of matrices: Two matrices A = [aij]m×n and B = [bij]p×q are said to be equal, if m = p, n = q and aij = bij ∀ i and j.

4. Multiplication of a matrix by a scalar: Let A = [aij]m×n and let λ be a scalar. Then λA = [bij]m×n, where bij = λaij ∀ i and j.

5. Addition of matrices: Let A = [aij]m×n and B = [bij]m×n be two matrices, then A+B = [aij]m×n+ [bij]m×n = [cij]m×n where cij = aij+bij ∀ i and j.

6. Subtraction of matrices: A-B = A+(-B), where -B = (-1)B.
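
For a quick illustration of items 4-6, take arbitrarily chosen matrices A = \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} and B = \begin{bmatrix} 5 & 6\\ 7 & 8 \end{bmatrix}. Then A+B = \begin{bmatrix} 6 & 8\\ 10 & 12 \end{bmatrix}, 2A = \begin{bmatrix} 2 & 4\\ 6 & 8 \end{bmatrix} and A-B = \begin{bmatrix} -4 & -4\\ -4 & -4 \end{bmatrix}.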

7. Properties of addition and scalar multiplication:

(i) λ(A+B) = λA+λB

(ii) λA = Aλ

(iii) (λ1+λ2)A = λ1A+λ2A

8. Multiplication of matrices: Let A = [aij]m×p and B = [bij]p×n. Then AB = [cij]m×n, where cij = \sum_{k=1}^{p}a_{ik}\: b_{kj} .
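
For example, with the same arbitrarily chosen A = \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} and B = \begin{bmatrix} 5 & 6\\ 7 & 8 \end{bmatrix} as above, AB = \begin{bmatrix} 1(5)+2(7) & 1(6)+2(8)\\ 3(5)+4(7) & 3(6)+4(8) \end{bmatrix} = \begin{bmatrix} 19 & 22\\ 43 & 50 \end{bmatrix}, while BA = \begin{bmatrix} 23 & 34\\ 31 & 46 \end{bmatrix}, which also illustrates property 9(i) below.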

9. Properties of matrix multiplication:

(i) Matrix multiplication is not commutative in general, i.e., AB ≠ BA in general.

(ii) (AB)C = A(BC)

(iii) AIn = A = InA

(iv) For every non-singular square matrix A (i.e., | A | ≠ 0) there exists a unique matrix B such that AB = In = BA. In this case, A and B are said to be multiplicative inverses of one another, i.e., B = A-1 or A = B-1.

10. Transpose of a Matrix.

Let A = [aij]m×n. Then A’ (or AT), the transpose of A, is defined as A’ = [bij]n×m, where bij = aji ∀ i and j.

(i) (A’)’ = A

(ii) (λA)’ = λA’

(iii) (A+B)’ = A’+B’

(iv) (A-B)’ = A’-B’

(v) (AB)’ = B’A’

(vi) For a square matrix A, if A’ = A , then A is said to be a symmetric matrix.

(vii) For a square matrix A, if A’ = -A , then A is said to be a skew symmetric matrix.
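
For instance, if A = \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} (entries chosen arbitrarily), then A’ = \begin{bmatrix} 1 & 3\\ 2 & 4 \end{bmatrix}. The matrix \begin{bmatrix} 1 & 2\\ 2 & 3 \end{bmatrix} is symmetric, and \begin{bmatrix} 0 & 2\\ -2 & 0 \end{bmatrix} is skew symmetric (the diagonal elements of a skew symmetric matrix are always 0).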

11. Submatrix of a matrix:

Let A be a given matrix. The matrix obtained by deleting some rows and columns of A is called a submatrix of A.

12. Properties of determinant:

(i) | A | = | A’ | for any square matrix A.

(ii) If two rows or two columns are identical, then | A | = 0.

(iii) \left | \lambda A \right | = \lambda ^{n}\left | A \right |, where A = [aij]n.

(iv) If A and B are two square matrices of the same order, then | AB |= | A || B |

13. Singular and Non-singular matrix:

A square matrix A is said to be singular, if | A | = 0.

A square matrix A is said to be non-singular, if | A | ≠ 0.

14. Cofactor and adjoint matrix.

Let A = [aij]n be a square matrix. The matrix obtained by replacing each element of A by the corresponding cofactor is called the cofactor matrix of A. The transpose of the matrix of the cofactor of A is called the adjoint of A, denoted as adj A.
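
For example, for an arbitrarily chosen A = \begin{bmatrix} 2 & 3\\ 1 & 4 \end{bmatrix}, the cofactors are C11 = 4, C12 = -1, C21 = -3, C22 = 2, so the cofactor matrix is \begin{bmatrix} 4 & -1\\ -3 & 2 \end{bmatrix} and adj A = \begin{bmatrix} 4 & -3\\ -1 & 2 \end{bmatrix}. One can verify that A(adj A) = \begin{bmatrix} 5 & 0\\ 0 & 5 \end{bmatrix} = | A |I2, since | A | = 8-3 = 5 (property (i) below).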

15. Properties of adj A.

(i) A . adj A = | A |In = (adj A)A where A = [aij]n.

(ii) | adj A | = | A |n-1 , where n is the order of A.

(iii) If A is a symmetric matrix, then adj A is also symmetric.

(iv) If A is singular, then adj A is also singular.

(v) Let A be a non-singular matrix. Then \frac{adj \: A}{\left | A \right |} is the multiplicative inverse of A and is denoted by A-1.

(vi) (A-1)T = (AT)-1 for any non singular matrix.

(vii) (A-1)-1 = A, if A is non singular.

(viii) A-1 is always non singular.

(ix) (adj AT ) = (adj A)T

(x) Let k be a non-zero scalar and A be a non-singular matrix. Then (kA)-1 = \frac{1}{k}A^{-1}

(xi) | A-1 | = \frac{1}{\left | A \right |} for | A | ≠ 0.

(xii) Let A be a non-singular matrix. Then AB = AC ⇒ B = C and BA = CA ⇒ B = C.
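
Continuing the example above with A = \begin{bmatrix} 2 & 3\\ 1 & 4 \end{bmatrix} and | A | = 5: A-1 = \frac{adj \: A}{\left | A \right |} = \frac{1}{5}\begin{bmatrix} 4 & -3\\ -1 & 2 \end{bmatrix}, and indeed AA-1 = I2 and | A-1 | = \frac{1}{25}(8-3) = \frac{1}{5} = \frac{1}{\left | A \right |}, consistent with properties (v) and (xi).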

16. System of linear equations and matrices:

System of linear equations AX = B is said to be consistent if it has at least one solution.

(i) System of linear equations and matrix inverse:

(a) If A is non-singular, solution is given by X = A-1B.

(b) If A is singular, (adj A)B = 0 and no two columns of A are proportional, then the system has infinitely many solutions.

(c) If A is singular and (adj A)B ≠ 0, then the system has no solution.

(ii) Homogeneous system and matrix inverse:

If the above system is homogeneous (n equations in n unknowns), then in matrix form it is AX = 0 (since b1 = b2 = ….. = bn = 0), where A is a square matrix.

If A is non-singular, the system has only the trivial solution X = 0.

If A is singular, then the system has infinitely many solutions (including the trivial solution) and hence it has non-trivial solutions.

(iii) Elementary row transformation of Matrix:

The following operations on a matrix are called elementary row transformations.

(a) Interchanging two rows.
(b) Multiplication of all the elements of a row by a non zero scalar.

(c) Addition of a constant multiple of a row to another row.

17. Characteristic Polynomial and characteristic equation.

Let A be a square matrix, then the polynomial | A-xI | is called the characteristic polynomial of A and the equation | A-xI | = 0 is called the characteristic equation of A.

18. Cayley Hamilton theorem:

Every square matrix A satisfies its characteristic equation, i.e., if a_{0}x^{n}+a_{1}x^{n-1}+\cdots +a_{n-1}x+a_{n} = 0 is the characteristic equation of A, then a_{0}A^{n}+a_{1}A^{n-1}+\cdots +a_{n-1}A+a_{n}I = 0.
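
As a check, take the arbitrarily chosen matrix A = \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix}. Then | A-xI | = (1-x)(4-x)-6 = x^{2}-5x-2, so the characteristic equation is x^{2}-5x-2 = 0, and the Cayley Hamilton theorem gives A^{2}-5A-2I = 0. Indeed, A^{2} = \begin{bmatrix} 7 & 10\\ 15 & 22 \end{bmatrix} and A^{2}-5A = \begin{bmatrix} 2 & 0\\ 0 & 2 \end{bmatrix} = 2I.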

19. More definitions on matrices:

(i) Nilpotent matrix: A square matrix A is called nilpotent if Ap = 0 for some positive integer p. If p is the smallest such positive integer, then p is called its nilpotency.

(ii) Idempotent matrix: A square matrix A is said to be idempotent if A2 = A.

(iii) Involutory matrix: A square matrix A is said to be involutory if A2 = I.

(iv) Orthogonal matrix: A square matrix A is said to be orthogonal if ATA = I = AAT.

(v) Unitary matrix: A square matrix A is said to be unitary if A\, (\bar{A})^{T}= I, where \bar{A} is the complex conjugate of A.
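
Some standard illustrative examples (chosen for familiarity): \begin{bmatrix} 0 & 1\\ 0 & 0 \end{bmatrix} is nilpotent with nilpotency 2, \begin{bmatrix} 1 & 0\\ 0 & 0 \end{bmatrix} is idempotent, \begin{bmatrix} 1 & 0\\ 0 & -1 \end{bmatrix} is involutory, and the rotation matrix \begin{bmatrix} \cos\theta & -\sin\theta\\ \sin\theta & \cos\theta \end{bmatrix} is orthogonal.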

Determinants

We write the expression a1b2-a2b1 as \begin{vmatrix} a_{1} & b_{1}\\ a_{2} & b_{2} \end{vmatrix}, which is called a determinant of order 2.

1. Expansion of determinant: \begin{vmatrix} a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{vmatrix} is called a determinant of order three. Its value can be found by expanding along the first column as

D = a_{1}\begin{vmatrix} b_{2} & c_{2}\\ b_{3} & c_{3} \end{vmatrix} - a_{2}\begin{vmatrix} b_{1} & c_{1}\\ b_{3} & c_{3} \end{vmatrix} + a_{3}\begin{vmatrix} b_{1} & c_{1}\\ b_{2} & c_{2} \end{vmatrix}
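
For example, with arbitrarily chosen entries, \begin{vmatrix} 1 & 2 & 3\\ 4 & 5 & 6\\ 7 & 8 & 10 \end{vmatrix} = 1\begin{vmatrix} 5 & 6\\ 8 & 10 \end{vmatrix} - 4\begin{vmatrix} 2 & 3\\ 8 & 10 \end{vmatrix} + 7\begin{vmatrix} 2 & 3\\ 5 & 6 \end{vmatrix} = 1(2) - 4(-4) + 7(-3) = -3.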

2. Minors: The minor of aij is the determinant obtained by deleting the ith row and the jth column of the determinant. It is denoted by Mij.

3. Cofactor: Cofactor of element aij is Cij = (-1)i+j Mij

For a determinant of order three with elements aij, expansion along the first row gives D = a11M11 - a12M12 + a13M13 = a11C11 + a12C12 + a13C13.

4. Transpose of a Determinant: The transpose of a determinant is a determinant obtained after interchanging the rows and columns.

D = \begin{vmatrix} a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{vmatrix} and D^{T}= \begin{vmatrix} a_{1} & a_{2} & a_{3}\\ b_{1} & b_{2} & b_{3}\\ c_{1} & c_{2} & c_{3} \end{vmatrix}

5. Symmetric, Skew symmetric, Asymmetric Determinants:

(i) A determinant is symmetric if it is identical to its transpose, i.e., the ith row is identical to the ith column, so that aij = aji for all values of i and j.

(ii) A determinant is skew symmetric if it is identical to its transpose having the sign of each element inverted. I.e. aij = -aji for all values of i and j.

(iii) A determinant is asymmetric if it is neither symmetric nor skew symmetric.

6. Properties of determinants:

(i) D = D’

(ii) If a determinant has all the elements zero in any row or column, then D = 0

(iii) If any two rows or columns of a determinant be interchanged, then D’ = -D.

(iv) If a determinant has any two rows or columns identical, then D = 0.

(v) If all the elements of any row or column be multiplied by the same number k, then D’ = kD.

(vi) If each element of any row or column can be expressed as a sum of two terms, then the determinant can be expressed as the sum of two determinants, i.e., \begin{vmatrix} a_{1}+x & b_{1}+y & c_{1}+z\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{vmatrix} = \begin{vmatrix} a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{vmatrix} + \begin{vmatrix} x & y & z\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{vmatrix}

(vii) The value of a determinant is not altered by adding to the elements of any row or column a constant multiple of the corresponding elements of any other row or column.

7. Multiplication of two determinants: \begin{vmatrix} a_{1} & b_{1}\\ a_{2} & b_{2} \end{vmatrix} \times \begin{vmatrix} l_{1} & m_{1}\\ l_{2} & m_{2} \end{vmatrix} = \begin{vmatrix} a_{1}l_{1}+b_{1}l_{2} & a_{1}m_{1}+b_{1}m_{2}\\ a_{2}l_{1}+b_{2}l_{2} & a_{2}m_{1}+b_{2}m_{2} \end{vmatrix}
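
For instance, taking arbitrarily chosen determinants \begin{vmatrix} 1 & 2\\ 3 & 4 \end{vmatrix} = -2 and \begin{vmatrix} 5 & 6\\ 7 & 8 \end{vmatrix} = -2, the rule gives \begin{vmatrix} 1(5)+2(7) & 1(6)+2(8)\\ 3(5)+4(7) & 3(6)+4(8) \end{vmatrix} = \begin{vmatrix} 19 & 22\\ 43 & 50 \end{vmatrix} = 950-946 = 4 = (-2)(-2), as expected.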

8. Summation of determinants:

Let \Delta (r) = \begin{vmatrix} f(r) & g(r) & h(r)\\ a_{1} & a_{2} & a_{3}\\ b_{1} & b_{2} & b_{3} \end{vmatrix}. Then \sum_{r=1}^{n}\Delta (r) = \begin{vmatrix} \sum_{r=1}^{n}f(r) & \sum_{r=1}^{n}g(r) & \sum_{r=1}^{n}h(r)\\ a_{1} & a_{2} & a_{3}\\ b_{1} & b_{2} & b_{3} \end{vmatrix},

where a1, a2, a3, b1, b2, b3 are constants independent of r.

9. Integration of a determinant:

Let \Delta (x) = \begin{vmatrix} f(x) & g(x) & h(x)\\ a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2} \end{vmatrix},

then \int_{a}^{b}\Delta (x) \: dx = \begin{vmatrix} \int_{a}^{b}f(x)\, dx & \int_{a}^{b}g(x)\, dx & \int_{a}^{b}h(x)\, dx\\ a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2} \end{vmatrix}, where a1, b1, c1, a2, b2, c2 are constants independent of x.

10. Differentiation of determinant:

Let \Delta (x) = \begin{vmatrix} f_{1}(x) & f_{2}(x) & f_{3}(x)\\ g_{1}(x) & g_{2}(x) & g_{3}(x)\\ h_{1}(x) & h_{2}(x) & h_{3}(x) \end{vmatrix},

then \Delta '(x) = \begin{vmatrix} f_{1}'(x) & f_{2}'(x) & f_{3}'(x)\\ g_{1}(x) & g_{2}(x) & g_{3}(x)\\ h_{1}(x) & h_{2}(x) & h_{3}(x) \end{vmatrix} + \begin{vmatrix} f_{1}(x) & f_{2}(x) & f_{3}(x)\\ g_{1}'(x) & g_{2}'(x) & g_{3}'(x)\\ h_{1}(x) & h_{2}(x) & h_{3}(x) \end{vmatrix} + \begin{vmatrix} f_{1}(x) & f_{2}(x) & f_{3}(x)\\ g_{1}(x) & g_{2}(x) & g_{3}(x)\\ h_{1}'(x) & h_{2}'(x) & h_{3}'(x) \end{vmatrix}
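
As a quick two-row check (with arbitrarily chosen functions), let \Delta (x) = \begin{vmatrix} x^{2} & x^{3}\\ 1 & 2 \end{vmatrix} = 2x^{2}-x^{3}. Differentiating one row at a time gives \Delta '(x) = \begin{vmatrix} 2x & 3x^{2}\\ 1 & 2 \end{vmatrix} + \begin{vmatrix} x^{2} & x^{3}\\ 0 & 0 \end{vmatrix} = 4x-3x^{2}, which matches differentiating 2x^{2}-x^{3} directly.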

11. Cramer’s rule:

(i) Two variables:

(a) Consistent equations: Definite and unique solution. [Intersecting lines]

(b) Inconsistent equation: No solution. [Parallel line]

(c) Dependent equation: Infinite solutions. [Identical lines]

Let a1x+b1y+c1 = 0 and a2x+b2y+c2 = 0. Then \frac{a_{1}}{a_{2}} = \frac{b_{1}}{b_{2}}\neq \frac{c_{1}}{c_{2}} ⇒ the given equations are inconsistent.

\frac{a_{1}}{a_{2}} = \frac{b_{1}}{b_{2}} = \frac{c_{1}}{c_{2}} ⇒ the given equations are dependent.

(ii) Three variables:

Let a1x+b1y+c1z = d1, a2x+b2y+c2z = d2 and a3x+b3y+c3z = d3. Then x = D1/D, y = D2/D and z = D3/D, where D = \begin{vmatrix} a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{vmatrix} ; D_{1}= \begin{vmatrix} d_{1} & b_{1} & c_{1}\\ d_{2} & b_{2} & c_{2}\\ d_{3} & b_{3} & c_{3} \end{vmatrix} ; D_{2}= \begin{vmatrix} a_{1} & d_{1} & c_{1}\\ a_{2} & d_{2} & c_{2}\\ a_{3} & d_{3} & c_{3} \end{vmatrix} ; D_{3}= \begin{vmatrix} a_{1} & b_{1} & d_{1}\\ a_{2} & b_{2} & d_{2}\\ a_{3} & b_{3} & d_{3} \end{vmatrix}. (A worked example is given after item (iv) below.)

(iii) Consistency of a system of equations:

(a) If D ≠ 0 and at least one of D1, D2, D3 ≠ 0, then the given system of equations is consistent and has a unique non-trivial solution.

(b) If D ≠ 0 and D1 = D2 = D3 = 0, then the given system of equations is consistent and has the trivial solution only.

(c) If D = D1 = D2 = D3 = 0, then the given system of equations has either infinitely many solutions or no solution.

(d) If D = 0 but at least one of D1, D2, D3 is not zero, then the equations are inconsistent and have no solution.

(e) If a given system of linear equations has only the zero solution for all its variables, then the given equations are said to have only a trivial solution.

(iv) Three equations in two variables:

If x and y are not both zero, then the condition for a1x+b1y+c1 = 0, a2x+b2y+c2 = 0 and a3x+b3y+c3 = 0 to be consistent in x and y is \begin{vmatrix} a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{vmatrix} = 0.
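
Worked example for item (ii) above, with an arbitrarily chosen system: consider x+y+z = 6, x-y+z = 2 and 2x+y-z = 1. Here D = \begin{vmatrix} 1 & 1 & 1\\ 1 & -1 & 1\\ 2 & 1 & -1 \end{vmatrix} = 6, D_{1} = \begin{vmatrix} 6 & 1 & 1\\ 2 & -1 & 1\\ 1 & 1 & -1 \end{vmatrix} = 6, D_{2} = \begin{vmatrix} 1 & 6 & 1\\ 1 & 2 & 1\\ 2 & 1 & -1 \end{vmatrix} = 12 and D_{3} = \begin{vmatrix} 1 & 1 & 6\\ 1 & -1 & 2\\ 2 & 1 & 1 \end{vmatrix} = 18, so x = D1/D = 1, y = D2/D = 2 and z = D3/D = 3, which indeed satisfies all three equations.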

12. Applications of Determinants:

(i) Area of a triangle whose vertices are (xr, yr), r = 1, 2, 3, is D = \frac{1}{2} \begin{vmatrix} x_{1} & y_{1} & 1\\ x_{2} & y_{2} & 1\\ x_{3} & y_{3} & 1 \end{vmatrix}. If D = 0, then the three points are collinear.
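
For instance, with arbitrarily chosen vertices (0, 0), (4, 0) and (0, 3): D = \frac{1}{2}\begin{vmatrix} 0 & 0 & 1\\ 4 & 0 & 1\\ 0 & 3 & 1 \end{vmatrix} = \frac{1}{2}(12) = 6, which matches the area \frac{1}{2}\times 4\times 3 of this right triangle. For the points (0, 0), (1, 1) and (2, 2), the determinant is 0, so they are collinear.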