Eigenvalues are associated with eigenvectors in linear algebra. Both terms are used in the analysis of linear transformations. Eigenvalues are the special set of scalar values associated with a set of linear equations, most often in matrix equations. Eigenvalues are also termed characteristic roots. An eigenvector is a non-zero vector that is changed at most by a scalar factor when a linear transformation is applied, and the corresponding factor that scales the eigenvector is called an eigenvalue.
Eigenvalues are the special set of scalars associated with a system of linear equations, most often in matrix equations. ‘Eigen’ is a German word meaning ‘proper’ or ‘characteristic’. The term eigenvalue is therefore also rendered as characteristic value, characteristic root, proper value or latent root. In simple words, an eigenvalue is a scalar that is used to transform an eigenvector. The basic equation is
AX = λX
The number or scalar value “λ” is an eigenvalue of A.
In mathematics, an eigenvector corresponding to a real non-zero eigenvalue points in the direction stretched by the transformation, whereas the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction of the transformation is reversed.
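The basic equation AX = λX can be checked numerically. Below is a minimal sketch, assuming NumPy is available; the matrix A and vector X are hypothetical values chosen so that the eigenvalue along one axis is negative, illustrating the direction reversal described above.

```python
import numpy as np

# Illustrative 2x2 matrix (values assumed for this sketch, not from the text).
# Along the y-axis this matrix acts with eigenvalue λ = -3.
A = np.array([[2.0, 0.0],
              [0.0, -3.0]])

x = np.array([0.0, 1.0])   # an eigenvector of A
Ax = A @ x                 # AX = λX with λ = -3

# λ is negative, so the transformed vector points in the opposite direction,
# scaled by |λ| = 3, but still lies on the same line through the origin.
assert np.allclose(Ax, -3.0 * x)
```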
What are Eigenvectors?
Eigenvectors are non-zero vectors that do not change direction when a linear transformation is applied; they change only by a scalar factor. In brief, if A is a linear transformation on a vector space V and X is a non-zero vector in V, then X is an eigenvector of A if A(X) is a scalar multiple of X.
The eigenspace of an eigenvalue consists of the set of all eigenvectors with that eigenvalue, together with the zero vector. Note, however, that the zero vector itself is not an eigenvector.
Let us say A is an “n × n” matrix and λ is an eigenvalue of matrix A; then X, a non-zero vector, is called an eigenvector if it satisfies the expression below:
AX = λX
X is an eigenvector of A corresponding to eigenvalue, λ.
- There can be infinitely many eigenvectors corresponding to one eigenvalue.
- Eigenvectors corresponding to distinct eigenvalues are linearly independent.
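The two points above can be illustrated numerically. The sketch below, assuming NumPy and an illustrative matrix not taken from the text, verifies AX = λX for one eigenvector and then checks that every non-zero scalar multiple of it is also an eigenvector for the same λ.

```python
import numpy as np

# Illustrative matrix with eigenvalue λ = 5 and eigenvector X = [1, 1]
# (the other eigenvalue is 2; values assumed for this sketch).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0
x = np.array([1.0, 1.0])

assert np.allclose(A @ x, lam * x)            # AX = λX holds

# Any non-zero scalar multiple cX is also an eigenvector for the same λ,
# so one eigenvalue has infinitely many eigenvectors.
for c in (2.0, -7.5, 0.1):
    assert np.allclose(A @ (c * x), lam * (c * x))
```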
Eigenvalues of a Square Matrix
Suppose A is an n × n square matrix and λ is a scalar indeterminate; then [A − λI] is called the eigen or characteristic matrix, where “I” is the identity matrix. Its determinant is written |A − λI|, and |A − λI| = 0 is the eigen equation or characteristic equation. The roots of this equation are the eigenvalues, also called eigen roots.
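For a 2 × 2 matrix, the characteristic equation |A − λI| = 0 expands to λ² − trace(A)·λ + det(A) = 0, so its roots can be computed directly and compared against a library eigenvalue routine. A minimal sketch, assuming NumPy and an illustrative matrix:

```python
import numpy as np

# Illustrative 2x2 matrix (values assumed for this sketch).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic polynomial of a 2x2 matrix: λ² - trace(A)·λ + det(A) = 0
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]   # here: λ² - 7λ + 10
roots = np.roots(coeffs)                         # eigen roots of |A - λI| = 0

# The roots of the characteristic equation are exactly the eigenvalues of A.
assert np.allclose(np.sort(roots), np.sort(np.linalg.eigvals(A)))
```

Here the eigen roots come out as 2 and 5, matching `np.linalg.eigvals`.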
Properties of Eigenvalues
- Eigenvectors with Distinct Eigenvalues are Linearly Independent
- Singular Matrices have Zero Eigenvalues
- If A is an invertible (non-singular) square matrix, then λ = 0 is not an eigenvalue of A
- For a scalar multiple of a matrix: If A is a square matrix, λ is an eigenvalue of A and a is a scalar, then aλ is an eigenvalue of aA.
- For matrix powers: If A is a square matrix, λ is an eigenvalue of A and n ≥ 0 is an integer, then λⁿ is an eigenvalue of Aⁿ.
- For a polynomial of a matrix: If A is a square matrix, λ is an eigenvalue of A and p(x) is a polynomial in the variable x, then p(λ) is an eigenvalue of p(A).
- Inverse matrix: If A is an invertible square matrix and λ is an eigenvalue of A, then λ⁻¹ is an eigenvalue of A⁻¹.
- Transpose matrix: If A is a square matrix and λ is an eigenvalue of A, then λ is an eigenvalue of Aᵀ.
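Several of these properties can be verified numerically. The sketch below, assuming NumPy and the same kind of illustrative 2 × 2 matrix, checks the scalar-multiple, power, inverse and transpose properties by comparing sorted eigenvalue lists.

```python
import numpy as np

# Illustrative invertible matrix with eigenvalues 2 and 5 (values assumed).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eig = np.sort(np.linalg.eigvals(A))   # [2., 5.]

# Scalar multiple: eigenvalues of aA are aλ.
a = 3.0
assert np.allclose(np.sort(np.linalg.eigvals(a * A)), a * eig)

# Powers: eigenvalues of A³ are λ³.
A3 = np.linalg.matrix_power(A, 3)
assert np.allclose(np.sort(np.linalg.eigvals(A3)), eig ** 3)

# Inverse: eigenvalues of A⁻¹ are 1/λ (A must be invertible).
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1.0 / eig))

# Transpose: Aᵀ has the same eigenvalues as A.
assert np.allclose(np.sort(np.linalg.eigvals(A.T)), eig)
```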