Question

If A and B are different matrices satisfying $A^3 = B^3$ and $A^2B = B^2A$, then

A
$\det(A^2 + B^2)$ must be zero
B
$\det(A - B)$ must be zero
C
Both $\det(A^2 + B^2)$ and $\det(A - B)$ must be zero
D
At least one of $\det(A^2 + B^2)$ or $\det(A - B)$ must be zero
Solution

The correct option is D: at least one of $\det(A^2 + B^2)$ or $\det(A - B)$ must be zero.

Given $A^3 = B^3$ … (i) and $A^2B = B^2A$ … (ii).

Subtracting (ii) from (i):
$A^3 - A^2B = B^3 - B^2A$
$A^2(A - B) + B^2(A - B) = O$
$(A^2 + B^2)(A - B) = O$

Taking determinants and using the fact that the determinant is multiplicative, $\det(XY) = \det X \cdot \det Y$:
$\det(A^2 + B^2) \cdot \det(A - B) = 0$

Hence $\det(A^2 + B^2) = 0$ or $\det(A - B) = 0$, i.e. at least one of the two determinants must vanish.
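As a quick sanity check of the result, here is a minimal NumPy sketch using a hypothetical example pair (not from the original solution): a nilpotent matrix A and the zero matrix B satisfy both given conditions while being different, and the product of the two determinants indeed vanishes.

```python
import numpy as np

# Hypothetical example pair (assumed, not from the source): A is nilpotent,
# B is the zero matrix, so A != B while A^3 = B^3 and A^2 B = B^2 A both hold.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.zeros((2, 2))

assert not np.array_equal(A, B)            # A and B are different matrices
assert np.allclose(A @ A @ A, B @ B @ B)   # condition (i): A^3 = B^3
assert np.allclose(A @ A @ B, B @ B @ A)   # condition (ii): A^2 B = B^2 A

# Since (A^2 + B^2)(A - B) = O, the product of the determinants is zero,
# so at least one of the two factors must vanish (option D).
product = np.linalg.det(A @ A + B @ B) * np.linalg.det(A - B)
print(product)  # 0.0
```

For this particular pair both determinants happen to be zero; the derivation only guarantees that their product is zero, which is why option D (at least one) is correct rather than option C (both).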
