Question

If α and β are the roots of the equation x² + ax + b = 0, then (1/α²) + (1/β²) =


A

(a² − 2b)/b²

B

(b² − 2a)/b²

C

(a² + 2b)/b²

D

(b² + 2a)/b²

Solution

The correct option is A: (a² − 2b)/b²


Find the value of (1/α²) + (1/β²):

Given that α and β are the roots of the equation x² + ax + b = 0, the relations between roots and coefficients give:

α + β = −a and α·β = b

Now,

1/α² + 1/β² = (β² + α²)/(α·β)²
            = (β² + α² + 2α·β − 2α·β)/(α·β)²
            = ((β + α)² − 2α·β)/(α·β)²
            = ((−a)² − 2b)/b²
            = (a² − 2b)/b²

Hence, the correct option is A.
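As a sanity check, the identity can be verified numerically: pick sample coefficients a and b (the values below are arbitrary, chosen so the roots are real and nonzero), compute the roots with the quadratic formula, and compare both sides. This is a minimal sketch, not part of the original solution.

```python
import math

# Sample coefficients for x² + ax + b = 0 (assumed values for illustration)
a, b = 5.0, 3.0

# Roots via the quadratic formula
d = math.sqrt(a * a - 4 * b)
alpha = (-a + d) / 2
beta = (-a - d) / 2

# Left side: computed directly from the roots
lhs = 1 / alpha**2 + 1 / beta**2

# Right side: the closed form derived above
rhs = (a**2 - 2 * b) / b**2

print(abs(lhs - rhs) < 1e-9)  # True: both sides agree
```

Any other choice of a and b with a² > 4b and b ≠ 0 works the same way, since the derivation uses only α + β = −a and α·β = b.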

