Question

If one root of the equation x² + ax + b = 0 is also a root of the equation x² + cx + d = 0, prove that the other root of the first equation satisfies the equation x² + x(2a − c) + (a² − ac + d) = 0.

Solution

Both equations are monic, so subtracting one from the other eliminates x² and yields the common root:
x(a − c) + (b − d) = 0
⇒ α = −(b − d)/(a − c)

If the other root of the first equation is β, then
α + β = −a, i.e. β = −a − α
so β = −a + (b − d)/(a − c) = (−a² + ac + b − d)/(a − c),
which rearranges to
β(a − c) + a² − ac − b + d = 0 ..........(1)

Since β is a root of the first equation,
β² + aβ + b = 0 ..........(2)

Adding (1) and (2):
β² + β(2a − c) + (a² − ac + d) = 0
Hence β satisfies x² + x(2a − c) + (a² − ac + d) = 0, which proves the result.
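The derivation can be sanity-checked numerically: pick a shared root and the two "other" roots freely, build both monic quadratics from them, and confirm the other root of the first equation satisfies the derived quadratic. A minimal sketch (the variable names alpha, beta, delta are illustrative, not from the source):

```python
# Choose a common root alpha and the remaining roots beta, delta arbitrarily.
alpha, beta, delta = 2.0, -3.0, 5.0   # alpha is the shared root

# x^2 + a x + b = 0 has roots alpha, beta  ->  a = -(alpha + beta), b = alpha * beta
a, b = -(alpha + beta), alpha * beta
# x^2 + c x + d = 0 has roots alpha, delta
c, d = -(alpha + delta), alpha * delta

# Per the proof, beta should satisfy x^2 + (2a - c) x + (a^2 - a c + d) = 0.
residual = beta**2 + (2*a - c)*beta + (a*a - a*c + d)
print(abs(residual) < 1e-9)  # True
```

Repeating the check with other root choices gives the same zero residual, consistent with the algebraic proof above.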
