Question

Let $f(x) = x^2 + ax + b$, where $a, b \in \mathbb{R}$. If $f(x) = 0$ has all its roots imaginary, then the roots of $f(x) + f'(x) + f''(x) = 0$ are

A. Real and distinct
B. Imaginary
C. Equal
D. Rational and equal
Solution

The correct option is B: Imaginary.
Given, $f(x) = x^2 + ax + b$ has imaginary roots, so its discriminant is negative:
$D < 0 \implies a^2 - 4b < 0$

Now, $f'(x) = 2x + a$ and $f''(x) = 2$.

Also, $f(x) + f'(x) + f''(x) = 0 \quad \ldots (i)$

$\Rightarrow x^2 + ax + b + 2x + a + 2 = 0$

$\Rightarrow x^2 + (a+2)x + (a+b+2) = 0$

By the quadratic formula,

$x = \dfrac{-(a+2) \pm \sqrt{(a+2)^2 - 4(a+b+2)}}{2} = \dfrac{-(a+2) \pm \sqrt{a^2 - 4b - 4}}{2}$

Since $a^2 - 4b < 0$, it follows that $a^2 - 4b - 4 < 0$, so the expression under the square root is negative.

Hence, equation (i) has imaginary roots.
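
As a quick sanity check (a sketch, not part of the original solution), the algebra can be verified symbolically, for example with Python's sympy: the discriminant of $f + f' + f''$ should simplify to $a^2 - 4b - 4$.

```python
import sympy as sp

a, b, x = sp.symbols('a b x', real=True)
f = x**2 + a*x + b

# Build g(x) = f(x) + f'(x) + f''(x)
g = sp.expand(f + sp.diff(f, x) + sp.diff(f, x, 2))
print(g)  # x**2 + x*(a + 2) + a + b + 2

# Discriminant of g in x; expect a**2 - 4*b - 4,
# which is negative whenever a**2 - 4*b < 0.
print(sp.expand(sp.discriminant(g, x)))
```

A negative discriminant here confirms that equation (i) has imaginary roots whenever $f$ does, matching option B.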
