Question

Let f(x) be a function such that f(a1) = 0, f(a2) = 1, f(a3) = -2, f(a4) = 3 and f(a5) = 0, where ai ∈ R and ai < aj for all i < j. Let g(x) be the function defined as g(x) = [f′(x)]² + f(x)f′′(x) on [a1, a5]. If f(x) is thrice differentiable, then g(x) = 0 has at least

A
4 real roots
B
5 real roots
C
6 real roots
D
7 real roots
7 real roots
Solution

The correct option is C: 6 real roots.
g(x) = [f′(x)]² + f(x)f′′(x) = d/dx [f(x)f′(x)], by the product rule.

Since f(a1) = 0, f(a2) = 1, f(a3) = -2, f(a4) = 3 and f(a5) = 0, f(x) changes sign on (a2, a3) and on (a3, a4). By the intermediate value theorem, together with the roots at a1 and a5, f(x) = 0 has at least 4 roots in [a1, a5].


Let these roots be a1, m, n, a5, where a2 < m < a3 and a3 < n < a4.

By Rolle's theorem, f′(x) = 0 has at least three roots, say b1, b2, b3, where a1 < b1 < m, m < b2 < n and n < b3 < a5.

Hence f(x)f′(x) = 0 has at least 4 + 3 = 7 roots in [a1, a5].
Applying Rolle's theorem again between consecutive roots of f(x)f′(x), d/dx [f(x)f′(x)] = 0 has at least 6 roots.
Therefore g(x) = 0 has at least 6 real roots.
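
As a sanity check, here is a minimal numerical sketch (not part of the original solution). It assumes the concrete instance a1, ..., a5 = 0, 1, 2, 3, 4 and takes f to be the degree-4 polynomial interpolating the values given in the question; it then counts the real roots of g(x) = d/dx [f(x)f′(x)] in [a1, a5].

import numpy as np

# Assumed instance (not from the problem): a_i = 0, 1, 2, 3, 4 and
# f = the degree-4 polynomial through (0, 0), (1, 1), (2, -2), (3, 3), (4, 0).
a = [0.0, 1.0, 2.0, 3.0, 4.0]
vals = [0.0, 1.0, -2.0, 3.0, 0.0]

f = np.polyfit(a, vals, deg=4)       # coefficients of f, highest power first
fp = np.polyder(f)                   # f'
g = np.polyder(np.polymul(f, fp))    # g = d/dx [f(x) f'(x)] = (f')^2 + f f''

roots = np.roots(g)
real_roots = roots[np.abs(roots.imag) < 1e-9].real
in_interval = sorted(r for r in real_roots if a[0] <= r <= a[-1])
print(len(in_interval), in_interval)  # prints 6 roots in [a1, a5], matching the bound

For this choice of f, the product f(x)f′(x) is a degree-7 polynomial, so g is a degree-6 polynomial and the lower bound of 6 real roots is attained exactly.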
