If the functions f(x) and g(x) are continuous on [a, b] and differentiable in (a, b), then the equation
$$\begin{vmatrix} f(a) & f(b) \\ g(a) & g(b) \end{vmatrix} = (b-a)\begin{vmatrix} f(a) & f'(x) \\ g(a) & g'(x) \end{vmatrix}$$
has, in the interval [a, b],
A
At least one root
B
Exactly one root
C
At most one root
D
No root
Solution
The correct option is A: At least one root
Apply Lagrange's mean value theorem to the auxiliary function
$$h(x) = f(a)\,g(x) - g(a)\,f(x),$$
which is continuous on [a, b] and differentiable in (a, b) because f and g are. Then there exists at least one point c ∈ (a, b) such that
$$h(b) - h(a) = (b-a)\,h'(c).$$
Since h(a) = f(a)g(a) − g(a)f(a) = 0, h(b) = f(a)g(b) − g(a)f(b), and h'(x) = f(a)g'(x) − g(a)f'(x), this equality is precisely
$$\begin{vmatrix} f(a) & f(b) \\ g(a) & g(b) \end{vmatrix} = (b-a)\begin{vmatrix} f(a) & f'(c) \\ g(a) & g'(c) \end{vmatrix}.$$
Hence the given equation has at least one root x = c in (a, b), and therefore in [a, b].
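For concreteness, here is a quick worked instance, assuming f(x) = x² and g(x) = x on [a, b] = [1, 2] (an illustrative choice, not part of the original problem):
$$\begin{vmatrix} f(1) & f(2) \\ g(1) & g(2) \end{vmatrix} = \begin{vmatrix} 1 & 4 \\ 1 & 2 \end{vmatrix} = -2, \qquad (2-1)\begin{vmatrix} f(1) & f'(x) \\ g(1) & g'(x) \end{vmatrix} = \begin{vmatrix} 1 & 2x \\ 1 & 1 \end{vmatrix} = 1 - 2x.$$
Setting −2 = 1 − 2x gives x = 3/2, which indeed lies in (1, 2); in general only the existence of at least one such root is guaranteed, not its uniqueness.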