If the functions f(x) and g(x) are continuous on [a, b] and differentiable on (a, b), then in the interval (a, b), the equation
$$\begin{vmatrix} f'(x) & f(a) \\ g'(x) & g(a) \end{vmatrix} = \frac{1}{a-b}\begin{vmatrix} f(a) & f(b) \\ g(a) & g(b) \end{vmatrix}$$
A
has at least one root
B
has exactly one root
C
has at most one root
D
has no root
Solution
The correct option is A: has at least one root.

Expanding both determinants,
$$\begin{vmatrix} f'(x) & f(a) \\ g'(x) & g(a) \end{vmatrix} = \frac{1}{a-b}\begin{vmatrix} f(a) & f(b) \\ g(a) & g(b) \end{vmatrix} \;\Rightarrow\; f'(x)\,g(a) - f(a)\,g'(x) = \frac{f(a)\,g(b) - g(a)\,f(b)}{a-b}$$

So, by Lagrange's mean value theorem, this equation has at least one root x ∈ (a, b).
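To make the LMVT step explicit, here is a sketch of the argument, assuming the auxiliary function $h(x) = f(x)\,g(a) - f(a)\,g(x)$ (this function is introduced here for illustration and is not named in the original solution):

% Sketch: h is an auxiliary function assumed for this argument.
Since $f$ and $g$ are continuous on $[a,b]$ and differentiable on $(a,b)$, so is $h$, with
$$h(a) = f(a)g(a) - f(a)g(a) = 0, \qquad h(b) = f(b)g(a) - f(a)g(b), \qquad h'(x) = f'(x)\,g(a) - f(a)\,g'(x).$$
By Lagrange's mean value theorem there exists $c \in (a,b)$ such that
$$h'(c) = \frac{h(b) - h(a)}{b - a} = \frac{f(b)g(a) - f(a)g(b)}{b - a} = \frac{f(a)g(b) - g(a)f(b)}{a - b},$$
which is exactly the given equation at $x = c$. Hence the equation has at least one root in $(a, b)$, and nothing stronger (exactness or uniqueness) follows from LMVT alone.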