Given the function $f(x)=e^{x}+\ln(x+1)-ax$, where $a\in\mathbb{R}$. If $f'(x)=0$ has two distinct roots $x_1,x_2$ ($x_1<x_2$), then
A
$f(x_2)-f(x_1)<0$
B
$f(x_2)-f(x_1)>0$
C
$a>2$
D
$a<2$
Solution
The correct options are A ($f(x_2)-f(x_1)<0$) and C ($a>2$).

$f(x)=e^{x}+\ln(x+1)-ax$. Clearly, the domain is $x>-1$.

$f'(x)=e^{x}+\dfrac{1}{x+1}-a$

As $x\to-1^{+}$ we have $\dfrac{1}{x+1}\to+\infty$, and as $x\to+\infty$ we have $e^{x}\to+\infty$, so $f'(x)\to+\infty$ at both ends of the domain. In particular, $f'$ takes positive values near $x=-1$ and for large $x$.
$f''(x)=e^{x}-\dfrac{1}{(x+1)^{2}}$

Critical point of $f'$: $f''(x)=0$. Since $f''(0)=e^{0}-\dfrac{1}{(0+1)^{2}}=0$, the critical point is $x=0$.

$f'''(x)=e^{x}+\dfrac{2}{(x+1)^{3}}>0$ for all $x>-1$, so $f''$ is strictly increasing. This means $f''(x)=0$ only when $x=0$, and $f''$ changes sign from negative to positive there. So $f'$ attains its minimum at $x=0$.
If $f'(0)\ge 0$, then the minimum value of $f'$ is non-negative and $f'(x)=0$ cannot have two distinct roots $x_1,x_2$. So we need $f'(0)<0$:

$e^{0}+\dfrac{1}{0+1}-a<0 \;\Rightarrow\; 1+1<a \;\Rightarrow\; a>2$
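As a quick symbolic sanity check (a sketch, assuming SymPy is available; the symbols `x` and `a` are introduced only for this check), one can recompute the derivatives and confirm that $f''(0)=0$ and $f'(0)=2-a$:

```python
# Symbolic check of the derivatives used above (sketch, assuming SymPy).
import sympy as sp

x, a = sp.symbols('x a', real=True)
f = sp.exp(x) + sp.log(x + 1) - a*x      # f(x) = e^x + ln(x+1) - ax

f1 = sp.diff(f, x)       # f'(x)   = e^x + 1/(x+1) - a
f2 = sp.diff(f, x, 2)    # f''(x)  = e^x - 1/(x+1)^2
f3 = sp.diff(f, x, 3)    # f'''(x) = e^x + 2/(x+1)^3

print(f2.subs(x, 0))     # 0      -> x = 0 is the critical point of f'
print(f1.subs(x, 0))     # 2 - a  -> f'(0) < 0 exactly when a > 2
```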
Since $f'(0)<0$ while $f'(x)>0$ as $x\to-1^{+}$ and as $x\to+\infty$, the two roots satisfy $x_1<0<x_2$. On $(x_1,x_2)$ we have $f'(x)<0$, so $f$ is strictly decreasing there, giving $f(x_1)>f(0)>f(x_2)$, which means $f(x_2)-f(x_1)<0$. Thus, $2\ln a>2\ln 2>0>f(x_2)-f(x_1)$.
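For a concrete value of $a>2$, the conclusion can also be checked numerically. The sketch below (assuming SciPy is available; $a=3$ and the bracketing intervals are arbitrary illustrative choices) locates $x_1$ and $x_2$ by root bracketing and confirms that $x_1<0<x_2$ and $f(x_2)-f(x_1)<0$:

```python
# Numerical check for one admissible parameter value (a = 3 > 2 is arbitrary).
import numpy as np
from scipy.optimize import brentq

a = 3.0
f  = lambda x: np.exp(x) + np.log(x + 1) - a*x       # f(x)
fp = lambda x: np.exp(x) + 1.0/(x + 1) - a           # f'(x)

# f' -> +inf near x = -1+ and as x -> +inf, while f'(0) = 2 - a < 0,
# so one root of f' lies in (-1, 0) and the other in (0, +inf).
x1 = brentq(fp, -1 + 1e-9, 0.0)
x2 = brentq(fp, 0.0, 10.0)

print(x1, x2)            # x1 < 0 < x2
print(f(x2) - f(x1))     # negative, consistent with option A
```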