Question

If g(x) = f(x) + f(1 − x) and f″(x) < 0 for 0 ≤ x ≤ 1, then find the intervals of monotonicity of g(x).

A
x ∈ [1/2, 1]
B
x ∈ [1/2, 1]
C
x ∈ [1/3, 1]
D
x ∈ [1/3, 1]
Solution

The correct option is A: x ∈ [1/2, 1]
g′(x) = f′(x) − f′(1 − x). Now f″(x) < 0 when 0 ≤ x ≤ 1.
Hence f′(x) is a decreasing function on 0 ≤ x ≤ 1.
Now g(x) is increasing or decreasing according as g′(x) = f′(x) − f′(1 − x) > 0 or < 0, i.e. f′(x) > f′(1 − x) for g(x) to be increasing ... (1)

Since f′ is a decreasing function, a value at a larger argument is smaller than one at a smaller argument, i.e. f′(a) < f′(b) when a > b. So f′(x) < f′(1 − x) for g(x) to be decreasing ... (2)
Hence (1) gives 1 − x > x, i.e. 1 > 2x, i.e. x < 1/2, so x ∈ [0, 1/2] for g(x) to be increasing; and (2) gives x > 1 − x, i.e. 2x > 1, i.e. x > 1/2, so x ∈ [1/2, 1] for g(x) to be decreasing.
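
As a quick sanity check (an illustrative example, not part of the original solution), take f(x) = −x², a concrete choice satisfying f″(x) = −2 < 0 on 0 ≤ x ≤ 1:

g(x) = f(x) + f(1 − x) = −x² − (1 − x)²
g′(x) = −2x + 2(1 − x) = 2 − 4x
g′(x) > 0 for x < 1/2, so g is increasing on [0, 1/2]
g′(x) < 0 for x > 1/2, so g is decreasing on [1/2, 1]

This agrees with the intervals derived above.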
