Question

Use Rolle's theorem to prove that the equation $ax^2 + bx = \frac{a}{3} + \frac{b}{2}$ has a root between 0 and 1.

Solution

We know that, according to Rolle's theorem, if
F(x) is continuous on [a, b],
F(x) is differentiable on (a, b), and
F(a) = F(b),

then there exists at least one value $c \in (a, b)$ such that $F'(c) = 0$.

Let

$F(x) = \int \left[ax^2 + bx - \left(\frac{a}{3} + \frac{b}{2}\right)\right] dx = \frac{ax^3}{3} + \frac{bx^2}{2} - \left(\frac{a}{3} + \frac{b}{2}\right)x + C$
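Differentiating this antiderivative recovers the difference of the two sides of the given equation, which is why a zero of $F'$ corresponds to a root of the equation:

$F'(x) = ax^2 + bx - \left(\frac{a}{3} + \frac{b}{2}\right)$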

Now,
F(x) is continuous on [0, 1], since it is a polynomial, and
F(x) is differentiable on (0, 1).

$F(0) = C$ and $F(1) = \frac{a}{3} + \frac{b}{2} - \left(\frac{a}{3} + \frac{b}{2}\right) + C = C$, so $F(0) = F(1)$.

Hence, all the conditions of Rolle's theorem are fulfilled on [0, 1], so there exists a $c \in (0, 1)$ such that $F'(c) = 0$, i.e. $ac^2 + bc - \left(\frac{a}{3} + \frac{b}{2}\right) = 0$. Therefore the equation $ax^2 + bx = \frac{a}{3} + \frac{b}{2}$ has a root between 0 and 1.
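As a quick sanity check (not part of the proof), here is a minimal sketch with sample coefficients a = 3 and b = 2, chosen purely for illustration, that solves the quadratic numerically and confirms the root indeed lies in (0, 1):

```python
import math

# Illustrative values only; the argument above works for any real a, b.
a, b = 3.0, 2.0
rhs = a / 3 + b / 2                  # right-hand side a/3 + b/2

# Solve a*x^2 + b*x - rhs = 0 with the quadratic formula (positive branch).
disc = b * b + 4 * a * rhs
root = (-b + math.sqrt(disc)) / (2 * a)

print(root, 0 < root < 1)            # prints roughly 0.5486 True
```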
