$g(x) = \bigl(f'(x)\bigr)^2 + f''(x)\,f(x)$
$g(x) = \dfrac{d}{dx}\bigl(f(x) \cdot f'(x)\bigr)$
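This second form follows from the product rule: differentiating $f(x) \cdot f'(x)$ gives

$$\frac{d}{dx}\bigl(f(x) \cdot f'(x)\bigr) = f'(x)\cdot f'(x) + f(x)\cdot f''(x) = \bigl(f'(x)\bigr)^2 + f''(x)\,f(x) = g(x).$$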
Now, let $h(x) = f(x) \cdot f'(x)$, so that $g(x) = h'(x)$.
By Rolle's theorem, between any two roots of $h(x) = 0$ there lies at least one root of $h'(x) = 0$.
$\Rightarrow$ between any two roots of $h(x) = 0$ there lies at least one root of $g(x) = 0$, since $g(x) = h'(x)$.
Now, $f(x)$ is zero in at least four places, so, again by Rolle's theorem, $f'(x)$ is zero in at least three places (once between each pair of consecutive zeros of $f$).
Hence $h(x) = f(x) \cdot f'(x)$ is zero in at least $4 + 3 = 7$ places, and therefore $h'(x) = g(x)$ is zero in at least $6$ places.
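As a quick sanity check (not part of the proof), here is a minimal sketch using sympy with an illustrative quartic $f(x) = (x-1)(x-2)(x-3)(x-4)$, chosen only because it has four distinct real zeros; the root counts it prints agree with the argument above.

```python
import sympy as sp

x = sp.symbols('x')

# Illustrative choice (an assumption for this check): a quartic with
# four distinct real zeros at 1, 2, 3, 4.
f = (x - 1) * (x - 2) * (x - 3) * (x - 4)

fp = sp.diff(f, x)        # f'(x)
fpp = sp.diff(f, x, 2)    # f''(x)

g = sp.expand(fp**2 + fpp * f)   # g(x) = (f'(x))^2 + f''(x) f(x)
h = sp.expand(f * fp)            # h(x) = f(x) f'(x)

# g is exactly the derivative of h.
assert sp.expand(sp.diff(h, x) - g) == 0

# Count real roots (with multiplicity): h has at least 7, g at least 6.
print(len(sp.real_roots(h)), len(sp.real_roots(g)))   # prints: 7 6
```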