Question

A stone dropped from a height h reaches the Earth's surface in 1 second. If the same stone is taken to the Moon and dropped freely from the same height h, in what time will it reach the ground?

Solution

By the second equation of motion,

s = ut + ½at²

In both cases, on the Earth and on the Moon, the stone is dropped from rest, so its initial velocity u = 0.

The conditions on the Earth are:
s = h
u = 0
a = g
t = 1 s

Substituting these values, the equation becomes

h = (0×1) + ½(g×1²)
h = g/2 ----------->(1)
On the Moon, the gravitational force, and hence the acceleration due to gravity, is (1/6)th of its value on the Earth.

So the conditions on the Moon are:

s = h
u = 0
a = g/6
t = ?

Applying the values,

h = (0×t) + ½((g/6)×t²)
h = gt²/12 ----------->(2)

Equating (1) and (2):

g/2 = gt²/12
t² = 12/2 = 6
t = √6 s

So on the Moon the stone takes √6 seconds (about 2.45 s) to reach the ground.
√6 s is the required answer.
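
As a quick numerical check, the result can be reproduced with a short Python sketch. It assumes a standard value g ≈ 9.8 m/s² on the Earth (the exact value cancels out of the final ratio) and the given 1 s fall time on the Earth:

import math

g_earth = 9.8            # assumed acceleration due to gravity on Earth, m/s²
g_moon = g_earth / 6     # gravity on the Moon is one-sixth of Earth's
t_earth = 1.0            # given fall time on Earth, in seconds

# From h = ½gt², the drop height implied by the 1 s fall on Earth
h = 0.5 * g_earth * t_earth**2

# Solve h = ½(g/6)t² for the fall time on the Moon
t_moon = math.sqrt(2 * h / g_moon)

print(f"Fall time on the Moon: {t_moon:.3f} s")   # prints 2.449 s
print(f"sqrt(6)             : {math.sqrt(6):.3f} s")

The printed fall time, 2.449 s, matches √6 s, confirming the derivation above.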
