Question

In a simple pendulum experiment, the time measured for 25 vibrations was 50 s when the length of the pendulum was 100 cm. If the least count of the stopwatch is 0.1 s and that of the metre scale is 0.1 cm, calculate the maximum percentage error in the measured value of g.


A

B

C

D

none of these

Solution

The correct option is C (0.5%).


The time period of a simple pendulum is given by T = 2π√(L/g), i.e. T² = 4π²L/g, or g = 4π²L/T². Since 4 and π are constants, the maximum permissible error in g is Δg/g = ΔL/L + 2ΔT/T. Here ΔL = 0.1 cm and L = 1 m = 100 cm. For the timing, the total time t = 50 s was measured with Δt = 0.1 s; since T = t/25, the relative error in T equals that in t, so ΔT/T = Δt/t = 0.1/50.

Δg/g = 0.1/100 + 2(0.1/50) = 0.1/100 + 0.1/25
(Δg/g) × 100 = [0.1/100 + 0.1/25] × 100 = 0.1 + 0.4 = 0.5%
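The same arithmetic can be checked numerically. Below is a minimal Python sketch (not part of the original solution; variable names are illustrative) that plugs the measured values and least counts into Δg/g = ΔL/L + 2ΔT/T:

```python
import math

# Measured values from the problem
total_time = 50.0      # s, time for 25 vibrations
n_vibrations = 25
length_cm = 100.0      # cm, pendulum length

# Least counts (maximum absolute errors)
dt = 0.1               # s, stopwatch least count
dl = 0.1               # cm, metre scale least count

# Period of one oscillation; its relative error equals that of the total time
T = total_time / n_vibrations            # 2.0 s
rel_err_T = dt / total_time              # 0.1 / 50
rel_err_L = dl / length_cm               # 0.1 / 100

# g = 4*pi^2*L/T^2, so the maximum relative error is dL/L + 2*dT/T
g = 4 * math.pi**2 * (length_cm / 100.0) / T**2   # ~9.87 m/s^2
rel_err_g = rel_err_L + 2 * rel_err_T

print(f"g = {g:.2f} m/s^2")
print(f"Maximum percentage error in g: {rel_err_g * 100:.1f}%")   # 0.5%
```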

