Question

It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard cesium clock in measuring a time-interval of 1 s?

Solution

Given: the two clocks, running undisturbed for 100 years, differ by only about 0.02 s.

Convert 100 years into seconds:

Time in seconds = 100 × 365 × 24 × 60 × 60 s ≈ 3.15 × 10⁹ s

Fractional error = Difference in time (s) / Time interval (s)

The cesium clock accumulates a difference of 0.02 s over 3.15 × 10⁹ s.

So in 1 s the clock differs by 0.02 / (3.15 × 10⁹) s.

0.02 / (3.15 × 10⁹) s ≈ 6.35 × 10⁻¹² s ≈ 10⁻¹¹ s

Therefore, in 1 s the difference (error) is about 10⁻¹¹ s.

Degree of accuracy = 1 / 10⁻¹¹ = 10¹¹

Thus, the cesium clock is accurate to about 1 part in 10¹¹ in measuring a time interval of 1 s.
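
As a quick numerical check of the arithmetic above, here is a minimal Python sketch (variable names are illustrative; 1 year is taken as 365 days, as in the solution):

```python
# Numerical check of the cesium-clock accuracy estimate.

years = 100
drift = 0.02  # claimed difference between the two clocks, in seconds

# Convert 100 years to seconds (1 year taken as 365 days).
total_seconds = years * 365 * 24 * 60 * 60      # ~3.15e9 s

# Fractional error: drift accumulated per second of running time.
fractional_error = drift / total_seconds        # ~6.35e-12, i.e. of order 1e-11

# Degree of accuracy: roughly 1 part in 10^11.
parts = 1 / fractional_error                    # ~1.6e11, of order 1e11

print(f"Total running time : {total_seconds:.2e} s")
print(f"Fractional error   : {fractional_error:.2e}")
print(f"Accuracy           : about 1 part in {parts:.1e}")
```

Running this gives a fractional error of about 6.3 × 10⁻¹² per second, consistent with the ≈ 10⁻¹¹ s error and the 1-part-in-10¹¹ accuracy quoted above.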

