A person measures the depth of a well by measuring the time interval between dropping a stone and receiving the sound of impact with the bottom of the well. The error in his measurement of time is δT = 0.01 s and he measures the depth of the well to be L = 20 m. Take the acceleration due to gravity g = 10 m/s² and the velocity of sound as 300 m/s. Then the fractional error in the measurement, δL/L, is closest to
1%
Here the measured time T is the time for the stone to fall to the bottom plus the time for the sound to travel back up:
T = √(2L/g) + L/C
where C is the speed of sound.
Now, differentiating the above equation with respect to L:
δT/δL = 1/√(2gL) + 1/C
⇒ δL = δT / (1/√(2gL) + 1/C)
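As a quick sanity check on that derivative, here is a minimal Python sketch (not part of the original solution) comparing the analytic expression against a central finite difference at L = 20 m, with g and C taken from the problem statement:

```python
# Check that dT/dL = 1/sqrt(2*g*L) + 1/C for T = sqrt(2L/g) + L/C.
# g = 10 m/s^2 and C = 300 m/s are the values given in the problem.
g, C = 10.0, 300.0

def T(L):
    return (2 * L / g) ** 0.5 + L / C

h = 1e-6  # small step for a central finite difference
numeric = (T(20 + h) - T(20 - h)) / (2 * h)
analytic = 1 / (2 * g * 20) ** 0.5 + 1 / C

print(numeric, analytic)  # both come out ≈ 0.05333 s/m (= 1/20 + 1/300)
```

The agreement confirms the slope dT/dL = 1/20 + 1/300 = 16/300 s/m used in the plug-in below.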
Here δT = 0.01 s, so
⇒ δL = 0.01 / (1/√(2×10×20) + 1/300) = 0.01 / (1/20 + 1/300)
δL = 3/16 m
Then the fractional error in the measurement, δL/L, is (3/16)/20 × 100% ≈ 0.94%, which is closest to 1%.
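The whole propagation can be reproduced numerically; this is a small Python sketch of the arithmetic above, using only the values given in the problem:

```python
import math

# Values from the problem statement
g = 10.0    # m/s^2, acceleration due to gravity
C = 300.0   # m/s, speed of sound
L = 20.0    # m, measured depth
dT = 0.01   # s, timing error

# dT/dL = 1/sqrt(2 g L) + 1/C, from T = sqrt(2L/g) + L/C
dT_dL = 1.0 / math.sqrt(2 * g * L) + 1.0 / C

dL = dT / dT_dL          # propagated depth error, ≈ 3/16 m
frac_pct = dL / L * 100  # fractional error in percent, ≈ 0.9375

print(round(dL, 6), round(frac_pct, 6))
```

Rounding 0.9375% to the nearest offered option gives the stated answer of 1%.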