Question

A sound wave has a frequency of $$2\, kHz$$ and a wavelength of $$35\, cm$$. If an observer is $$1.4\, km$$ away from the source, after what time interval will the observer hear the sound?


A
$$2\, s$$
B
$$20\, s$$
C
$$0.5\, s$$
D
$$4\, s$$

Solution

The correct option is A $$2\, s$$

Speed of the wave = frequency $$\times$$ wavelength:

$$v = 2000\times0.35 = 700\ m/s$$

Since speed = $$\dfrac{distance}{time}$$, the time taken is

$$t = \dfrac{distance}{v} = \dfrac{1400}{700} = 2\, s$$
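The two-step calculation above can be sketched in a few lines of Python (variable names are illustrative, not from the original):

```python
# v = f * wavelength, then t = distance / v
frequency_hz = 2000.0   # 2 kHz
wavelength_m = 0.35     # 35 cm
distance_m = 1400.0     # 1.4 km

speed_mps = frequency_hz * wavelength_m  # 700 m/s
time_s = distance_m / speed_mps          # 2.0 s

print(speed_mps, time_s)
```

Running this prints `700.0 2.0`, matching option A.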
