Two sources of sound, $S_1$ and $S_2$, emitting waves of equal wavelength $\lambda$, are placed with a separation $d$ between them. A detector can be moved along a line parallel to $S_1S_2$ and at a perpendicular distance $D$ from it. Initially, the detector is equidistant from the two sources. Assuming that the waves emitted by the sources are in phase, find the minimum distance through which the detector should be shifted to detect a minimum of sound.
Step 1: Given
Wavelength of the sound waves: $\lambda$
Distance between the sources: $d$
Perpendicular distance of the detector's line from $S_1S_2$: $D$
Distance by which the detector is shifted (to be found): $x$
Step 2: Formula Used
The path difference for a minimum of sound is $\Delta = (2n - 1)\dfrac{\lambda}{2}$, where $n = 1, 2, 3, \ldots$ is the order of the minimum and $\lambda$ is the wavelength.
Pythagoras' theorem: $(\text{hypotenuse})^2 = (\text{base})^2 + (\text{height})^2$.
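Combining the two relations for this geometry gives the following sketch, with the sources labelled $A$ and $C$ and the shifted detector position labelled $B$, as used in Step 3:

$$BC - AB = (2n - 1)\frac{\lambda}{2}, \qquad AB = \sqrt{D^2 + \left(\frac{d}{2} - x\right)^2}, \qquad BC = \sqrt{D^2 + \left(\frac{d}{2} + x\right)^2}.$$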
Step 3: Find the minimum shift needed to detect a minimum of sound
The smallest shift corresponds to the first minimum ($n = 1$), for which the required path difference is $\lambda/2$. Label the sources $A$ and $C$, and let $B$ be the detector's position after it has been shifted a distance $x$ towards $A$.
Find $AB$ using Pythagoras' theorem. The base is $\left(\frac{d}{2} - x\right)$ because the detector started at the midpoint and then moved a distance $x$ towards $A$: $AB = \sqrt{D^2 + \left(\frac{d}{2} - x\right)^2}$.
Find $BC$ using Pythagoras' theorem. The base is $\left(\frac{d}{2} + x\right)$ because the detector started at the midpoint and then moved a distance $x$ away from $C$: $BC = \sqrt{D^2 + \left(\frac{d}{2} + x\right)^2}$.
Calculate the path difference by subtracting $AB$ from $BC$, and set it equal to $\lambda/2$:
$$\sqrt{D^2 + \left(\frac{d}{2} + x\right)^2} - \sqrt{D^2 + \left(\frac{d}{2} - x\right)^2} = \frac{\lambda}{2}.$$
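One standard route through the algebra (a sketch using the difference-of-squares identity):

$$BC^2 - AB^2 = \left(\frac{d}{2} + x\right)^2 - \left(\frac{d}{2} - x\right)^2 = 2dx \quad\Rightarrow\quad BC + AB = \frac{BC^2 - AB^2}{BC - AB} = \frac{4dx}{\lambda}.$$

Adding this to $BC - AB = \lambda/2$ gives $BC = \dfrac{2dx}{\lambda} + \dfrac{\lambda}{4}$; squaring and equating with $BC^2 = D^2 + \left(\frac{d}{2} + x\right)^2$ leaves

$$\left(\frac{4d^2}{\lambda^2} - 1\right)x^2 = D^2 + \frac{d^2}{4} - \frac{\lambda^2}{16} \quad\Rightarrow\quad x = \sqrt{\frac{D^2 + \frac{d^2}{4} - \frac{\lambda^2}{16}}{\frac{4d^2}{\lambda^2} - 1}} \qquad (2d > \lambda).$$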
Hence, the detector should be shifted through $x = \sqrt{\dfrac{D^2 + \frac{d^2}{4} - \frac{\lambda^2}{16}}{\frac{4d^2}{\lambda^2} - 1}}$.
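A minimal numeric check of this result, assuming illustrative values $\lambda = d = D = 20.0\ \text{cm}$ for the figures elided in the statement above (these are placeholders, not values taken from the problem):

```python
from math import sqrt, hypot

def min_shift(wavelength: float, d: float, D: float) -> float:
    """Smallest shift x of the detector that produces the first minimum,
    i.e. a path difference BC - AB equal to half a wavelength."""
    numerator = D**2 + d**2 / 4 - wavelength**2 / 16
    denominator = 4 * d**2 / wavelength**2 - 1  # requires 2*d > wavelength
    return sqrt(numerator / denominator)

# Illustrative (assumed) values, in centimetres.
lam = d = D = 20.0
x = min_shift(lam, d, D)

# Sanity check: at this shift the path difference should be exactly lam/2.
AB = hypot(D, d / 2 - x)  # distance from source A to the shifted detector
BC = hypot(D, d / 2 + x)  # distance from source C to the shifted detector
print(f"x = {x:.2f} cm, BC - AB = {BC - AB:.2f} cm (lambda/2 = {lam / 2:.2f} cm)")
```

With these assumed values, the shift comes out to about $12.6\ \text{cm}$, and the printed path difference matches $\lambda/2$.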