A source $S$ and a detector $D$ are placed at a distance $d$ apart. A big cardboard is placed at a distance $\sqrt{2}\,d$ from the source and the detector, as shown in the figure below. The source emits a wave of wavelength $d/2$, which is received by the detector after reflection from the cardboard. It is found to be in phase with the direct wave received from the source. By what minimum distance should the cardboard be shifted away so that the reflected wave becomes out of phase with the direct wave?
Step 1: Given data
The wavelength of the source: $\lambda = d/2$.
The distance between the source and the detector: $d$.
The distance of the reflection point on the cardboard from each of the source and the detector: $\sqrt{2}\,d$.
Let the path difference be $\Delta$.
Let $x$ be the minimum distance by which the cardboard must be shifted away so that the reflected wave becomes out of phase with the direct wave.
Step 2: Find the path difference before and after shifting the cardboard
From the figure, let $y$ be the perpendicular distance of the cardboard from the line joining the source and the detector. The reflection point lies midway between them, so
$\sqrt{(d/2)^2 + y^2} = \sqrt{2}\,d \implies y = \sqrt{2d^2 - \dfrac{d^2}{4}} = \dfrac{\sqrt{7}}{2}\,d.$
The initial path difference between the sound waves received by the detector, before shifting the cardboard, is
path difference = (distance travelled by the sound wave from source to detector via the cardboard) $-$ (distance between source and detector):
$\Delta = 2\sqrt{2}\,d - d.$
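The geometry above can be checked numerically; a minimal sketch, taking $d = 1$ since every length in the problem scales with $d$:

```python
import math

d = 1.0  # set d = 1 without loss of generality (all lengths scale with d)

# The reflection point lies midway between source and detector, at
# distance sqrt(2)*d from each, so the perpendicular distance y of the
# cardboard from the line SD satisfies (d/2)**2 + y**2 = 2*d**2.
y = math.sqrt(2 * d**2 - (d / 2)**2)
print(y)                      # equals sqrt(7)/2 * d ≈ 1.3229

# Initial path difference: reflected path minus direct path.
delta = 2 * math.sqrt(2) * d - d
print(delta)                  # (2*sqrt(2) - 1) * d ≈ 1.8284
```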
If the cardboard is shifted away by a distance $x$, the path difference becomes
$\Delta' = 2\sqrt{(d/2)^2 + (y + x)^2} - d.$
For the shift to be the minimum one that makes the reflected wave out of phase with the direct wave, the path difference must increase by half a wavelength.
Therefore,
$\Delta' = \Delta + \dfrac{\lambda}{2} = 2\sqrt{2}\,d - d + \dfrac{d}{4}.$
Step 3: Now, according to the question,
$2\sqrt{(d/2)^2 + (y + x)^2} - d = 2\sqrt{2}\,d - d + \dfrac{d}{4}$
$\sqrt{\dfrac{d^2}{4} + (y + x)^2} = \sqrt{2}\,d + \dfrac{d}{8}$
$(y + x)^2 = 2d^2 + \dfrac{\sqrt{2}}{4}d^2 + \dfrac{d^2}{64} - \dfrac{d^2}{4} \approx 2.119\,d^2$
$y + x \approx 1.456\,d,$ and with $y = \dfrac{\sqrt{7}}{2}\,d \approx 1.323\,d$,
$x \approx 1.456\,d - 1.323\,d \approx 0.13\,d.$
Hence, the minimum distance by which the cardboard should be shifted away is $x \approx 0.13\,d$.
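The whole calculation can be verified numerically; a quick sketch, again taking $d = 1$ so the answer comes out as a multiple of $d$:

```python
import math

d = 1.0                           # set d = 1; the answer scales with d
lam = d / 2                       # wavelength of the source
y = math.sqrt(7) * d / 2          # initial perpendicular distance of the cardboard

delta0 = 2 * math.sqrt(2) * d - d     # initial path difference (waves in phase)
target = delta0 + lam / 2             # out of phase: increase by half a wavelength

# Solve 2*sqrt((d/2)**2 + (y + x)**2) - d = target for x.
y_plus_x = math.sqrt((target + d)**2 / 4 - (d / 2)**2)
x = y_plus_x - y
print(round(x / d, 2))            # ≈ 0.13
```

Rounding to two significant figures reproduces the stated answer, $x \approx 0.13\,d$.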