Suppose a monochromatic X-ray beam of wavelength 100 pm is sent through a Young's double slit, and the interference pattern is observed on a photographic plate placed 40 cm away from the slits. What should be the separation between the slits so that successive maxima on the screen are separated by a distance of 0.1 mm?
Given λ = 100 pm = 100 × 10⁻¹² m
D = 40 cm = 40 × 10⁻² m
β = 0.1 mm = 0.1 × 10⁻³ m
The fringe width is β = λD/d, so
d = λD/β
  = (100 × 10⁻¹² × 40 × 10⁻²) / (0.1 × 10⁻³)
  = 4 × 10⁻⁷ m.
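The arithmetic above can be checked with a short script; the variable names are illustrative, not from the source.

```python
# Slit separation from the fringe-width relation d = λD/β
wavelength = 100e-12   # λ = 100 pm, in metres
D = 40e-2              # slit-to-screen distance, 40 cm in metres
beta = 0.1e-3          # fringe width, 0.1 mm in metres

d = wavelength * D / beta
print(f"d = {d:.1e} m")  # ≈ 4 × 10⁻⁷ m, matching the result above
```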