Loss of Interference in Two-slit Experiment

Most recent answer: 10/22/2014

Q:
Howdy. I've been familiar with the double slit experiment for some time, and I have a question which somebody must have answered before. The standard experiment consists of an electron "gun", a barrier with two slits or holes, and a detector. I think of the detector as a wall like a CRT screen. Both slits are open and we see an interference pattern. There are lots of conclusions about covering one slit or another, or putting detectors at one hole or another. My question is this: What happens as we slowly move the detector closer to the slitted barrier? At D = some "long" distance we see the interference, but at D = 0 (distance between detector and barrier = 0) I assume there can't be interference or superposition and we are doing the same thing as putting a detector at each slit. At what distance does the interference disappear? My intuition says it has to do with the wavelength.
- Dom (age 45)
Aurora, CO
A:

The loss of interference that you're describing is pretty easy to explain. It's not even a quantum effect; it would show up for any sort of classical wave. Far away from the slits, the waves that come through from each slit are about equal in magnitude. So when they add in phase you get 4 times the intensity you'd get from one slit, and when they add exactly out of phase they cancel and you get almost no intensity.
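Just to spell that arithmetic out, here's a minimal sketch (mine, not part of the original answer) with two equal unit amplitudes from the slits:

```python
# A minimal sketch, assuming equal unit amplitudes from the two slits:
# in phase the waves give 4x one slit's intensity, exactly out of phase they cancel.
import numpy as np

a = 1.0                                           # amplitude from either slit, far away
one_slit  = abs(a)**2                             # intensity with a single slit open
in_phase  = abs(a + a)**2                         # phase difference 0  -> 4 * one_slit
out_phase = abs(a + a * np.exp(1j * np.pi))**2    # phase difference pi -> ~0

print(one_slit, in_phase, out_phase)              # 1.0  4.0  ~0.0
```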

Right behind the slits you get regions where almost all the amplitude comes from one slit or the other, so the interference isn't important. There's a gradual transition from this low-interference region to the full-interference region farther away. Let's describe it.

At more than a few slit-widths away from the slits, the amplitude from each slit falls off as 1/r, where r is the distance from that slit. (That corresponds to the usual 1/r² fall-off for intensity.) So each amplitude falls as 1/√(x² + (y ± b)²), where x is the distance back from the slit plane, y is the height up or down on the viewing screen, and b is the distance from the midpoint out to the middle of each slit. The first interference minimum is found where these two distances differ by a half-wavelength, λ/2. If that half-wavelength path difference is a significant fraction of the distance from the slits, roughly x, then the two 1/r amplitudes differ enough that the cancellation at the minimum is noticeably incomplete. So, as you move the viewing screen to within a few wavelengths of the slits, you'll lose even the middle part of the interference pattern. Parts farther from the middle have more unequal distances from the two slits, so their fringes fade at even larger screen distances.
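If you want to see that at work numerically, here's a rough sketch (my own addition, not from the answer above) that treats each slit as a point source with a 1/r amplitude and checks how much the first dark fringe fills in as the screen distance x shrinks. The half-separation b = 1 wavelength is just an illustrative choice.

```python
# A rough numerical sketch of the argument above: two point sources with 1/r
# amplitudes; watch the first dark fringe fill in as the screen distance x
# shrinks toward the wavelength scale. All numbers are illustrative.
import numpy as np

lam = 1.0              # wavelength (arbitrary units)
b = 1.0 * lam          # half the slit separation (assumed value)
k = 2.0 * np.pi / lam

def intensity(x, y):
    """Intensity of two interfering point sources at screen distance x, height y."""
    r1 = np.sqrt(x**2 + (y - b)**2)
    r2 = np.sqrt(x**2 + (y + b)**2)
    amp = np.exp(1j * k * r1) / r1 + np.exp(1j * k * r2) / r2
    return np.abs(amp)**2

for x in [100 * lam, 10 * lam, 3 * lam, 1 * lam]:
    y = np.linspace(0.0, 0.7 * x + 2 * b, 20001)   # scan from the center outward
    I = intensity(x, y)
    # index of the first local minimum away from the central maximum
    i_min = next(i for i in range(1, len(I) - 1) if I[i] < I[i - 1] and I[i] < I[i + 1])
    print(f"x = {x:6.1f}: first-minimum intensity / central intensity "
          f"= {I[i_min] / I[0]:.2e}")
```

In this toy model the first minimum is essentially black far from the slits, but by x of order a wavelength the residual intensity there grows to a few percent of the central peak, which is the incomplete cancellation described above.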

Mike W. 

posted without vetting until Lee returns from the Serengeti 


(published on 10/22/2014)