There's a very close relation between quantum uncertainty and diffraction. The spread of a diffraction pattern is essentially just the spread given by the uncertainty principle.
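To see how close the relation is, here's a small numerical sketch (the wavelength and slit width are hypothetical numbers chosen just for illustration). A photon carries momentum p = h/λ, and single-slit diffraction spreads it over angles of order λ/a, so the transverse momentum spread comes out as h/a, exactly what Δx·Δp ≈ h predicts for a slit of width a:

```python
import math

# Illustrative numbers (not from any particular experiment):
wavelength = 532e-9   # m, a green laser
slit_width = 50e-6    # m; confining the light fixes Delta-x ~ a
h = 6.626e-34         # Planck's constant, J*s

# Photon momentum along the beam: p = h / lambda
p = h / wavelength

# Diffraction: the central lobe extends to sin(theta) ~ lambda / a,
# so the transverse momentum spread is Delta-p ~ p * (lambda / a).
delta_p_diffraction = p * (wavelength / slit_width)

# Uncertainty-principle estimate: Delta-p ~ h / Delta-x
delta_p_uncertainty = h / slit_width

print(delta_p_diffraction)
print(delta_p_uncertainty)
```

The two estimates agree because p·(λ/a) = (h/λ)·(λ/a) = h/a: the diffraction angle and the momentum uncertainty are the same statement in different clothes.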
However, if light were really just a classical wave, we wouldn't call it the uncertainty principle. If a classical wave is confined in a narrow space, its wave equation forces the transmitted wave to spread out over a range of directions. But neither the confinement nor the directional spread would be 'uncertain': the wave really is spread out over that range, and would be detected spread out over that range.
In quantum mechanics those classical ranges become 'uncertainties', because experiments designed to detect the light don't pick up a signal smoothly over the whole range. Instead they produce a bunch of photon blips, randomly distributed over that range. We don't know where the next blip will show up, and that's why we call the range the 'uncertainty'.
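A toy simulation can make this concrete (again with made-up numbers, and using simple rejection sampling as the method). Each accepted angle below is one photon 'blip': individually random, but collectively the blips trace out the classical single-slit pattern:

```python
import math
import random

random.seed(0)  # reproducible toy run

wavelength = 532e-9  # m, illustrative
slit_width = 50e-6   # m, illustrative

def intensity(theta):
    """Classical single-slit Fraunhofer pattern: sinc^2(pi*a*sin(theta)/lambda)."""
    x = math.pi * slit_width * math.sin(theta) / wavelength
    if x == 0.0:
        return 1.0
    return (math.sin(x) / x) ** 2

def sample_blip(max_theta=0.03):
    """One photon detection: a random angle, accepted with probability ~ intensity."""
    while True:
        theta = random.uniform(-max_theta, max_theta)
        if random.random() < intensity(theta):
            return theta

# No one can predict any single blip's position...
blips = [sample_blip() for _ in range(1000)]

# ...but the histogram of many blips reproduces the classical spread:
# most land inside the central lobe, |theta| < lambda / a.
central_fraction = sum(abs(t) < wavelength / slit_width for t in blips) / len(blips)
print(central_fraction)
```

The randomness lives in where each individual blip lands; the 'range' the blips fill is just the classical diffraction envelope.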
Since the experiment you're asking about probably doesn't use single-photon detectors (although it could), it shows the spread of results required by the uncertainty principle, but not the randomness responsible for the name.
(published on 04/05/2010)