Measuring Light
Most recent answer: 10/22/2007
- Anonymous
To do this, we often use a special material called a semiconductor. Semiconductors come in many types, and are used to make everything from computer chips to the circuits in your TV set. The kind used in light meters is made so that when a photon hits it, a tiny pulse of current flows for a very short time. Since a lot of photons hit the semiconductor, the tiny contributions from each photon add up to a steady current big enough to measure with an ordinary current meter. If more light hits the semiconductor, more current flows and the meter reads a bigger value, so anyone watching the meter (like you) knows that the intensity of the light just went up.
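As a rough sketch of the arithmetic behind this, here is the photon-counting bookkeeping in Python. The wavelength, optical power, and quantum efficiency are made-up illustrative values, not properties of any particular meter:

```python
# Sketch: how many photons per second arrive, and how much current
# they produce in an idealized photodiode. Inputs are illustrative.

H = 6.626e-34         # Planck's constant, J*s
C = 3.0e8             # speed of light, m/s
E_CHARGE = 1.602e-19  # electron charge, C

def photocurrent(optical_power_w, wavelength_m, quantum_efficiency):
    """Each absorbed photon that frees a conduction electron contributes
    one electron charge; many photons per second add up to a steady current."""
    photon_energy = H * C / wavelength_m           # energy of one photon, J
    photon_rate = optical_power_w / photon_energy  # photons per second
    return quantum_efficiency * photon_rate * E_CHARGE  # amps

# Example: 1 microwatt of green light (550 nm), 50% efficiency
i = photocurrent(1e-6, 550e-9, 0.5)
print(f"{i * 1e6:.2f} microamps")  # roughly 0.22 uA
```

About 10^12 photons per second arrive in this example, which is why the individual current pulses blur into a smooth, easily measured current.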
Adam
(published on 10/22/2007)
Follow-Up #1: photons and light meters
- bruce blosser (age 68)
fort bragg, ca USA
The particular type of photodetector that Adam described does depend on the light frequency, as you point out. An individual photon below some cutoff frequency won't have enough energy to create a conduction electron, and thus won't register on that type of meter. Above the cutoff, the probability that a photon produces a conduction electron still varies with frequency, so the output depends not only on the total intensity but also on the distribution of frequencies.
I'm not sure how you get from that to the conclusion that "all intensities would read the same". More photons still make more electrical current.
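That frequency cutoff is set by the semiconductor's band gap: a photon registers only if its energy h*f exceeds the gap. A small sketch, using silicon's standard textbook band gap of about 1.12 eV as the example:

```python
# Sketch: longest wavelength (lowest frequency) a semiconductor
# photodetector can register, set by its band gap energy.

H = 6.626e-34   # Planck's constant, J*s
C = 3.0e8       # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

def cutoff_wavelength_nm(band_gap_ev):
    """Photons with wavelength longer than this carry less energy
    (E = h*f = h*c/wavelength) than the band gap, so they can't
    promote an electron into the conduction band."""
    return H * C / (band_gap_ev * EV) * 1e9

print(f"silicon cutoff ~ {cutoff_wavelength_nm(1.12):.0f} nm")
```

This comes out near 1100 nm, in the near infrared, which is why ordinary silicon detectors see visible light but are blind to longer-wavelength infrared.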
There are other types of light meter besides the one Adam mentioned. Some use the total heat deposited by the incoming light. These are sensitive to the total energy flow in the light and, in principle, insensitive to the frequency distribution.
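The contrast between the two meter types can be sketched numerically. In this hypothetical comparison, an idealized thermal detector reads total power regardless of wavelength, while an idealized photon detector's current tracks the photon count, so equal powers at different wavelengths read differently:

```python
# Sketch: same optical power at two wavelengths, seen by an idealized
# thermal detector vs. an idealized photon-counting detector.

H, C, E_CHARGE = 6.626e-34, 3.0e8, 1.602e-19

def thermal_reading_w(power_w, wavelength_m):
    """A thermal detector absorbs all the energy; ideally its
    reading depends only on total power, not wavelength."""
    return power_w

def photodiode_current(power_w, wavelength_m, quantum_efficiency=1.0):
    """A photon detector's current scales with photon count: the same
    power at longer wavelength means more (lower-energy) photons."""
    photons_per_s = power_w / (H * C / wavelength_m)
    return quantum_efficiency * photons_per_s * E_CHARGE

for wl in (450e-9, 900e-9):  # 1 microwatt of blue vs. near-IR light
    print(wl, thermal_reading_w(1e-6, wl), photodiode_current(1e-6, wl))
# Thermal readings match; photodiode current doubles at 900 nm,
# since 900 nm photons carry half the energy of 450 nm photons.
```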
Mike W.
(published on 12/20/2016)