Resistivity vs. Temperature

Most recent answer: 07/09/2011

Q:
On a microscopic level, why does an increase of temperature increase the resistivity of a wire? Also, how can the electrons be thought of as waves?
- Phillip (age 17)
Brisbane, Qld, Australia
A:
There are a variety of ways that increasing temperature affects the resistivity of conductors, some increasing it and some decreasing it. In a standard metal wire near room temperature, as you say, the main effect is to increase resistivity. The mechanism can be thought of in several ways.

In an absolutely regular crystal, the electron waves would travel without bouncing off anything, like light waves in a pure diamond crystal. The farther they can travel without bouncing, the more easily they can carry current.

When we say the wire is hotter, we mean that the atoms in it are jiggling around. They're not quite in their regular positions. That makes the electron waves scatter off the irregularities, like light scattering off some dirt on glass. An equivalent way of thinking of this is that the hot wire has little sound waves traveling around in it, and an electron and a sound wave can bounce off each other.
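One macroscopic consequence of this scattering is that, near room temperature, a metal's resistivity grows roughly linearly with temperature. Here's a minimal Python sketch of that linear approximation; the copper numbers are standard tabulated approximations, not values from the answer above:

```python
# Linear approximation: rho(T) ~ rho0 * (1 + alpha * (T - T0)),
# where alpha is the metal's temperature coefficient of resistivity.

def resistivity(T, rho0, alpha, T0=20.0):
    """Approximate resistivity (ohm*m) of a metal at temperature T (Celsius)."""
    return rho0 * (1.0 + alpha * (T - T0))

# Roughly tabulated values for copper:
RHO0_CU = 1.68e-8   # ohm*m at 20 C
ALPHA_CU = 0.00393  # per degree C

for T in (20, 60, 100):
    print(f"{T:>4} C: {resistivity(T, RHO0_CU, ALPHA_CU):.3e} ohm*m")
```

Heating copper from 20 C to 100 C raises its resistivity by about 30% in this approximation, which matches the intuition above: more jiggling atoms means more scattering of the electron waves.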

Your more fundamental question concerns why we can think of electrons as waves. One of the big reasons is simply that thinking of them as waves allows us to correctly calculate things like how impurities or heat change the way they conduct in metals. Of course it also allows us to calculate many other properties correctly. More importantly, the wave picture works for all properties of all tiny things. It's really all we have.

Mike W.



(published on 07/09/2011)