Q:

I have been thinking about how cell phones work, and the way I understand it, I can't comprehend how they actually work.
I know cell phones send and receive radio waves, but it seems ridiculously unlikely that individual photons from a phone would actually make it to a phone tower. Do cell phones emit radio waves isotropically? And how many photons would it have to emit to ensure that even one photon actually made it to the tower?
Similarly, when the tower sends out a signal, what are the chances that those RF photons happen to strike the tiny antenna on your cell phone?

- john (age 22)

St. Louis, MO

A:

We rarely think of cell-phone electromagnetic waves as being made of photons. The reason is that there are so many of these tiny lumps that the lumpiness of the transmitted energy is completely negligible. Let's go through a little calculation. A typical cell-phone frequency is around 10^{9} Hz. Multiplying by Planck's constant, we find a photon energy in the range of 10^{-24} J. The emitted power is typically in the range of 1 W = 1 J/s, so the phone emits about 10^{24} photons/s. That's a very large number.
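The arithmetic above is easy to check directly. Here is a minimal sketch using the same representative values from the text (10^9 Hz carrier, 1 W transmit power):

```python
# Photon energy and emission rate for a typical cell-phone signal.
PLANCK_H = 6.626e-34   # Planck's constant, J*s
FREQ = 1e9             # representative carrier frequency, Hz
POWER = 1.0            # representative transmit power, W (= J/s)

photon_energy = PLANCK_H * FREQ              # ~6.6e-25 J, i.e. of order 10^-24 J
photons_per_second = POWER / photon_energy   # ~1.5e24 photons/s

print(f"photon energy:  {photon_energy:.2e} J")
print(f"emission rate:  {photons_per_second:.2e} photons/s")
```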

Let's say that the cell tower was about a km away. Over each square meter at that distance, about 10^{17} photons pass per second. There's little point in trying to pay attention to the quantum nature of a flow like that.
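Assuming roughly isotropic emission, spreading that rate over a sphere of 1 km radius reproduces the flux quoted above:

```python
import math

photons_per_second = 1.5e24   # from the estimate above: 1 W at 10^9 Hz
distance = 1000.0             # tower distance, m

# Isotropic assumption: the photons spread evenly over a sphere of radius r,
# so the flux through each square meter is the total rate / sphere area.
sphere_area = 4 * math.pi * distance**2   # ~1.26e7 m^2
flux = photons_per_second / sphere_area   # ~1.2e17 photons/s per m^2

print(f"flux at 1 km: {flux:.2e} photons/s per m^2")
```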

Although the radiation from a cell phone isn't isotropic, it is very spread out, not particularly directional; there's no special beaming of the waves. I would guess that the cell towers try to direct their radiation generally downward, to avoid wasting power in directions where it won't be used.

Mike W.

*(published on 12/17/2011)*

Q:

I'm having trouble understanding the concept of "waves" in terms of photons. For purposes of communications (cell phones/radio) it seems like a wave can carry an unlimited amount of information (I remember learning about waves that envelop a signal), but a photon can only carry information about its energy.
So I was wondering how information is transmitted in "waves" and how this information is imparted into individual photons. When a photon interacts with an antenna, what information is extracted from that? Is it a binary signal?

- John (age 22)

St Louis, MO

A:

First of all, one can't carry an unlimited amount of information on a single wave. You need a range of frequencies in order to carry even a single telephone conversation, for example 40 Hz to 5 kHz for reasonable-quality voice communication. These frequencies can be superimposed on a much higher carrier frequency, as is done in broadcast radio, but you can't get around the fact that the required bandwidth remains the same.
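One way to see this is amplitude modulation: superimposing an audio tone on a high carrier just shifts the audio band up to sidebands around the carrier, so the occupied bandwidth is still set by the audio signal. A minimal numpy sketch (the specific carrier and tone frequencies are illustrative, not from the text):

```python
import numpy as np

fs = 1_000_000                    # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)    # 10 ms of signal
f_carrier = 100_000               # illustrative carrier, Hz
f_audio = 5_000                   # top of the voice band mentioned above, Hz

# Amplitude-modulate the carrier with a single audio tone.
audio = np.cos(2 * np.pi * f_audio * t)
am = (1 + 0.5 * audio) * np.cos(2 * np.pi * f_carrier * t)

# The spectrum has power only at the carrier and at carrier +/- audio
# frequency, i.e. the signal occupies a band of width 2 * f_audio.
spectrum = np.abs(np.fft.rfft(am))
freqs = np.fft.rfftfreq(len(am), 1 / fs)
peaks = freqs[spectrum > 0.1 * spectrum.max()]
print(peaks)   # carrier plus two sidebands: 95, 100, 105 kHz
```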

A single photon contains only one 'bit' of information: either it's there or it isn't. So in order to transmit more information than just on or off, you have to encode the information in a coherent collection of photons.

LeeH

Here's another way to see the same thing. As Lee says, once you've specified the bandwidth you've specified how many independent channels can be used per unit time. (The bandwidth per independent channel is essentially the inverse of the signal time.) Now you might think that even in a single channel the exact amplitude of the signal would give an infinite number of choices and hence an infinite number of possible messages. However, the graininess of light (its photon properties) puts limits on how well-defined that amplitude can be. If, for example, your detector is counting photons then you have just a finite set of possible integer values, up to the maximum intensity that the apparatus can handle. So that limits the amount of information.

Mike W.
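The counting argument can be made concrete: if a detector distinguishes photon counts 0 through N in one channel use, there are only N + 1 possible readings, so the information per use is at most log2(N + 1) bits rather than infinite. A toy calculation (the saturation count is illustrative):

```python
import math

# Suppose each channel symbol is read out by counting photons, and the
# detector saturates at max_count photons. Then there are only
# max_count + 1 distinguishable amplitude levels: 0, 1, ..., max_count.
max_count = 1023
levels = max_count + 1
bits_per_symbol = math.log2(levels)

print(bits_per_symbol)   # 10.0 bits per channel use, not infinite
```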

*(published on 06/19/2012)*

Q:

Describing cell phone traffic (radio communications in the 700-1900 MHz range) in terms of photons is fine, but from a human's perspective, not an easy approach.
A photon exhibits properties of a wave and a particle. In the radio spectrum, you could simply forget about the particle properties and have a perfectly fine understanding by thinking only about waves. You are confusing yourself by imagining photon particles being emitted by your cell phone.
If you'd like to learn more, there are great YouTube videos. Try searching "Susskind Stanford" and "EPR Bell".

- Robert Hadow (age 54)

Morristown, NJ 07960

A:

We agree that for the most part there's no reason to pay attention to quantum graininess in the cell-phone range. When people are pushing communication to its limits, using cryogenic detectors, etc., quantum shot noise can be a limiting factor. See, e.g., this paper on detecting microwave photons:

Mike W.

*(published on 12/16/2013)*