Doppler Shift

Most recent answer: 03/02/2013

Q:
Let's say a satellite leaves Earth headed for Mars broadcasting a continuous signal of high/low beeps. The high beep lasts for 1 second and the low beep lasts for 1 second, and then the pattern repeats. The satellite is traveling 'very slowly', so its internal clock is the same as the Earth receiver's. It keeps broadcasting its high and low signal every two seconds, and we keep receiving a high and low signal every two seconds. Let's say it takes the satellite 5 yrs to reach Mars. It will have broadcast a continuous signal of roughly 80 million high and low beeps, and we would have received a continuous signal of roughly 80 million high and low beeps. How does one account for the (8 minute?) time delay if their internal clocks remain the same? If it stops sending the signal right when it reaches Mars, it would have sent its signal for 5 yrs, and we would have received a signal for 5 yrs and 8 min, even though our internal clocks remain the same.
- Jesse (age 30)
mpls, mn
A:
That's a nice question.

So where did the 8 min go? The times between the arrivals of each blip are slightly stretched, because each blip has a little farther to travel. But you might say that if it's going slowly enough (low v), that's hardly any stretch. Yes, but the slower it goes the more blips there are before it arrives. Each two-second gap is stretched by about 2v/c seconds, and over a trip of distance D there are about (D/v)/2 gaps, so the total stretch is D/c, the light travel time, about 8 minutes here, no matter how small v is.
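A quick numerical sketch of that argument (not from the original answer): below, the distance D is taken to be 8 light-minutes and the two satellite speeds are arbitrary illustrative values. For each speed, we compute when the last blip is emitted and when it arrives; the accumulated delay comes out to D/c either way.

```python
c = 3.0e8  # speed of light, m/s (rounded)

def total_delay(D, v, beep_period=2.0):
    """Return (last arrival time) - (last emission time) for a satellite
    that beeps every beep_period seconds while covering distance D at speed v.
    This is the total stretch accumulated over all the inter-blip gaps."""
    trip_time = D / v
    n_periods = int(trip_time // beep_period)
    last_emission = n_periods * beep_period        # when the final blip leaves
    distance_at_emission = v * last_emission       # how far away it is by then
    last_arrival = last_emission + distance_at_emission / c
    return last_arrival - last_emission

D = 1.44e11            # meters: 8 light-minutes, an assumed Earth-Mars distance
slow = total_delay(D, 1_000.0)    # 1 km/s: a long trip with many blips
fast = total_delay(D, 10_000.0)   # 10 km/s: a short trip with fewer blips
print(slow / 60, fast / 60)       # both come out to about 8 minutes
```

The slower satellite stretches each gap less but sends proportionally more blips, so the product is the same: the delay is just the light travel time D/c.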

What about the relativistic time effect? Why doesn't that argument keep it important even for slow trips? The reason is that the relativistic effect is proportional to (v/c)², to lowest order, so even though the trip time goes as 1/v, the net effect goes as (1/v)·(v/c)², i.e. as v, and vanishes for slow trips.
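To see that scaling numerically, here's a small sketch (same assumed distance D of 8 light-minutes as above, speeds chosen for illustration). The accumulated clock difference over the whole trip is trip_time × (1 − √(1 − (v/c)²)) ≈ (D/v) · v²/(2c²) = Dv/(2c²), which shrinks in proportion to v:

```python
import math

c = 3.0e8      # speed of light, m/s (rounded)
D = 1.44e11    # assumed trip distance: 8 light-minutes, in meters

def relativistic_deficit(v):
    """Total amount the moving clock falls behind over the whole trip:
    trip_time * (1 - sqrt(1 - (v/c)^2)), which is about D*v/(2*c**2)."""
    trip_time = D / v
    return trip_time * (1.0 - math.sqrt(1.0 - (v / c) ** 2))

for v in (10_000.0, 1_000.0, 100.0):
    # halving-by-ten the speed cuts the accumulated deficit by ten
    print(v, relativistic_deficit(v))
```

So unlike the Doppler delay, which stays fixed at D/c, the relativistic clock difference dies away linearly as the trip gets slower.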

The effect that you've asked about is really just the Doppler effect, and to lowest order it goes as v/c. Since the trip time goes as 1/v, the accumulated delay stays fixed at the light travel time D/c, even as v gets small.

Mike W.

(published on 03/02/2013)