Q:

Is the universe expanding at a steady rate, or an accelerated one?

- Harriet

New Zealand

A:

Thanks for the question, Harriet.

The universe is expanding at an accelerated rate. There are several pieces of evidence, but the simplest one comes from comparing the distance to stars with the rate at which they're moving away from us. If (for a fixed time from the start) the distance were just a constant times the velocity, that would fit a simple picture where each object has been moving away from us at its own fixed rate.
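To make that no-acceleration picture concrete, here's a tiny numerical sketch (using an illustrative age for the universe, not a measured value) showing that coasting objects give a distance exactly proportional to velocity:

```python
# A toy check of the no-acceleration picture: if every galaxy has coasted
# at its own fixed speed since t = 0, then after a time T its distance is
# simply d = v * T, so distance / velocity is the same constant for all.
T = 14.0e9  # assumed age in years (illustrative round number)

# hypothetical recession speeds in light-years per year (fractions of c)
velocities = [0.001, 0.01, 0.05, 0.1]

for v in velocities:
    d = v * T  # distance in light-years after coasting for time T
    print(f"v = {v:5.3f} c  ->  d = {d:.2e} ly,  d/v = {d/v:.2e} yr")
# d/v comes out the same for every object: a straight-line distance-velocity plot.
```

Any deviation of real data from that straight-line relation is a sign that the expansion rate has changed over time.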

We actually have very good measures of how fast stars were moving away from us when they emitted the light we're now seeing. The frequencies of the special colors emitted by atoms shift depending on that rate (the Doppler shift), and those shifts can be measured accurately.
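As an illustration of how that works, here's a minimal sketch of converting a shifted spectral line into a recession velocity. The rest wavelength is the real hydrogen-alpha line; the observed wavelength is a made-up number for illustration, and the simple velocity formula holds only for small shifts:

```python
# Reading a recession velocity off a Doppler-shifted spectral line.
c = 299792.458          # speed of light, km/s
lambda_rest = 656.28    # nm, hydrogen-alpha laboratory wavelength
lambda_obs = 670.0      # nm, hypothetical observed wavelength

# redshift z: fractional shift of the line
z = (lambda_obs - lambda_rest) / lambda_rest
# for z << 1 the recession velocity is approximately c * z
v_approx = c * z
print(f"z = {z:.4f}, v = {v_approx:.0f} km/s")
```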

The hard part is to figure out how far away the star was when it emitted the light we're now seeing. Astronomers determine this using the brightness of objects called 'standard candles', especially type Ia supernovas. These supernovas emit about the same amount of light, and the differences from the average correlate well with the duration of the supernova's initial bright period. Since that duration can be measured, a good estimate of the light emission can be obtained. Then how bright the supernova appears tells us how far away it was.
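The distance step can be sketched with the inverse-square law. The luminosity and flux below are illustrative round numbers, not real measurements:

```python
import math

# Standard-candle distance: if the intrinsic luminosity L is known and the
# flux F is measured, the inverse-square law  F = L / (4 pi d^2)  gives
# d = sqrt(L / (4 pi F)).
L = 1.0e36   # watts; rough illustrative peak luminosity
F = 1.0e-15  # watts per square meter; assumed measured flux

d = math.sqrt(L / (4 * math.pi * F))   # distance in meters
d_ly = d / 9.461e15                    # convert meters to light-years
print(f"d = {d:.2e} m = {d_ly:.2e} ly")
```

The same measured flux with a brighter assumed candle would place the object farther away, which is why calibrating the intrinsic brightness matters so much.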

The plot of distance versus velocity doesn't quite fit the no-acceleration picture. It looks like in the first few billion years the expansion was slowing down, but for the last few billion it's been speeding up.

Great question,

-Zach and mbw

*(published on 12/12/2009)*

Q:

I'm trying to understand this and would appreciate your help. My understanding is that the further away a galaxy is, the faster it's moving away from us. Considering that the further away a galaxy is, the further back in time we are seeing, how do we know from such "old" measurements that it's accelerating? Couldn't the argument be made that since the closer (newer) observations are slower than the distant (older) observations, it's slowing down?

- Chad (age 38)

Asheboro NC

A:

Figuring out the acceleration requires a careful comparison of the redshifts and the apparent distances, based on brightness. Say that there was no acceleration. Then for smallish distances, the redshift and the distance would be exactly proportional. For larger distances, there's a slightly more complicated relativistic formula, but still just a calculation of how distance should depend on redshift. So to judge acceleration people look at the *differences* from that calculated curve. For smallish redshifts, objects seem to be too close, as if they weren't moving away that quickly until recently. That means the expansion has been accelerating. For big redshifts, the effect reverses, indicating that the expansion was decelerating a few billion years ago. That's just what's expected if there's a constant background acceleration. Recently, with the matter whose gravity causes deceleration being all spread out, the acceleration wins. When the matter was more concentrated, the deceleration was winning.
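That comparison can be sketched numerically. The code below computes the luminosity distance as a function of redshift for a flat universe with dark energy (which accelerates) and for an empty, coasting universe (which neither accelerates nor decelerates), using illustrative round parameter values; the gap between the two curves is the kind of deviation astronomers examine:

```python
import math

# Luminosity distance d_L(z) = (1+z) * c * integral_0^z dz'/H(z'),
# computed for a flat matter + dark-energy universe, versus the empty
# (coasting) universe, where d_L = (c / 2 H0) * z * (z + 2).
# Parameter values are illustrative round numbers, not fitted ones.
c = 299792.458   # km/s
H0 = 70.0        # km/s/Mpc, assumed Hubble constant

def H(z, omega_m=0.3, omega_lambda=0.7):
    """Expansion rate in a flat matter + dark-energy universe."""
    return H0 * math.sqrt(omega_m * (1 + z)**3 + omega_lambda)

def d_L_accel(z, steps=1000):
    """Luminosity distance (Mpc) by midpoint-rule integration of c/H."""
    dz = z / steps
    integral = 0.0
    for i in range(steps):
        z_mid = (i + 0.5) * dz
        integral += c / H(z_mid) * dz
    return (1 + z) * integral

def d_L_coasting(z):
    """Luminosity distance (Mpc) in an empty, non-accelerating universe."""
    return (c / (2 * H0)) * z * (z + 2)

for z in (0.1, 0.5, 1.0):
    print(f"z = {z}: dark energy {d_L_accel(z):7.0f} Mpc, "
          f"coasting {d_L_coasting(z):7.0f} Mpc")
```

At small redshift the two curves agree (both reduce to distance proportional to redshift); at larger redshift the dark-energy model puts objects farther away, so standard candles look dimmer than the coasting picture predicts.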

Here's a figure showing how the apparent distance depends on redshift, and then showing the small deviation from what would be expected in a no-acceleration picture.

Mike W.

*(published on 08/14/2013)*