Sure. The wattage rating on a bulb tells you roughly how many watts of power it draws when connected directly to a standard 115 V (or so) power supply. The power equals the current I times the voltage V. Since I=V/R, where R is the resistance, you can also write the power as V^2/R or as I^2*R. So the higher-wattage bulb is the one with the lower resistance.
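To make that concrete, here's a small sketch (using the simple fixed-resistance model and the 115 V supply mentioned above) that turns each bulb's wattage rating into a resistance:

```python
# Resistance implied by a bulb's wattage rating, using R = V^2 / P.
# Assumes the simple fixed-resistance model from the text and a 115 V supply.
V = 115.0  # supply voltage in volts

def resistance(watts):
    """Resistance of a bulb rated at `watts` on a 115 V supply."""
    return V**2 / watts

R30 = resistance(30)  # about 441 ohms
R60 = resistance(60)  # about 220 ohms
print(R30 / R60)      # the 30 W bulb has exactly twice the resistance
```

Note that the rated wattages are in a 1:2 ratio, so the resistances come out in a 2:1 ratio, which is what the series argument below relies on.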
If you put two bulbs in series, their combined resistance is bigger than either one separately. Therefore less current flows through them than would flow through either bulb hooked up normally. In this case, the 30 W bulb has twice the resistance of the 60 W bulb, so the total resistance is 1.5 times that of the 30 W bulb and 3 times that of the 60 W bulb. So the 30 W bulb now gets only 2/3 as much current as it does when plugged in all by itself, and the 60 W bulb gets only 1/3 of the current it gets when plugged in all by itself. Since power goes as the square of the current, the 30 W bulb will draw (2/3)^2 of its rated power, about 13.3 W, and the 60 W bulb will draw only (1/3)^2 of its rated power, about 6.7 W. The 30 W bulb will be a lot less bright than usual, but the 60 W bulb will barely glow at all.
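Here's the same series calculation worked out numerically (again under the fixed-resistance assumption, with a 115 V supply):

```python
# Two bulbs in series on a 115 V supply, fixed-resistance model.
V = 115.0
R30 = V**2 / 30   # resistance of the 30 W bulb
R60 = V**2 / 60   # resistance of the 60 W bulb

I = V / (R30 + R60)   # the same current flows through both bulbs in series
P30 = I**2 * R30      # power dissipated in the 30 W bulb
P60 = I**2 * R60      # power dissipated in the 60 W bulb

print(round(P30, 1), round(P60, 1))  # → 13.3 6.7
```

The bulb with the higher resistance (the 30 W one) dissipates more power, because the same current flows through both and power is I^2*R.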
By the way, I've made a somewhat false simplification in this discussion: I've pretended that each bulb has a fixed R regardless of how much current goes through it. Actually, as a bulb's filament heats up, its resistance goes way up. However, the general idea is the same as the one I described in simple terms.
Another fact that helps explain what you see is that the visible light output depends very strongly on how hot the filament gets. If the filament isn't hot enough, most of the light it emits is invisible infrared. I bet the barely glowing 60 W bulb looks pretty reddish, not white, because its filament isn't hot enough to give off the blue part of the visible spectrum.
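You can estimate this with Wien's displacement law, which gives the wavelength where a hot object's emission peaks. The filament temperatures below are typical figures I've assumed for illustration, not values from the discussion above:

```python
# Wien's displacement law: peak emission wavelength of a filament,
# treated as a rough blackbody. The temperatures are assumed typical
# values: ~2800 K for a normally powered filament, much cooler when
# the bulb is starved of current in the series circuit.
WIEN_B = 2.898e-3  # Wien's displacement constant, in meter-kelvins

def peak_wavelength_nm(T_kelvin):
    """Wavelength of peak emission, in nanometers."""
    return WIEN_B / T_kelvin * 1e9

print(round(peak_wavelength_nm(2800)))  # ~1035 nm: already peaks in the infrared
print(round(peak_wavelength_nm(1500)))  # ~1932 nm: even deeper in the infrared
```

Even a normally powered filament peaks in the infrared; cooling it pushes the peak further out, so the small visible tail that remains is mostly at the red end of the spectrum, which is why the dim bulb looks reddish.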
(published on 10/22/2007)