I know this doesn't exactly answer the question, but part of the theory behind using the bulb in series is that the filament's resistance increases with temperature, and its temperature in turn depends on the current being drawn through it.
For example, I just measured the resistance of a 40 watt bulb.
It's 24 Ohms at room temperature.
If that remained constant, it would draw 5 amps at 120V, or 600 Watts.
At full operating temperature, though, the same bulb has to present about 360 Ohms to draw ~0.333 A and dissipate its rated 40 W. I don't think the resistance change over that range is strictly linear, either.
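If it helps to see that arithmetic spelled out, here's a quick Python sketch of the cold-versus-hot numbers (my own scribble, nothing more): the only measured value is the 24-Ohm cold reading from above, and the rest is just Ohm's law and P = V*I at a nominal 120 V line.

[code]
# Sanity check of the cold vs. hot bulb numbers quoted above.
# Only the 24-ohm cold reading is measured; everything else is
# Ohm's law (I = V/R) and power (P = V*I) at a nominal 120 V line.

LINE_VOLTAGE = 120.0   # volts, US mains as in the example

def current_and_power(resistance_ohms):
    """Current and power for a fixed resistance across the line."""
    current = LINE_VOLTAGE / resistance_ohms
    return current, LINE_VOLTAGE * current

# Cold filament: 24 ohms at room temperature
i_cold, p_cold = current_and_power(24.0)
print(f"cold: {i_cold:.2f} A, {p_cold:.0f} W")   # ~5.00 A, ~600 W

# Hot filament: back-calculated from the 40 W rating
i_hot = 40.0 / LINE_VOLTAGE                      # ~0.333 A
r_hot = LINE_VOLTAGE / i_hot                     # ~360 ohms
print(f"hot:  {i_hot:.3f} A, {r_hot:.0f} ohms")
[/code]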
The point here is that the bulb should not glow much, if at all, or the voltage passed on to the load (the amp) will drop below what it needs to operate properly.
The exception, of course, is troubleshooting a problem that's causing excessive current draw and blowing fuses.
Choice of wattage for the bulb will depend on normal idle current requirements for the amp under test.
Start small, possibly as low as 15 watts, and work up till the bulb no longer glows more than just dimly.
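For a rough feel for what each bulb size buys you as protection, here's another small Python sketch. This is my own back-of-the-envelope reasoning, not anything official: if the amp is effectively a dead short, nearly the full line voltage ends up across the bulb, so it runs at about its rated power and limits the fault current to roughly P_rated / V_line. That's why starting with a small bulb is the safer move.

[code]
# Back-of-the-envelope estimate (assumption: with a shorted load the bulb
# sees close to the full line voltage, so it runs near rated power).
# Smaller bulbs therefore clamp a fault to a smaller current.

LINE_VOLTAGE = 120.0
BULB_WATTAGES = [15, 25, 40, 60, 75, 100]   # common incandescent sizes

for watts in BULB_WATTAGES:
    fault_limit = watts / LINE_VOLTAGE       # approximate worst-case current
    print(f"{watts:3d} W bulb limits a shorted amp to about {fault_limit:.2f} A")
[/code]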
Or just save all the hassle and get a "variac" with a nice meter.