I thought I'd chime in, mostly because I love lighting my projects and love seeing LED-lit projects. If you're curious how to select a bias resistor for the LEDs, here's the method I use:
A) Select your LED.
B) Locate a datasheet for your LED.
C) Choose an LED current.
i. The datasheet will usually list a recommended value, but nobody said you can't go lower.
ii. Do not exceed the maximum, or you'll see bright, pretty colors followed quickly by a dead LED.
iii. I usually find somewhere in the 10-20 mA range to be quite acceptable for a 5 mm LED.
D) Locate the I-V (forward current vs. forward voltage) curve on the datasheet.
i. Find the current you chose on the curve and read off the corresponding forward voltage.
E) Now you know how many volts will drop across your LED at the current you chose (the voltage you just looked up). Subtract this from your supply voltage (I'm assuming 6.3 V, since this is on your heaters); the difference is the voltage that will be dropped across your bias resistor.
F) Compute the resistor value with Ohm's law: R = V/I, where V is the voltage across the resistor from part E and I is the current you chose in part C. (There's a worked sketch of this right after the list.)
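To make that concrete, here's a minimal worked sketch in Python. The numbers are assumptions for illustration (a red 5 mm LED dropping about 2.0 V at 15 mA); pull the real forward voltage and current from your own datasheet per steps C and D.

```python
# Steps E and F as arithmetic. All values below are assumed examples --
# substitute the forward voltage and current from YOUR LED's datasheet.

V_SUPPLY = 6.3    # volts -- the 6.3 V heater supply assumed above
V_FORWARD = 2.0   # volts -- assumed drop for a red 5 mm LED at the chosen current
I_LED = 0.015     # amps  -- 15 mA, middle of the 10-20 mA range

v_resistor = V_SUPPLY - V_FORWARD   # step E: what's left for the resistor
r_bias = v_resistor / I_LED         # step F: Ohm's law, R = V / I
p_resistor = v_resistor * I_LED     # worth checking against the resistor's power rating

print(f"Bias resistor: {r_bias:.0f} ohms")           # ~287 ohms with these numbers
print(f"Dissipation: {p_resistor * 1000:.1f} mW")    # well under a 1/4 W resistor's rating
```

You won't find a 287-ohm resistor on the shelf, so round to a nearby standard value: 330 ohms gives about 13 mA and 270 ohms about 16 mA, both comfortably inside the 10-20 mA range.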
If you want more than one LED and are concerned about power, you can wire multiple LEDs in series, so long as the total voltage across them (the voltage from part D multiplied by the number of LEDs) stays below your rail voltage, with some headroom left over for the resistor. To find the bias resistor, subtract that total LED drop from the supply voltage and apply Ohm's law as before (sketch below). I'm guessing it wouldn't be too hard to get 3 LEDs in series running off a 6.3 V heater supply, depending on the color. A series string draws 1/3 the power of the same three LEDs in parallel, biased to the same current.
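Here's the series version of the same arithmetic, again with assumed numbers (1.9 V per LED is a guess for a red LED at 15 mA; blue or white LEDs drop roughly 3 V each and won't stack three-deep on a 6.3 V rail):

```python
# Series-string sketch. V_FORWARD and I_LED are assumed example values --
# read the real ones off your LED's datasheet curve.

V_SUPPLY = 6.3
V_FORWARD = 1.9   # volts per LED, assumed for a red LED at the chosen current
I_LED = 0.015     # amps (15 mA)
N_LEDS = 3

v_string = V_FORWARD * N_LEDS   # total drop across the series string
if v_string >= V_SUPPLY:
    raise ValueError("String drops more than the rail; use fewer LEDs or a lower-Vf color")

r_bias = (V_SUPPLY - v_string) / I_LED
print(f"String drops {v_string:.1f} V, leaving {V_SUPPLY - v_string:.1f} V "
      f"for a {r_bias:.0f} ohm resistor")   # 5.7 V string, 0.6 V left, ~40 ohms
```

Note how little headroom is left: with only about 0.6 V across the resistor, the current becomes quite sensitive to the exact forward voltage from part to part, which is why the color (and its forward voltage) matters so much here.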
Can't wait to see your tubes glow!