Another LED Circuit Design Question - higher voltage and bigger resistor vs lower voltage and smaller resistor

I'm an avid DIYer recycling some LEDs from a TV. The TV runs 50 LEDs in series, and its PSU is rated 160 V @ 400 mA, so my target is 3.2 V @ 400 mA per LED in my own circuit. I'd like to drive them from my 12 V constant-voltage power supply, which is adjustable from 10 V up to 14 V.

I understand how to calculate the resistor size for a given series string of LEDs; tons of articles explain that. I also understand that LEDs have a steep I-V curve, so a series resistor acting as a current limiter is highly recommended (or required, depending on who you ask). What I don't understand is what happens to the circuit when the supply voltage fluctuates.

Take two sample circuit designs for my case:

  1. I could wire 3 LEDs per string, supply 12.08 V, and use a 6.2 Ω resistor. That's only 79% efficient, requires a 2 W resistor, and generates a good bit of heat.
  2. I could wire 4 LEDs per string, supply 13.2 V, and use a 1 Ω resistor. That's 97% efficient, cheaper, easier, and runs cooler.
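To double-check the numbers behind those two options, here's a quick sketch of the arithmetic I used, assuming a flat 3.2 V forward drop per LED at the 400 mA operating point (`series_design` is just a helper name I made up):

```python
# Resistor sizing for n LEDs in series from a constant-voltage supply,
# assuming each LED drops 3.2 V at the 400 mA operating point
# (160 V / 50 LEDs from the TV's PSU rating).

def series_design(n_leds, v_supply, i_ma=400, v_led=3.2):
    """Return (resistor ohms, resistor watts, efficiency) for one string."""
    i = i_ma / 1000.0
    v_drop = v_supply - n_leds * v_led   # voltage the resistor must absorb
    r = v_drop / i                       # Ohm's law: R = V / I
    p = v_drop * i                       # power dissipated in the resistor
    eff = (n_leds * v_led) / v_supply    # fraction of supply power reaching LEDs
    return r, p, eff

print(series_design(3, 12.08))  # ~6.2 ohms, ~1 W dissipated, ~79% efficient
print(series_design(4, 13.2))   # ~1 ohm, ~0.16 W dissipated, ~97% efficient
```

The ~1 W figure for option 1 is why I spec a 2 W resistor there (derating for headroom).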

How do I calculate the effect of supply voltage fluctuations on each of those (or any of the countless alternative) circuit designs? How much protection does the nice, efficient 1 Ω resistor actually provide? Would that really be any better than just under-supplying 12.4 V to 4 LEDs with no resistor at all (which still gives ample brightness)?
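This is my best attempt so far at modeling the fluctuation question, treating each LED near its operating point as a small dynamic resistance r_d, so a supply change splits across the resistor and the string: ΔI ≈ ΔV / (R + n·r_d). The 1 Ω value for r_d below is purely a guess for illustration, not a datasheet figure, so the absolute numbers may be off even if the comparison between designs holds:

```python
# Small-signal estimate of how a supply fluctuation delta_v turns into a
# current change: delta_I ≈ delta_v / (R + n * r_d), where r_d is each
# LED's dynamic resistance near the operating point. r_d = 1 ohm is an
# assumed illustrative value, not taken from any datasheet.

def current_swing_ma(delta_v, r_series, n_leds, r_d=1.0):
    """Approximate current change (mA) for a supply change of delta_v volts."""
    return delta_v / (r_series + n_leds * r_d) * 1000

print(current_swing_ma(0.5, 6.2, 3))  # 3-LED design: ~54 mA per 0.5 V swing
print(current_swing_ma(0.5, 1.0, 4))  # 4-LED design: ~100 mA per 0.5 V swing
print(current_swing_ma(0.5, 0.0, 4))  # no resistor:  ~125 mA per 0.5 V swing
```

If that model is roughly right, the 1 Ω resistor only somewhat softens the swing compared to no resistor at all, while the 6.2 Ω design absorbs fluctuations much better. Is this the right way to think about it?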