\$\begingroup\$

I'm an avid DIY'er recycling some LEDs from a TV. It uses 50 LEDs in series, supplied by 160 V, and the PSU says 160 V @ 400 mA. So my target is to provide 3.2 V @ 400 mA in my own circuit. I would like to drive them with my 12 V constant voltage power supply, which is adjustable from 10 V up to 14 V.

I understand how to calculate resistor size for a given series of LEDs. Tons of articles explain that. I also understand that LEDs have an aggressive I-V curve, so a series resistor acting as a current limiter is highly recommended (required depending on who you ask). What I don't understand is what happens to the circuit upon voltage fluctuations.

Take a couple sample circuit designs for my case:

  1. I could wire 3 LEDs per series, provide 12.08 volts, with a 6.2 ohm resistor. That's only 79% efficient, requires a 2 watt resistor, and generates a good bit of heat.
  2. I could wire 4 LEDs per series, provide 13.2 volts, with a 1 ohm resistor. That's 97% efficient, cheaper, easier, and runs cooler.
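To sanity-check those two options, here is a small sketch (assuming every LED drops exactly 3.2 V at its operating point, which is a guess since the real Vf is unknown):

```python
# Hypothetical check of the two options above, assuming each LED drops
# exactly 3.2 V at its operating point (the real Vf is unknown).

def string_stats(n_leds, v_supply, r_ohms, vf=3.2):
    """Return (current in A, resistor power in W, efficiency) for one string."""
    v_resistor = v_supply - n_leds * vf
    i = v_resistor / r_ohms              # Ohm's law across the resistor
    return i, i * v_resistor, (n_leds * vf) / v_supply

for n, v, r in [(3, 12.08, 6.2), (4, 13.2, 1.0)]:
    i, p_r, eff = string_stats(n, v, r)
    print(f"{n} LEDs, {v} V, {r} ohm: {i * 1000:.0f} mA, "
          f"{p_r:.2f} W in the resistor, {eff:.0%} efficient")
# -> 3 LEDs: 400 mA, 0.99 W, 79%;  4 LEDs: 400 mA, 0.16 W, 97%
```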

A couple of pre-emptive clarifications...

  • I don't know the specifications of the LEDs in question. They are from a Sharp LC-55Q7030U, and I can't find details on what the manufacturer used.

  • And I could use the power supply from the TV, which appears to be a much higher quality current-limited source, but then I have to figure out how to trigger power-on, control its PWM circuit, and then figure out how to do the same for the next 2 different TVs I want to control with the same switch & dimmer. Doing some soldering feels simpler to my under-educated brain.

Question:

How do I calculate the effect of supply voltage fluctuations on each of those (or infinite alternative) circuit design options? How much protection does the nice and efficient 1 ohm resistor actually provide? Would that really be any better than just under-supplying 12.4 V to 4 LEDs without a resistor (which still provides ample brightness)?

\$\endgroup\$

3 Answers

\$\begingroup\$

First - under-supplying (starving) the LEDs with no series resistor is not a good idea. The conduction "knee" of an LED is not perfectly square. With no current limiting, the LEDs probably will fail.

What I don't understand is what happens to the circuit upon voltage fluctuations.

What happens is that the current through the LEDs varies dramatically.

Neither of those options is a good idea. The problem is that the actual forward voltage (Vf) for each LED varies, so the total Vf for a string of 3 or 4 could be off by several tenths of a volt. The larger the voltage drop across the resistor, the more consistent (and safe) the circuit. Try this:

Assume a perfect 3.2 V LED. With Ohm's Law, calculate the resistance value for one resistor, one LED, and 12 V. Then, for the same resistor value, calculate the LED current at 11 V and 13 V. Look at the percentage change in current for the high and low supply values.

Now, do all of that assuming three LEDs in series. You will see a much larger variation in LED current at the high and low supply voltages. A circuit designer for such a system probably has a goal or spec for the maximum current variation allowed for good product performance, and can work backwards through the calculations (or just recalculate for all cases) to determine the optimum resistor value. Then use a resistor of that value with a high enough power rating, and clamp it to a heatsink or place it next to an air vent.
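The exercise above can be worked in a few lines (still assuming ideal 3.2 V, 400 mA LEDs):

```python
# Sketch of the exercise above, assuming ideal 3.2 V LEDs and a 400 mA target.
VF = 3.2        # assumed forward voltage per LED
I_TARGET = 0.4  # amps

for n_leds in (1, 3):
    v_string = n_leds * VF
    r = (12.0 - v_string) / I_TARGET      # resistor sized for 400 mA at 12 V
    for v_supply in (11.0, 12.0, 13.0):
        i = (v_supply - v_string) / r
        print(f"{n_leds} LED(s), {v_supply:.0f} V: {i * 1000:.0f} mA "
              f"({(i - I_TARGET) / I_TARGET:+.0%})")
```

With one LED (22 ohm resistor) the current swings roughly ±11% over the 11-13 V range; with three LEDs (6 ohm resistor) it swings roughly ±42%.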

I suggest you adjust your goals. Driving an LED with a DC voltage source and a resistor is never very efficient; 50% is not considered an excessive loss. The "right" way to stabilize the LED current across power supply variations is to replace the single resistor with a constant-current circuit, and the "right" way to minimize the heat dissipated in the current-limiting function is to use a switching circuit. But each of those options increases circuit complexity.

Update:

Single 3.2V & 400ma LED, 12V source, 22-ohm resistor, +1V source fluctuation gives me resistor current of 445ma, 11% change. 3 of the same LEDs, 6-ohm resistor, +1V source fluctuation give me resistor current of 567ma, 42% change. Is that the right way to look at the impact on the LEDs themselves?

Yes, that is the right way. I haven't checked your numbers, but they smell correct. For whatever the system voltage is, the smaller the percentage of it across the resistor, the larger the effects of voltage changes and individual LED parameter differences on the LED current.

\$\endgroup\$
  • \$\begingroup\$ I just want to confirm I'm understanding. Single 3.2V & 400ma LED, 12V source, 22-ohm resistor, +1V source fluctuation gives me resistor current of 445ma, 11% change. 3 of the same LEDs, 6-ohm resistor, +1V source fluctuation give me resistor current of 567ma, 42% change. Is that the right way to look at the impact on the LEDs themselves? I had been trying to calculate an estimate of if voltage goes up, LED current draw attempt skyrockets, resistor compensates by drawing down voltage, and eventually the system would have to come to some kind of equilibrium. \$\endgroup\$ Commented Jan 21, 2022 at 14:10
\$\begingroup\$

Short answer: the lower the voltage drop across your resistor, the more fluctuations in your supply voltage will "hurt".

That's because the voltage drop across the LEDs can be assumed constant, so the whole voltage change lands on your current-limiting resistor, and the current changes according to Ohm's law. If you have a voltage drop of 1 V across your resistor and your source increases by 1 V, you suddenly have 2 V across the resistor, doubling the current. Now assume you have a 10 V drop on your current-limiting resistor and the input voltage changes by 1 V: the current only changes by around 10%.
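A minimal sketch of that sensitivity, treating the LED string voltage as perfectly constant so every supply change lands on the resistor:

```python
# Assumed: the LED string voltage is constant, so a supply change lands
# entirely on the resistor. Fractional current change = delta_V / V_resistor.
for v_resistor in (1.0, 10.0):
    i_ratio = (v_resistor + 1.0) / v_resistor   # current after a +1 V bump,
                                                # relative to nominal
    print(f"{v_resistor:.0f} V across the resistor: "
          f"+1 V supply -> {i_ratio - 1.0:+.0%} current change")
# -> 1 V: +100%;  10 V: +10%
```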

Also, the voltage across LEDs is highly temperature dependent, so with a really low voltage drop on the resistor you will see massive changes with temperature.

So in conclusion:

  1. You want the voltage drop and resistor value as low as possible, because the drop is just wasted energy and creates heat.
  2. You want the voltage drop on your resistor as high as possible, because this helps with stability.
  3. Find the solution that is optimal for you.

Professional LED sources are current sources: basically switching power supplies that regulate current instead of voltage. This increases stability and reduces waste heat (and thus increases efficiency).

\$\endgroup\$
\$\begingroup\$

White LEDs change their forward voltage by about 4 mV per degree Celsius, so with 3 in series your forward voltage changes 12 mV per degree, or 1.2 V per 100 °C. Unless you can control their temperature precisely, the circuit with 400 mV across the resistor is not going to work very well, because small changes in temperature will have a large effect on brightness.

Even worse, the change in forward voltage is negative, meaning the LEDs will get brighter, dissipate more heat, which will make them even brighter, and thus even hotter in a feedback loop. If you make the resistor too small you can end up in thermal runaway. For this reason, it is common to have the resistor dissipate ~20% of the total power in the circuit.
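To put rough numbers on that mechanism, here is a sketch applied to the 4-LED / 1 ohm option, assuming 3.2 V per LED at room temperature and the -4 mV/°C figure above (both assumed values, not datasheet numbers):

```python
# Assumed values: 3.2 V per LED at 25 C, -4 mV/C tempco, 13.2 V supply, 1 ohm.
N_LEDS, VF_25C, TEMPCO = 4, 3.2, -0.004   # tempco in V per degree C, per LED
V_SUPPLY, R = 13.2, 1.0

for delta_t in (0, 20, 40):               # temperature rise above 25 C
    vf_string = N_LEDS * (VF_25C + TEMPCO * delta_t)
    i = (V_SUPPLY - vf_string) / R        # all the slack lands on the 1 ohm
    print(f"+{delta_t:>2} C: string Vf = {vf_string:.2f} V, I = {i*1000:.0f} mA")
```

Under these assumptions a 40 °C rise takes the string from 400 mA to over 1 A, which is exactly the feedback loop described above.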

If you need higher efficiency than that, buy or build a constant current driver. This can adjust for temperature effects without having to dissipate heat.

\$\endgroup\$
