Another LED Circuit Design Question - higher voltage and bigger resistor vs lower voltage and smaller resistor

I'm an avid DIY'er recycling some LEDs from a TV. It uses 50 LEDs in series, supplied by 160 V, and the PSU says 160 V @ 400 mA. So my target is to provide 3.2 V @ 400 mA in my own circuit. I would like to drive them with my 12 V constant voltage power supply, which is adjustable from 10 V up to 14 V.

I understand how to calculate resistor size for a given series of LEDs. Tons of articles explain that. I also understand that LEDs have an aggressive I-V curve, so a series resistor acting as a current limiter is highly recommended (required depending on who you ask). What I don't understand is what happens to the circuit upon voltage fluctuations.

Take two sample circuit designs for my case:

  1. I could wire 3 LEDs per string, supply 12.08 V, with a 6.2 ohm resistor. That's only 79% efficient, requires a 2 W resistor, and generates a good bit of heat.
  2. I could wire 4 LEDs per string, supply 13.2 V, with a 1 ohm resistor. That's 97% efficient, cheaper, easier, and runs cooler.
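For reference, here's the arithmetic behind those two options as a rough sketch. The 3.2 V forward voltage per LED is my own estimate (160 V across 50 series LEDs), not a datasheet value:

```python
# Rough series-resistor sizing for each option.
# Assumption: each LED drops about 3.2 V at the 400 mA target,
# estimated from the TV's 160 V supply across 50 series LEDs.
VF = 3.2    # assumed forward voltage per LED (V)
I = 0.400   # target string current (A)

def sketch(n_leds, v_supply):
    v_led = n_leds * VF       # total LED drop (V)
    v_res = v_supply - v_led  # voltage left across the resistor (V)
    r = v_res / I             # resistor value (ohms)
    p_res = v_res * I         # resistor dissipation (W)
    eff = v_led / v_supply    # fraction of supply power reaching the LEDs
    return r, p_res, eff

for n, v in [(3, 12.08), (4, 13.2)]:
    r, p, eff = sketch(n, v)
    print(f"{n} LEDs @ {v} V: R = {r:.2f} ohm, P = {p:.2f} W, eff = {eff:.0%}")
```

That's where the 6.2 ohm / 79% and 1 ohm / 97% numbers come from; the 2 W rating is just the ~1 W dissipation with headroom.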

A couple of pre-emptive clarifications...

  • I don't know the specifications of the LEDs in question. They are from a Sharp LC-55Q7030U, and I can't find details on what the manufacturer used.

  • I could use the power supply from the TV, which appears to be a much higher-quality current-limited source, but then I'd have to figure out how to trigger power-on, control its PWM circuit, and then do the same for the next 2 different TVs I want to control with the same switch & dimmer. Doing some soldering feels simpler to my under-educated brain.

Question:

How do I calculate the effect of supply voltage fluctuations on each of those (or infinite alternative) circuit design options? How much protection does the nice and efficient 1 ohm resistor actually provide? Would that really be any better than just under-supplying 12.4 V to 4 LEDs without a resistor (which still provides ample brightness)?
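To make the question concrete, this is the kind of sensitivity estimate I'm after. The small-signal form dI ≈ dV / (R + n·r_d) is my understanding of the approach; the per-LED dynamic resistance r_d here is a pure guess, since I have no datasheet:

```python
# Small-signal estimate of current change vs. supply fluctuation:
# dI ≈ dV / (R + n * r_d), where r_d is each LED's dynamic
# resistance near the operating point. r_d is a GUESS (no datasheet).
R = 1.0    # series resistor (ohms), the 4-LED option
N = 4      # LEDs in the string
R_D = 0.5  # assumed dynamic resistance per LED (ohms) -- hypothetical

for dv in (0.1, 0.5, 1.0):    # supply fluctuation (V)
    di = dv / (R + N * R_D)   # resulting current change (A)
    print(f"+{dv} V -> +{di * 1000:.0f} mA")
```

If something like that is right, even a small supply wobble moves the current a lot when the total resistance is only a few ohms, which is exactly what I'd like to learn to quantify properly.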
