I've always found circuits containing LEDs hard to understand, so please bear with me. I know most people find them easy, but some of my assumptions might not be correct; please correct me if that's the case.
So onto the question: since LEDs are, after all, diodes, they conduct once forward-biased and drop roughly their forward voltage, right? Which is why we need a series current-limiting resistor to set the current that flows through the circuit. (I originally wrote "pull-down resistor", but I believe the correct term here is a series current-limiting resistor.)
For example, let's say we have an LED with a Vf of 2 V and an operating current of 20 mA. (I think those numbers are reasonable, right? Again, if not, please let me know.) And our power supply is a constant 4 V. This means we need the resistor to drop the remaining 2 V at 20 mA, so it would be a 100 Ω resistor, dissipating 40 mW. That's a tiny power loss in absolute terms, but half of the power supplied is wasted as heat. So in this case, isn't the best-case efficiency 50%? Which isn't really efficient by DC power supply standards, I would have thought.
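To make the arithmetic concrete, here is a short Python sketch of the calculation above, using the values assumed in my example (4 V supply, 2 V forward drop, 20 mA target current):

```python
# Hypothetical values from the example above.
V_SUPPLY = 4.0   # V, constant supply voltage
V_FORWARD = 2.0  # V, assumed LED forward voltage drop
I_LED = 0.020    # A, target operating current (20 mA)

# The series resistor must drop whatever voltage the LED doesn't,
# at the target current (Ohm's law).
v_resistor = V_SUPPLY - V_FORWARD          # 2.0 V across the resistor
r = v_resistor / I_LED                     # 100 ohms

p_resistor = v_resistor * I_LED            # 0.040 W wasted as heat
p_led = V_FORWARD * I_LED                  # 0.040 W delivered to the LED
efficiency = p_led / (p_led + p_resistor)  # 0.5, i.e. 50%

print(f"R = {r:.0f} ohm, resistor loss = {p_resistor * 1000:.0f} mW, "
      f"circuit efficiency = {efficiency:.0%}")
```

Since the same 20 mA flows through both components, the power split is just the voltage split: 2 V of 4 V across the resistor means 50% of the supplied power is lost there.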
So when people refer to LEDs' high efficiency, are they referring to the fact that the LEDs themselves convert the power they consume into light efficiently, or are they considered efficient even after accounting for the 50% maximum wall-plug efficiency of a circuit like this?
Or is it just that I've picked an example that happens to be a poor circuit design, one that would never be used in production applications?