\$\begingroup\$

I was recently under the false impression that using a DC-DC converter would provide constant output voltage regardless of load. I found that as I increased load, the output would decrease. For example, on a converter adjusted to 13V no load, it dropped to 12.2V with a 1.2A load.

I found this also on both buck and boost converters I tried. I tested using the "Amazon" variety of converters such as:

https://www.amazon.com/gp/product/B07VVXF7YX (buck)

https://www.amazon.com/gp/product/B081X5YX8V (buck)

https://www.amazon.com/gp/product/B07T137H2Y (boost)

Is this just because they are cheap converters? Is there something better I can buy? I need to supply a load which can vary between 1A to 4A.

I am also wondering why they don't work as expected. All my searches on this problem go into detail on how a converter works, but don't address the lack of load regulation. If the output voltage is being monitored and the regulator's duty cycle is being corrected to hold a given voltage at the output pins, why would a change in load change the voltage at those pins? Why wouldn't it just correct itself and maintain the voltage it was adjusted for at no load?

\$\endgroup\$
  • \$\begingroup\$ Ignoring the fact they are random boards from Amazon without a datasheet, you're not giving us some important information: what power supply are you using to power the modules? Which of those three modules you linked to are you using? \$\endgroup\$ Commented May 30, 2023 at 11:24
  • \$\begingroup\$ How do I get constant voltage using a DC-DC converter regardless of load? <-- you can't, there will always be an error and some designs are worse than others. You get what you pay for. \$\endgroup\$ Commented May 30, 2023 at 11:26
  • \$\begingroup\$ Did the input voltage drop as well? \$\endgroup\$ Commented May 30, 2023 at 11:56
  • \$\begingroup\$ Power supply is a 20 Ah battery. \$\endgroup\$ Commented May 30, 2023 at 12:30
  • \$\begingroup\$ Please measure the input voltage with your multimeter when you load your power supply. \$\endgroup\$ Commented May 30, 2023 at 12:48

5 Answers

\$\begingroup\$

How do I get constant voltage using a DC-DC converter regardless of load?

If you cannot change aspects of the DC-DC converter, then this cannot be improved.

There is resistance in the wiring between the DC-DC converter and the load, which is likely where the "missing volts" are going.

Three things can be done:

  1. Increase the wire gauge so that there is less resistance. If size/weight is an issue, silver-coated wire is more conductive than copper.
  2. Modify the DC-DC converter so that it measures the voltage at the load. This is called a "four-wire" system because two thicker wires are used to supply the load, and two thinner wires come from the load back to the DC-DC to sense the actual volts getting to it. So in this case, the DC-DC will output higher than 12 Volts to actually get 12 V to the load.*
  3. Change the DC-DC converter to one with a better line/load regulation characteristic.

* This also depends on what the load is. Many DC-DC converters cannot directly supply highly reactive (inductive, capacitive) loads. You will know if this is a problem because the voltage will not be stable and may oscillate, even into the MHz range (this will not be visible on a handheld multimeter; you can only see it with an oscilloscope).
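To put rough numbers on point 1, here is a quick sketch. The wire lengths are assumptions for illustration, and the per-metre resistances are approximate standard AWG values, not measurements of any particular setup:

```python
# Rough estimate of voltage lost in hookup wire between converter and load.
# AWG resistances are approximate standard values at 20 degrees C.
OHMS_PER_METER = {24: 0.0842, 20: 0.0333, 16: 0.0132}

def wire_drop(awg, round_trip_m, current_a):
    """Voltage dropped across both the supply and return conductors."""
    return OHMS_PER_METER[awg] * round_trip_m * current_a

for awg in (24, 20, 16):
    for amps in (1.2, 4.0):
        v = wire_drop(awg, round_trip_m=1.0, current_a=amps)
        print(f"AWG {awg}, 1 m round trip, {amps:.1f} A: {v * 1000:.0f} mV drop")
```

Even a metre of thin wire eats a noticeable fraction of a volt at a few amps, which is exactly what remote (four-wire) sensing compensates for.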

\$\endgroup\$
\$\begingroup\$

It's normal to expect some voltage reduction under load, especially with cheap crap from eBay and Amazon. However, a full volt lost at 1 A is not reasonable, even from the lower-power models.

I think it's most likely your measurement setup is flawed. Consider this:

[schematic: the converter output measured at the far end of the wiring, with connectors and an ammeter in series between the converter and the voltmeter. Schematic originally created using CircuitLab.]

By the time the voltmeter gets to measure anything, a lot of the potential difference has already been dropped across various resistances in the wiring, connectors and even the ammeter.

To get a better idea of what the DC-DC converter is actually producing, make sure you measure its output voltage as close to the converter itself as possible:

[schematic: the voltmeter connected directly across the converter's output terminals, ahead of the wiring and ammeter. Schematic originally created using CircuitLab.]
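As a rough illustration of how much the measurement chain itself can steal, here is a sketch; the lead, connector and ammeter-shunt resistances are assumed order-of-magnitude values, not measurements of any particular meter or module:

```python
# Apparent output voltage seen by a voltmeter at the far end of the
# measurement chain vs. at the converter terminals. All resistances
# below are assumed, order-of-magnitude values.
V_SET = 13.0   # converter set point, volts (from the question)
I_LOAD = 1.2   # load current, amps

series_resistances = {
    "test leads / hookup wire":   0.15,  # ohms, a metre or two of thin wire
    "connectors / clips":         0.05,
    "ammeter shunt (10 A range)": 0.01,
}

total_r = sum(series_resistances.values())
drop = I_LOAD * total_r
print(f"Total series resistance: {total_r * 1000:.0f} mOhm")
print(f"Drop at {I_LOAD} A: {drop:.2f} V")
print(f"Voltmeter at the load end reads ~{V_SET - drop:.2f} V, "
      f"even if the converter holds {V_SET} V at its own terminals")
```

Measuring right at the converter's output pins removes all of these terms from the reading.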

\$\endgroup\$
\$\begingroup\$

In case you've missed this detail: a DC-DC buck converter can only produce an output voltage lower than its input, and it needs a minimum headroom (maybe a couple of volts for this kind of bipolar chip) to do that, possibly a bit more on cheaper modules because the inductor will have a bit more voltage drop. The LM2596 switch drops as much as 1.5 V at 3 A, and although it can go to 100% duty cycle, the inductor is also in series and has some DC resistance.


So if your minimum input voltage is not at least about 15-16 V, that is the first thing to look at.
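A quick sketch of that headroom estimate, using the 1.5 V switch drop quoted above; the inductor DC resistance is an assumed typical value for these modules, not a measured one:

```python
# Minimum input voltage estimate for a non-synchronous buck (LM2596-style)
# running near 100% duty cycle. Apart from the switch drop, the numbers
# are assumptions for illustration.
V_OUT = 13.0        # desired output, volts
I_LOAD = 3.0        # load current, amps
V_SWITCH = 1.5      # switch saturation drop at 3 A (quoted above)
R_INDUCTOR = 0.05   # assumed inductor DC resistance, ohms

v_in_min = V_OUT + V_SWITCH + I_LOAD * R_INDUCTOR
print(f"Minimum input at 100% duty cycle: about {v_in_min:.1f} V")
```

Add a volt or two of margin and you land in the 15-16 V region.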

To get constant output voltage for inputs that may range both above and below the output voltage you need something like a buck-boost converter, which is more complex and generally more expensive as a result.

\$\endgroup\$
\$\begingroup\$

A properly designed DC-DC will provide constant output voltage for its intended load current range, of course with a little bit of switching ripple.

You're getting quite a high voltage drop, so here are some possible causes:

  • Voltage is measured at the end of thin wires with high current.

To get good regulation at high current, wires or traces must be short and the feedback voltage must be taken close to the load. Otherwise the voltage drops in wires or traces add up; the drop in the ground wire also counts.

  • Not enough input-output voltage margin.

This depends on the chip being used; they all have a maximum duty cycle, which is not necessarily 100%. This imposes a minimum input voltage, and the voltage drop across the inductor and MOSFET resistance also requires some margin. For example, to output 13 V from a 14 V input at 4 A, you only have 1 V of headroom, so that would require a chip with at least 95% duty cycle and less than about 70 mΩ of total resistance in the FET and inductor (see the worked sketch after this list).

It is necessary to read the documentation.

  • Input power supply not being able to supply enough current, or dropping voltage under load

  • Current limiting due to overheating

  • Capacitor ESR or ESL too high, capacitance too low (at input and/or output): this increases ripple voltage and can cause all sorts of issues, for example triggering under/over voltage protections, transient input voltage drop making it impossible to provide the required current, etc.

  • Inductor saturation

  • Wrong choice of parts, especially the inductor value, wrong switching frequency, etc.
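To put numbers on the input-output margin point above (13 V out from a 14 V input at 4 A), here is a quick check; the 95% maximum duty cycle is taken from that example, and everything else follows from it:

```python
# Resistance budget for a buck converter running close to its maximum
# duty cycle. Figures are the example values from the answer above.
V_IN = 14.0      # input voltage, volts
V_OUT = 13.0     # desired output, volts
I_LOAD = 4.0     # load current, amps
D_MAX = 0.95     # maximum duty cycle of the controller

v_available = D_MAX * V_IN          # best-case average output before losses
headroom = v_available - V_OUT      # what is left for resistive drops
r_budget = headroom / I_LOAD        # total FET + inductor resistance allowed

print(f"Ideal duty cycle needed: {V_OUT / V_IN:.1%}")
print(f"Headroom at {D_MAX:.0%} duty cycle: {headroom:.2f} V")
print(f"Total resistance budget: {r_budget * 1000:.0f} mOhm")
```

That lands in the 70-75 mΩ region, which is why this case calls for a chip with both a very high maximum duty cycle and very low FET and inductor resistance.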

--

The "LM2596" modules, which are all counterfeit, exhibit most of these issues: the inductor usually saturates, the caps are garbage, the chip isn't even a fake of the correct chip, it overheats, etc.

The boost module is sold as a 6 A part, yet it has two general-purpose electrolytic caps which probably have a maximum ripple current rating of 0.25 A each, so they will die very shortly. Also, the PCB is way too small and has no cooling for this current. It is possible to make a boost converter in this form factor that would not require cooling, but not with a diode and general-purpose caps.

The 20 A buck is perhaps the least suspicious of the bunch; at least the caps have "LOW ESR" printed on them. But, well, P = I²·R, so 20 A is pushing it for a single-phase buck.

\$\endgroup\$
\$\begingroup\$

The error was in:

a) a faulty component in my test setup, and

b) making assumptions rather than examining the entire circuit thoroughly.

Regarding the buck converter: I was using an additional 12 V AGM battery to get my input voltage above the headroom needed for a 13 V output. It turns out that this battery was old and had very high internal resistance; that, coupled with the resistance of the wiring, dropped the input voltage to the point where, after the converter's ~700 mV dropout, it fell below my set output voltage. After eliminating these issues (by using only the good 12 V battery and reducing my output voltage to 10 V), the buck converter (the second one in my list) dropped only about 30 mV when the current increased to 1 A.

As for the error with the boost converter (#3 in the list): I apparently measured incorrectly, because when I tested again everything seemed fine, with only about a 30 mV drop when the current increased to 1.2 A. I have no idea what I did wrong the first time that led me to my conclusion.

So there does not seem to be any problem with either of the more expensive buck and boost converters, at least nothing significant for my needs. (I did not bother to test the cheap buck converter, #1 on the list.)

The solution for my project is actually to use a buck-boost converter, since I need a 13 V output and my input can vary between 11 V and 14.2 V (a 12 V battery with a solar charging system). I tried this one:

https://www.amazon.com/dp/B07VNDGFT6

I get about a 70 mV drop when the load increases to 1.2 A, which is still very acceptable for my purposes. The biggest downside I can see to this converter is that its loss at 1.2 A out is significant: about 450 mA (~5.4 W) of extra input current, roughly a third of the output current/power. That matters in a solar-powered setup, where one is trying to be as efficient as possible to avoid oversizing the system just to cover losses.
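For what it's worth, here is the arithmetic behind that loss figure; the ~12 V nominal input is an assumption (the actual input varies between 11 V and 14.2 V):

```python
# Rough efficiency check for the buck-boost module at the 1.2 A operating
# point described above. The 12 V input is an assumed nominal value.
V_IN = 12.0    # assumed nominal battery voltage
V_OUT = 13.0   # converter output
I_OUT = 1.2    # load current
I_LOSS = 0.45  # extra input current beyond ideal conversion (from above)

p_out = V_OUT * I_OUT                  # power delivered to the load
i_in_ideal = p_out / V_IN              # input current for a lossless converter
p_in = V_IN * (i_in_ideal + I_LOSS)    # actual input power
print(f"Output power: {p_out:.1f} W")
print(f"Loss: {p_in - p_out:.1f} W")
print(f"Efficiency: {p_out / p_in:.0%}")
```

That works out to roughly 74% efficiency, consistent with the ~5.4 W of loss reported above.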

\$\endgroup\$
  • \$\begingroup\$ Just a slightly off-topic side note here: I discovered that my setup is actually more efficient if I use a 12V to 110V 150W inverter, then plug the vendor-supplied power adapters of my load devices into it. With the buck-boost converter I have an average overhead of about 6 watts; with the inverter-to-power adapters setup, my overhead drops to about 2.4 watts. Who'dve thought? \$\endgroup\$ Commented Jun 1, 2023 at 10:54
