I was recently under the false impression that a DC-DC converter would provide a constant output voltage regardless of load. Instead, I found that as I increased the load, the output voltage decreased. For example, a converter adjusted to 13 V at no load dropped to 12.2 V with a 1.2 A load.
I saw the same behavior on both the buck and boost converters I tried, all of the "Amazon" variety, such as:
https://www.amazon.com/gp/product/B07VVXF7YX (buck)
https://www.amazon.com/gp/product/B081X5YX8V (buck)
https://www.amazon.com/gp/product/B07T137H2Y (boost)
Is this just because they are cheap converters? Is there something better I can buy? I need to supply a load that can vary between 1 A and 4 A.
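For context, here is a quick back-of-the-envelope calculation of the droop I measured, treating the converter as an ideal source with a series output resistance (an assumption on my part; I haven't verified the droop is actually linear with load):

```python
# Estimate the converter's effective output resistance from my two
# measurements, then extrapolate (assuming linear droop) to the 4 A
# upper end of my load range.
v_no_load = 13.0   # volts, measured with no load
v_loaded = 12.2    # volts, measured at 1.2 A
i_load = 1.2       # amps

# Effective series output resistance implied by the droop
r_out = (v_no_load - v_loaded) / i_load

# Load regulation expressed as a percentage of the loaded voltage
load_reg_pct = (v_no_load - v_loaded) / v_loaded * 100

# If the droop really were linear, the voltage predicted at 4 A
v_at_4a = v_no_load - r_out * 4.0

print(f"Effective output resistance: {r_out:.2f} ohm")
print(f"Load regulation: {load_reg_pct:.1f} %")
print(f"Predicted output at 4 A: {v_at_4a:.2f} V")
```

If that simple model held, the output would sag well below 12 V at my maximum load, which is why I'm concerned about using these modules as-is.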
I am also wondering why they don't behave as I expected. All my searches on this problem go into detail on how a converter works, but don't address the lack of load regulation. After all, if the feedback loop is monitoring the output voltage and correcting the regulator's duty cycle to hold a given voltage at the output pins, why would a change in load change the voltage at those pins? Why wouldn't the converter simply correct itself and maintain the voltage it was adjusted to at no load?