Debuggers, while useful tools for many things, are by definition primarily for... de-bugging.
>[...] trusting that the choice of language and good practice makes bugs less likely, which then eliminates the need for the debugger.

As you said, even if you trust that your languages/frameworks and good practices make bugs less likely, you haven't _eliminated_ all bugs; you've only reduced the likelihood of their occurrence. Without a debugger (or some similar approach, such as logging), how will you diagnose the bugs that still occur?
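
To make that concrete, here is a minimal (and entirely hypothetical) sketch: a routine that averages sensor readings passes every test, until the hardware occasionally returns something unexpected. Neither the language nor the framework is at fault, and only stepping through with a debugger, or a log line at the right spot, tells you what the "impossible" input actually was. The `average_reading` function and the `"ERR"` value are my own invention for illustration.

```python
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger(__name__)

def average_reading(raw_values):
    """Average a list of raw sensor strings, e.g. ["12.5", "13.0"]."""
    total = 0.0
    for raw in raw_values:
        # A breakpoint or a log line here is what reveals the surprise:
        # the hardware occasionally reports "ERR" instead of a number.
        log.debug("raw value received: %r", raw)
        total += float(raw)  # raises ValueError on "ERR"
    return total / len(raw_values)

if __name__ == "__main__":
    try:
        average_reading(["12.5", "ERR", "13.0"])
    except ValueError:
        # Without the debug trail above, all you'd see is a bare
        # "could not convert string to float" somewhere in the stack.
        log.exception("unexpected raw value from hardware")
```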

Further, if everyone trusted their languages and frameworks 100%, how would defects in the languages and libraries themselves ever be discovered? Open any mainstream project on GitHub and see how many issues are reported.

Good practice can certainly [reduce software defects][1], but even the best practices and tools will never eliminate the utility of a debugger.

I think your answer is in your own comment:
>[...] many of my problems are found when hardware shows unexpected behaviour

The problem with bugs is that we never see them coming!


 [1]: https://softwareengineering.stackexchange.com/a/7936/209046