Debuggers, while useful tools for many things, are by definition primarily for... de-bugging.
[...] trusting that the choice of language and good practice makes bugs less likely, which then eliminates the need for the debugger.
As you said, even if you trust that your languages/frameworks make bugs less likely, you haven't eliminated all bugs; you've only reduced the likelihood of their occurrence.
Further, if everyone trusts their languages and frameworks 100%, how will defects in the languages/libraries themselves be discovered? Open any mainstream project on GitHub and see how many issues are reported.
Good practice can certainly reduce software defects, but even the best practices and tools will never eliminate the utility of a debugger.
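To make that concrete, here's a minimal sketch (my own illustration, not from the original discussion) of the kind of defect that no compiler, type system, or linter will flag. The loop accidentally skips the first element, yet the code compiles cleanly and looks plausible on review; stepping through it in a debugger makes the wrong starting index obvious almost immediately.

/* A logic bug that best practices won't catch: the loop starts at 1
 * instead of 0, so the first sample is silently ignored. */
#include <stdio.h>

static double average(const double *values, int count)
{
    double sum = 0.0;
    /* Bug: should start at i = 0 -- perfectly legal, silently wrong. */
    for (int i = 1; i < count; ++i)
        sum += values[i];
    return sum / count;
}

int main(void)
{
    double samples[] = { 10.0, 10.0, 10.0, 10.0 };
    /* Prints 7.5 rather than the expected 10.0. */
    printf("average = %f\n", average(samples, 4));
    return 0;
}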
I think this can be summarized by your comment:
[...] many of my problems are found when hardware shows unexpected behaviour
The problem with bugs is that we never see them coming!
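As a hedged sketch of that hardware case (the register address and bit layout here are hypothetical, purely for illustration): the loop below assumes the peripheral will eventually raise its READY bit, but a misbehaving or misconfigured device may never set it. Halting at a breakpoint and inspecting the raw register value in a debugger is often the fastest way to see what the hardware actually did, as opposed to what the datasheet promised.

/* Hypothetical memory-mapped status register and ready flag. */
#include <stdint.h>

#define STATUS_REG ((volatile uint32_t *)0x40001000u) /* hypothetical address */
#define READY_BIT  (1u << 0)

/* Busy-waits until the device reports ready; hangs forever if the
 * hardware never sets the bit -- exactly when a debugger earns its keep. */
void wait_for_device(void)
{
    while ((*STATUS_REG & READY_BIT) == 0u) {
        /* spin -- a breakpoint here lets you read the real register value */
    }
}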