Okay, as discussed in the comments, I will be assuming that the programming language has something like a "break" statement, which skips all the remaining iterations of a loop in $O(1)$ time, no matter how many iterations were left.
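For concreteness, here is one way to get that behavior in Python (which I'll use for the sketches below), where `break` only exits the innermost loop: wrap the loops in a function and `return` from it. This is an illustrative sketch; `find_target` and its arguments are names of my own invention.

```python
# Python's `break` exits only the innermost loop; a `return` from the
# enclosing function abandons ALL remaining iterations in O(1), which is
# the behavior assumed above. (Illustrative only; the function and its
# arguments are hypothetical.)
def find_target(grid, target):
    for row in grid:
        for cell in row:
            if cell == target:
                return True   # leaves both loops at once
    return False
```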
Let $M$ be an abstract machine (a Turing machine, a RAM machine, whichever is more convenient) that computes our function in the non-primitive-recursive way. The point is that the running time of $M$ is $O(g(N))$.
Write a primitive-recursive program that does the following (a runnable sketch follows the list):
- Create a simulation of $M$ in its initial state, ready to run.
- Calculate our function in a primitive-recursive way (with $O(f(N))$ complexity), EXCEPT:
- After each and every step of this calculation, do the following:
- Advance the simulation of $M$ by one step.
- (Simulating one step of the machine should be an $O(1)$ operation.)
- If it turns out that $M$ is in its finished state, then:
- break out of all the loops, stop the calculation, return the result that $M$ has inside it.
- If the calculation has finished while $M$ is still not done, forget about $M$ and return the result we have calculated ourselves.
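Here is a minimal runnable sketch of that interleaving. Everything in it is a stand-in of my own: `FastMachine` (with its `step`, `finished`, and `result` members) is a toy steppable "machine" rather than a real Turing/RAM simulator, and the doubly-nested loop is a deliberately slow $O(N^2)$ replacement for the primitive-recursive $O(f(N))$ calculation. Both compute the same toy function $1 + 2 + \dots + N$.

```python
class FastMachine:
    """Toy stand-in for M: sums 1..N one step at a time, halting after N steps."""
    def __init__(self, n):
        self.n, self.i, self.acc = n, 0, 0
        self.finished = (n == 0)
        self.result = 0 if n == 0 else None

    def step(self):
        # Advancing the simulation by one step is an O(1) operation.
        if self.finished:
            return
        self.i += 1
        self.acc += self.i
        if self.i == self.n:
            self.finished = True
            self.result = self.acc


def interleaved(n):
    m = FastMachine(n)                 # simulation of M, ready to run
    acc = 0
    # Slow stand-in for the primitive-recursive O(f(N)) calculation:
    # computes the same sum with O(N^2) unit additions.
    for i in range(1, n + 1):
        for _ in range(i):
            acc += 1
            m.step()                   # one machine step per calculation step
            if m.finished:             # M finished first:
                return m.result        # abandon all loops at once, in O(1)
    return acc                         # the slow calculation finished first


assert interleaved(0) == 0
assert interleaved(10) == 10 * 11 // 2
```

The `return` inside the inner loop plays the role of the multi-level "break" assumed at the start: it abandons all remaining iterations of both loops in $O(1)$.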
This program is "supposed to" do $O(f(N))$ operations, but if $N$ is large enough, then $M$ will finish first, and the program will return early after just $O(g(N))$ operations. Since each step of the calculation is accompanied by a single $O(1)$ simulation step, the interleaving only adds a constant factor, so the total running time is $O(\min(f(N), g(N))) = O(g(N))$.
In summary: if you have a non-primitive-recursive solution for a problem, as well as a worse primitive-recursive solution, then you can match the faster solution's complexity with a primitive-recursive program. So the "inversion" of complexities is impossible.
(Really, the main purpose of the worse solution is merely to wait an appropriate number of steps for the machine to complete. If we had a perfect predictor of how long the machine would take, then we wouldn't need the worse solution at all. But here, I'm using it as a backup, in case my estimate of the running time was too low.)