Why is there no algebraic definition of algorithm besides recursive functions?
If I'm wrong, what is the most mathematical definition of an algorithm you've ever seen in a paper, and can you provide a link?
For example, modern machines handle arithmetic overflow by the rules of arithmetic modulo $2^{32}$, $2^{64}$, etc.
So why is there so rarely any discussion of a theoretical machine that works over finite groups, or simply over an integer quotient group raised to a power equal to the memory length (the number of "words"), i.e. something like $(\mathbb{Z}/2^{w}\mathbb{Z})^n$ for word size $w$ and $n$ words of memory?
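To make the overflow point concrete, here is a minimal Python sketch of that view; the function name `add_u32` and the 32-bit word size are my own illustrative choices, not anything standard:

```python
WORD = 2**32  # modulus for a 32-bit machine word

def add_u32(a, b):
    """Addition as the hardware performs it: wrap-around mod 2^32."""
    return (a + b) % WORD

# "Overflow" is just ordinary arithmetic in the quotient group Z/2^32Z:
assert add_u32(2**32 - 1, 1) == 0
assert add_u32(3, 5) == 8  # small values behave like ordinary integers
```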
I think a lot can be done with math to solve algorithmic questions, but when one first approaches the subject, there is very little workable mathematics.
Recursive functions are nice, but mathematically they can be very hard to deal with. Since every recursive algorithm has an iterative counterpart, I wish there were a paper with operators like $\sum_{i=1}^{g(n)} f(i)$ that mimic a for-loop whose bound depends on $n$ and that evaluate $f(i)$ on each iteration.
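Here is a minimal Python sketch of the correspondence I have in mind, where `f` and `g` are hypothetical placeholders chosen only for illustration; the recursive formulation, the for-loop, and the summation operator all compute the same value:

```python
def f(i):
    return i * i  # hypothetical loop body

def g(n):
    return 2 * n  # hypothetical bound depending on n

def total_recursive(n, i=1):
    # recursive formulation of the loop
    return 0 if i > g(n) else f(i) + total_recursive(n, i + 1)

def total_iterative(n):
    # iterative counterpart: the direct analogue of sum_{i=1}^{g(n)} f(i)
    acc = 0
    for i in range(1, g(n) + 1):
        acc += f(i)
    return acc

# all three formulations agree
assert total_recursive(5) == total_iterative(5) == sum(f(i) for i in range(1, g(5) + 1))
```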
Why can't this more traditional kind of mathematics be used to describe algorithms, or can it? Do you have a definition you're working on?
If so, please share it in an answer to this question.