Theory vs. practice here. This is a conceptual question about time complexity that we didn't get to explore deeper in class.
Here it is:
We started with a crude brute-force algorithm, O(n^3), and got it down to O(n), which was considered good enough. Looking closer, it is actually O(n) + O(n): two separate passes over the input. I came up with another approach that only does about n/2 steps, i.e. O(n/2). But these algorithms are all considered the same, since each is O(n), and as n approaches infinity the constant factor makes no difference; so further improvement is deemed unnecessary once we reach O(n).
My question is:
In practice we always have a finite number of inputs (admittedly, sometimes in the trillions). Following that same counting logic, an algorithm doing n/2 steps does a quarter of the work of one doing 2n steps. So if we can make it faster, why not?
O(n) + O(n) is the same as O(n), which in turn is the same as O(n/2); asymptotic analysis ignores such constant factors. What you really mean is: suppose there is another O(n) algorithm with a lower hidden constant. And indeed, why not use a faster algorithm instead of a slower one? I don't see that there is a real question here, but to know which O(n) algorithm is actually faster you will need some method of analysis other than asymptotic analysis, such as benchmarking; an algorithm is not automatically faster just because it has fewer loops.
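To make that last point concrete: the original algorithms aren't shown here, so this is a hypothetical stand-in task (finding the minimum and maximum of a list) where a two-pass O(n) + O(n) version competes against a fused single-pass version with fewer iterations. Only measuring tells you which wins on a given machine.

```python
import timeit

def min_max_two_pass(xs):
    # Two separate O(n) passes: O(n) + O(n), still O(n) overall.
    return min(xs), max(xs)

def min_max_one_pass(xs):
    # One fused O(n) loop: fewer iterations, same asymptotic class.
    lo = hi = xs[0]
    for x in xs[1:]:
        if x < lo:
            lo = x
        elif x > hi:
            hi = x
    return lo, hi

data = list(range(100_000))
assert min_max_two_pass(data) == min_max_one_pass(data)

# Loop count alone does not decide the winner; benchmark instead.
for f in (min_max_two_pass, min_max_one_pass):
    t = timeit.timeit(lambda: f(data), number=10)
    print(f.__name__, f"{t:.4f}s")
```

On CPython, the two-pass version often beats the single Python-level loop because the built-in `min` and `max` run in C, which is exactly the answer's point: hidden constants, not the number of passes, determine real speed.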