Timeline for "Is micro-optimisation important when coding?"
Current License: CC BY-SA 3.0
6 events
| when | what | action | by | license | comment |
|---|---|---|---|---|---|
| Sep 6, 2014 at 16:42 | history | edited | zwol | CC BY-SA 3.0 | deleted 59 characters in body |
| Aug 9, 2011 at 12:43 | comment | added | Mike Dunlavey | I think there's an even more common case. There's a "job" taking 50% of the time, but you don't actually need it, so you remove it entirely, reducing overall time by that amount, then repeat. I know this is hard to accept - that programs can be spending large fractions of time doing things that (in hindsight) are totally unnecessary. But here's my canonical example. | |
| Aug 9, 2011 at 3:30 | comment | added | zwol | The devil's in the word "average" there. It is mathematically the case that to speed up a program by 50% every piece must be sped up by 50% on average. Now, if you can divide the program into a job that takes 75% of runtime and another that takes 25%, and speed the former up 3x, that'll give you 50% overall despite not having done anything to the latter job. But the much more common case is that there are dozens of "jobs" that each take less than 5% of runtime - and then you have to speed up or get rid of many of them. | |
| Aug 9, 2011 at 2:38 | comment | added | Mike Dunlavey | +1 for Amdahl quote, but I don't agree with "to make the entire program run twice as fast, you need to speed up every piece". I would say you don't actually speed up any "piece". Rather, you find unnecessary work, and eliminate it. Especially function calls, if the program is anything larger than a toy. Much of the common thinking about performance seems to completely ignore the importance of finding whole unnecessary branches of the call tree (which can be single instructions), and lopping them off. | |
| Aug 8, 2011 at 21:45 | history | answered | zwol | CC BY-SA 3.0 | |
| Aug 8, 2011 at 21:45 | history | made wiki | | | Post Made Community Wiki by zwol |
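
The arithmetic in the Aug 9, 2011 at 3:30 comment can be checked with Amdahl's law. The sketch below is a minimal worked example using the 75%/25% split and the 3x factor quoted in that comment; the symbols $T$ (total runtime), $p$ (fraction sped up), and $s$ (speedup factor) are notation introduced here, not from the original discussion.

$$
T_{\text{new}} = \frac{0.75\,T}{3} + 0.25\,T = 0.5\,T,
\qquad
\text{overall speedup} = \frac{T}{T_{\text{new}}} = \frac{1}{(1-p) + p/s} = \frac{1}{0.25 + 0.75/3} = 2.
$$

So tripling the speed of the job that takes 75% of the runtime halves the total runtime, even though the remaining 25% is untouched, which is the point the comment makes.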