21 events
when | what | by | license | comment
Oct 22, 2021 at 8:48 comment added vitaly-t This looks like the slowest C implementation ever :)
Jun 17, 2020 at 9:04 history edited CommunityBot
Commonmark migration
Mar 13, 2016 at 5:23 comment added DDPWNAGE You could also start i at an odd number (such as 19) and then increment by 2, since any even number is divisible by 2.
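For illustration, a minimal odd-only trial-division sketch of that idea (hypothetical code, not the answer's): handle 2 up front, then test only odd divisors.

    /* Sketch only: skip even candidates, since any even number > 2
       is divisible by 2 and therefore composite. */
    #include <stdbool.h>
    #include <stdio.h>

    static bool is_prime(unsigned long n) {
        if (n < 2)      return false;
        if (n % 2 == 0) return n == 2;                 /* 2 is the only even prime */
        for (unsigned long i = 3; i * i <= n; i += 2)  /* odd divisors only */
            if (n % i == 0) return false;
        return true;
    }

    int main(void) {
        printf("%d\n", is_prime(1000003));             /* prints 1 */
        return 0;
    }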
Feb 29, 2016 at 18:39 comment added user15259 @Cyoce - it might be faster, don't know - I wrote it for clarity - these days most compilers will treat i + i and i * 2 identically
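A toy sketch of that claim (hypothetical function names): with optimizations enabled, gcc and clang typically emit identical code for both forms, which you can confirm with gcc -O2 -S.

    /* Both typically compile to a single add/lea on x86-64 at -O2. */
    unsigned int twice_add(unsigned int i) { return i + i; }
    unsigned int twice_mul(unsigned int i) { return 2 * i; }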
Feb 29, 2016 at 18:35 comment added user15259 @Liam - default on mine was 64-bit integers - have changed source to work with 32-bit, might run faster, might not
Feb 29, 2016 at 18:33 history edited user15259 CC BY-SA 3.0
Changed int to unsigned int to handle overflow on 32-bit systems.
Feb 29, 2016 at 18:15 comment added Cyoce Is i + i really faster than i * 2?
Feb 29, 2016 at 1:11 history edited Liam CC BY-SA 3.0
added 562 characters in body
Feb 28, 2016 at 20:31 comment added Dennis @Liam Hard to say. Signed integer overflow is undefined behavior, so without looking at the generated assembly, it's difficult to predict what the compiler did.
Feb 28, 2016 at 19:29 comment added Liam Out of curiosity, though, why wasn't that segfaulting on his computer too?
Feb 28, 2016 at 19:28 comment added Liam Hooray for @Dennis. I changed the int declaration line to long and now it works. I'll time it shortly.
Feb 27, 2016 at 18:55 comment added Dennis This segfaults because i + i does not fit in an int for large values of i.
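A minimal sketch of that failure mode (illustrative values, not the answer's exact code): once i exceeds INT_MAX / 2, the signed addition i + i overflows, which is undefined behavior in C; widening before the addition avoids it.

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        int i = INT_MAX / 2 + 1;        /* 1073741824 with 32-bit int */
        /* int j = i + i;                  undefined behavior: overflows int */
        long long j = (long long)i + i; /* widen first, then add: well-defined */
        printf("%lld\n", j);            /* 2147483648 */
        return 0;
    }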
Feb 27, 2016 at 18:53 comment added Liam I think my computer just straight up doesn't have enough RAM to run this as is. Please post the time from your own computer.
Feb 27, 2016 at 9:25 comment added user15259 Rewritten to use global array. Good luck!
Feb 27, 2016 at 9:23 history edited user15259 CC BY-SA 3.0
Two vs One GB, global array
Feb 27, 2016 at 9:15 history edited user15259 CC BY-SA 3.0
Two vs One GB
Feb 27, 2016 at 1:51 comment added Liam I already checked. Bool is 1 byte. Is it possible to just ask for 2*10^9 bytes of memory statically? I.e., declare a global variable, which (on gcc) I believe will be initialized to 0. This would require using char instead, though, I think.
Feb 27, 2016 at 1:47 comment added user15259 Many systems, including Linux, do optimistic allocation. E.g. if you ask for 1 GB it will "give" it to you, but when you actually go to use it and the system can't find it, it will crash. If this were the case, it would likely be crashing at the memset. The minute it is taking is the time spent trying to coalesce the heap into a contiguous block. Also check whether sizeof(bool) == 1 on your system. If it's == 4, I can rewrite this to use char.
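A hedged sketch of both points (the array size here is illustrative, far smaller than the ~2*10^9 bytes under discussion): bool's size is platform-dependent, and a file-scope array lives in static storage and is zero-initialized, so neither malloc nor memset is needed.

    #include <stdbool.h>
    #include <stdio.h>

    #define SIZE 100000000u  /* 100 MB for the sketch; the real sieve wants ~2 GB */
    static char flags[SIZE]; /* static storage (.bss): zeroed before main runs */

    int main(void) {
        printf("sizeof(bool) = %zu\n", sizeof(bool)); /* 1 on most ABIs */
        printf("flags[0] = %d\n", flags[0]);          /* 0 without any memset */
        return 0;
    }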
Feb 27, 2016 at 1:39 comment added Liam I'm segfaulting in the first case (the others run fine). It happens after ~1 minute of runtime. I added an if (p==NULL) {exit(1);} line to the code, so I don't believe the malloc is failing (also, it would fail at the beginning, not 1 minute in). Ideas on what is happening?
Feb 26, 2016 at 21:48 comment added Liam This works using the sieve of Eratosthenes? I'll time it when I get home.
Feb 26, 2016 at 21:27 history answered user15259 CC BY-SA 3.0