I am trying to write a program that generates a random number based on the system's internal clock, so that I don't need to supply a seed or a first value myself. The first value should be taken from the internal clock by converting the year, month, day, hours, minutes, and seconds to milliseconds and adding the current millisecond, giving a unique number (a timestamp). Any help getting these values in C?
- Note that using the current time as a seed for a random number generator is fine as long as you aren't trying to use it for security too. If you're trying to ensure that you get different values most times you run the program, it'll do. If you're trying to be unpredictable, as in cryptography, it is hopelessly insecure to use the time as a seed for your PRNG (it gives you at most 16 bits of entropy — I'm being generous; it is more like 10 bits, and arguably less than that — whereas you need 128 or more bits of entropy for most cryptographic work). – Jonathan Leffler (Mar 29, 2014)
- Possible duplicate of How to measure time in milliseconds using ANSI C? and other system-specific versions like stackoverflow.com/questions/3729169/… – Ciro Santilli (Mar 18, 2016)
3 Answers
You can use either clock_gettime() or gettimeofday() — or, if you're in a really impoverished environment, ftime() or time(). Make sure you're using a big enough data type to hold the millitime.
For clock_gettime(), the result is a struct timespec with elements tv_sec and tv_nsec. You'd use:
    #include <time.h>
    #include <stdint.h>

    struct timespec t;
    clock_gettime(CLOCK_REALTIME, &t);
    int64_t millitime = t.tv_sec * INT64_C(1000) + t.tv_nsec / 1000000;

With gettimeofday() (which is officially deprecated, but is more widely available — for example, Mac OS X has gettimeofday() but does not have clock_gettime()), you have a struct timeval with members tv_sec and tv_usec:
    #include <sys/time.h>
    #include <stdint.h>

    struct timeval t;
    gettimeofday(&t, 0);
    int64_t millitime = t.tv_sec * INT64_C(1000) + t.tv_usec / 1000;

(Note that ftime() was standard in older versions of POSIX but is no longer part of the standard, though some systems still provide it for backwards compatibility. It appeared in early versions of Unix as the first sub-second time facility, though not as early as 7th Edition Unix, and was kept in POSIX (the Single Unix Specification) for backwards compatibility. Aim to use clock_gettime() if you can, and gettimeofday() if you can't.)
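For the question's actual goal — using the millisecond timestamp in place of a hand-picked seed — here is a minimal, self-contained sketch building on the gettimeofday() code above; the use of the standard srand()/rand() generator and the truncating cast are my assumptions about what the asker wants:

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>
    #include <sys/time.h>

    int main(void)
    {
        struct timeval t;
        gettimeofday(&t, 0);

        /* Milliseconds since the Unix epoch, as computed above. */
        int64_t millitime = t.tv_sec * INT64_C(1000) + t.tv_usec / 1000;

        /* srand() takes an unsigned int, so the 64-bit timestamp is
           truncated; the low bits change fastest, which is what matters
           for getting a different sequence on each run. */
        srand((unsigned int)millitime);

        printf("first random value: %d\n", rand());
        return 0;
    }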
6 Comments
- clock_gettime() (on an Intel Xeon Linux system) always returns a nanoseconds value rounded to a multiple of 250, so you may need to divide by 1000000 if using that part of the result. [Edit: dividing by 1000 yields microseconds, as pointed out below.]
- The multiplication needs to be promoted to int64_t before the assignment. Using the LL suffix is one way to do it, and would be the way I'd use if the type of millitime were long long. But since I used int64_t, it is probably better to use the INT64_C() macro defined in <stdint.h> (along with int64_t).

Standard C does not guarantee time accuracy better than a second. If you're on a POSIX system, try the clock* functions.
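If you can't rely on POSIX at all, a minimal portable sketch (assuming one-second granularity is acceptable) is the classic time()-based seeding:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void)
    {
        /* time() is plain ISO C, but has only one-second resolution:
           two runs within the same second get the same sequence. */
        srand((unsigned int)time(NULL));
        printf("%d\n", rand());
        return 0;
    }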
If you happen to be doing this on Windows, you can use:
    unsigned int rand = GetTickCount() % A_PRIME_NUMBER_FOR_EXAMPLE;

[Edit: to emphasise, take the result modulo something appropriate for your circumstances.]
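A slightly fuller sketch of the same idea on Windows (assuming you only need a different sequence per run, not unpredictability):

    #include <stdio.h>
    #include <stdlib.h>
    #include <windows.h>

    int main(void)
    {
        /* GetTickCount() returns milliseconds since system start-up;
           note it wraps around after roughly 49.7 days. */
        DWORD ticks = GetTickCount();

        /* Seed the C runtime's PRNG with the tick count. */
        srand((unsigned int)ticks);
        printf("%d\n", rand());
        return 0;
    }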