 # Why does random() return a 32 bit number?

#1

The documentation says that random() returns a long. When I ask for sizeof(long), I get 8 bytes, so a long should be 64 bits. But when I print the value of random() in hex, I always get 8 hex digits, which is a 32-bit number. And the authors seem to have expected this: they left-shift one random() result by 32 and then bitwise OR in a second random() result to build a random 64-bit number. Is there something I am not understanding about long? Or does our seed from srandom((unsigned int) time(NULL)) play a role? Any explanations for this behavior?

#2

Try printing the long value in binary form and see if that sheds any light. Maybe random() is returning a 32-bit value inside a 64-bit long.

#3

How do you print a number in binary form? Do you write your own function for that, or is there a format specifier you can use in printf or NSLog?

#4
```c
#include <stdio.h>
#include <stdlib.h>

void printLongBinary(long);

int main(int argc, const char *argv[]) {
    printLongBinary(0);
    printLongBinary(1);
    printLongBinary(2);
    printLongBinary(3);
    printLongBinary(4);
    printLongBinary(5);
    printLongBinary(-1);
    printLongBinary(random());

    return 0;
}

// Prints the bits of `value` starting from the least significant bit,
// so the low-order bit appears on the LEFT of the output.
void printLongBinary(long value)
{
    const unsigned long NUMBITS = 8 * sizeof(value);

    printf("%ld: ", value);

    for (unsigned long bit = 0; bit < NUMBITS; ++bit) {
        const char digit = (value & ((long) 1 << bit)) ? '1' : '0';
        printf("%c", digit);
    }
    printf("\n");
}
```
```
0: 0000000000000000000000000000000000000000000000000000000000000000
1: 1000000000000000000000000000000000000000000000000000000000000000
2: 0100000000000000000000000000000000000000000000000000000000000000
3: 1100000000000000000000000000000000000000000000000000000000000000
4: 0010000000000000000000000000000000000000000000000000000000000000
5: 1010000000000000000000000000000000000000000000000000000000000000
-1: 1111111111111111111111111111111111111111111111111111111111111111
1804289383: 1110011010100010110100011101011000000000000000000000000000000000
```
#5

Thanks for the function. It looks like random() only returns a 32-bit number.

#6

random() does indeed return a long and, at least on modern macOS, a long is 64 bits. However, there is a catch: random() only generates about 32 bits of random data (its values range from 0 to 2^31 − 1, per the man page) and stores them inside a 64-bit long.

So, if we were to print the output of random() in binary format, we would get something like this:
0000000000000000000000000000000001010111100110011010101101001011

That is why the author has to combine two calls, shifting the first result into the upper 32 bits and bitwise ORing in the second:
`int64_t randomBytes = (random() << 32) | random();`
This yields 8 bytes (64 bits) of random data, like this:
0001111011100010100000100001000100000001011100101110000101111011