Hello everybody
I tried a short program on the Primer2...
Why are variables declared as long long int only evaluated as long int?
long long int a64 = 4581298449LL; // 0x111111111 (33 bits)
long long int b64 = 9162596898LL; // 0x222222222 (33 bits)
long long int c64;
c64 = a64 + b64;                  // expected 0x333333333
The debugger gives:
a64 shows as 0x11111111 (only 32 bits)
b64 shows as 0x22222222 (only 32 bits)
and the result c64 shows as 0x33333333 (only 32 bits) instead of the expected 0x333333333.
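To see whether the 64-bit arithmetic itself is wrong or only the way the values are displayed, I could also try something like this minimal sketch (assuming printf output is available on the target, e.g. via semihosting or a UART redirect), which prints all 64 bits with the ll length modifier:

#include <stdio.h>

int main(void)
{
    long long int a64 = 4581298449LL; // 0x111111111 (33 bits)
    long long int b64 = 9162596898LL; // 0x222222222 (33 bits)
    long long int c64 = a64 + b64;    // expected 0x333333333

    // %llX prints the full 64-bit value; %lX or %X would truncate to 32 bits
    printf("a64 = 0x%llX\n", a64);
    printf("b64 = 0x%llX\n", b64);
    printf("c64 = 0x%llX\n", c64);

    return 0;
}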
Thank you for your help