int var = 2147483648;
The actual behavior for signed integers is implementation-defined. Most likely the value will be narrowed (in other words, "cut off") to the four least significant bytes (assuming that sizeof(int) == 4).
The constant 2147483648 is likely of type long (or long long if the former is not large enough to hold it), so what actually happens here is something like:
int var = (int) 2147483648LL;
The actual value after conversion is -2147483648, as only the sign bit is set (assuming two's complement representation).
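For illustration, here is a minimal sketch (assuming a 32-bit int and two's complement representation, which is typical but not guaranteed) that prints the converted value and its bit pattern:

    #include <stdio.h>

    int main(void)
    {
        /* Implementation-defined conversion; on typical platforms
           the low 32 bits are kept, leaving only the sign bit set. */
        int var = (int) 2147483648LL;

        printf("value = %d\n", var);              /* typically -2147483648 */
        printf("bits  = %#x\n", (unsigned) var);  /* typically 0x80000000 */
        return 0;
    }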
Citing C11 (N1570) §6.3.1.3/p3, Signed and unsigned integers:
Otherwise, the new type is signed and the value cannot be represented
in it; either the result is implementation-defined or an
implementation-defined signal is raised.
For instance, with GCC, the result is reduced modulo 2^N, which is effectively the same as the bytes being "cut off":
For conversion to a type of width N, the value is reduced modulo 2^N
to be within range of the type; no signal is raised.
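As a sketch of that modulo reduction (assuming GCC and a 32-bit int; other implementations may behave differently), converting 2^32 + 42 discards the bits above bit 31 and leaves 42:

    #include <stdio.h>

    int main(void)
    {
        /* 2^32 + 42: when reduced modulo 2^32 for a 32-bit int,
           the bits above bit 31 are discarded, leaving just 42. */
        long long big = 4294967296LL + 42;
        int truncated = (int) big;

        printf("%d\n", truncated);  /* prints 42 with GCC on such targets */
        return 0;
    }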