My gcc version is 12.4.0.
The code is below:
int x1 = 0x80000000;
printf("%d\n\n", x1 ^ (~x1 + 1));
printf("%d\n\n", !(x1 ^ (~x1 + 1)));
I thought the output should be 0 and 1, since !0 is 1.
However, the output is 0 and 0, which I cannot understand.
I thought maybe the expression is interpreted as 8 bytes, but sizeof(x1 ^ (~x1 + 1)) is just 4.
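
For reference, here is roughly how I checked the size (a minimal sketch with the headers and main() that I left out above):

#include <stdio.h>

int main(void) {
    int x1 = 0x80000000;
    /* sizeof yields a size_t, so %zu is used to print it */
    printf("%zu\n", sizeof(x1 ^ (~x1 + 1)));  /* prints 4 on my machine */
    return 0;
}
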
BTW, if I assign x1 ^ (~x1 + 1) to a variable first, then the output is correct (0 and 1):
int check = x1 ^ (~x1 + 1);
printf("%d\n\n", check);
printf("%d\n\n", !check);
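
Putting both variants into one self-contained program, in case it helps to reproduce (a sketch of my test; the includes and main() are assumed, since I did not show them in the snippets above):

#include <stdio.h>

int main(void) {
    int x1 = 0x80000000;

    /* computing the expression inline */
    printf("%d\n\n", x1 ^ (~x1 + 1));     /* prints 0 */
    printf("%d\n\n", !(x1 ^ (~x1 + 1)));  /* prints 0, where I expected 1 */

    /* storing the same expression in a variable first */
    int check = x1 ^ (~x1 + 1);
    printf("%d\n\n", check);              /* prints 0 */
    printf("%d\n\n", !check);             /* prints 1 */

    return 0;
}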