Consider this code:
```c
#include <stdio.h>

int main(void) {
    /* TEST 1 */
    double d = 128;
    char ch = (char)d;
    printf("%d\n", ch);

    /* TEST 2 */
    printf("%d\n", (char)128.0);

    /* TEST 3 */
    char ch1 = (char)128.0;
    printf("%d\n", ch1);

    return 0;
}
```
Results:
|        | gcc\* | clang\* | cl\* |
|--------|-------|---------|------|
| TEST 1 | -128  | -128    | -128 |
| TEST 2 | 127   | 0       | -128 |
| TEST 3 | 127   | -2      | -128 |

\* latest version
Questions:
- Why do the results differ between tests (excluding `cl`)?
- Why do the results differ between compilers (excluding TEST 1)?
- In the case of UB/IB, where exactly is the UB/IB? What does the standard say?
- [Extra question] Why does `clang` show such different behavior? Where do the `0` and `-2` come from?