What are the implicit type-conversion rules for char? The following code gives an unexpected output of -172.
char x = 200;
char y = 140;
printf("%d", x + y);
My guess was that, since char is signed, x would be converted to 72 and y to 12, which should give 84; however, as shown above, that is not the case. I am using gcc on Ubuntu.
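
For reference, here is a compile-ready version of the snippet I am testing (assuming gcc on x86-64 Ubuntu, where plain char appears to be signed and 8 bits wide); printing x and y individually shows what they are actually converted to:

#include <stdio.h>

int main(void)
{
    /* Initializing a signed 8-bit char with 200 or 140 is out of range;
     * with gcc the value wraps modulo 256 into the signed range. */
    char x = 200;   /* with gcc this ends up as 200 - 256 = -56  */
    char y = 140;   /* with gcc this ends up as 140 - 256 = -116 */

    /* In x + y both operands are promoted to int before the addition. */
    printf("x = %d, y = %d, x + y = %d\n", x, y, x + y);
    return 0;
}

On this setup it prints x = -56, y = -116, x + y = -172, which matches the -172 reported above, but I do not understand which rule of the standard produces these values instead of 72 and 12.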