
How to define INT128_MAX and INT128_MIN for __int128?


GCC supports the __int128 type natively.

However, its limits are not defined in limits.h; I mean, there are no such things as INT128_MAX or INT128_MIN.

And GCC interprets literal constants as 64-bit integers. This means that if I write #define INT128_MIN -170141183460469231731687303715884105728, it complains that the value has been truncated.
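A minimal reproduction of the complaint (the exact diagnostic wording varies by GCC version):

    /* There is no 128-bit integer literal syntax, so GCC parses the
       constant below as an ordinary (at most 64-bit) literal and
       complains that the value does not fit. */
    #define INT128_MIN (-170141183460469231731687303715884105728)

    __int128 x = INT128_MIN;   /* warning/error about the oversized constant */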

This is especially annoying for shifting on arrays. How can I overcome this?
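For reference, here is a minimal sketch of one possible workaround (assuming GCC's __int128 extension, where unsigned __int128 is also available; the INT128_MAX, INT128_MIN, and UINT128_MAX macros below are names I am defining myself, not anything provided by a header): build the limits from expressions instead of a single literal.

    #include <stdio.h>

    /* Build the limits from expressions, since no integer literal can be
       128 bits wide.  2^127 - 1 is computed in unsigned arithmetic, then
       cast to the signed type, where it fits exactly. */
    #define INT128_MAX  ((__int128)(((unsigned __int128)1 << 127) - 1))
    #define INT128_MIN  (-INT128_MAX - 1)
    #define UINT128_MAX ((unsigned __int128)-1)

    int main(void) {
        __int128 max = INT128_MAX;
        /* printf has no conversion for __int128, so print the two
           64-bit halves instead. */
        printf("INT128_MAX = 0x%016llx%016llx\n",
               (unsigned long long)(max >> 64),
               (unsigned long long)max);
        return 0;
    }

Because the macros expand to constant expressions, they should still be usable anywhere a 128-bit constant is needed, including initializers.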

