Channel: Active questions tagged gcc - Stack Overflow

How can you make a C compiler assume decimal literals (e.g. 1.23) are float and not double?

In my source code, if I write 1.23 as a literal, e.g. doThis(1.23), gcc treats it as a double.

Rather than typing doThis((float) 1.23), is there a way to make decimal literals/constants default to float, unless otherwise specified, within an individual source file?

Mega-bonus points: is there a way that works across (nearly) every C compiler?

