Subject: Re: Integer size (fwd)
From: Caolan McNamara (cmc@stardivision.de)
Date: Tue Jun 06 2000 - 02:38:00 CDT


At 09:09 06.06.00 +0200, Hubert Figuiere wrote:
>According to Aaron Lehmann <aaronl@vitelus.com>:
> > Back when I used to program on the mac, the default integer size was 16
> > bits. Just FYI. I think it would be smart to use the UT_*32 macros when
> > you really want a 32 bit int.
>
>Yes and no. On 68k it was supposed to be 16 bits by default, but both
>Symantec and CodeWarrior gave the option to use 32-bit ints.
>
>BTW, the C standard does NOT define the range of an int, so this must not be
>assumed. Same for short and long AFAIK. The best is to use standardized
>C types.
>
>
>Hub

Yep, just about all that ANSI C/C++ says about the sizes of the fundamental
integer types is that

sizeof([signed|unsigned] char) <= sizeof([signed|unsigned] short)
<= sizeof([signed|unsigned] int) <= sizeof([signed|unsigned] long)

so a system where short, int and long were all, say, 64 bits wide would be
legitimate if odd (the standard only adds minimum ranges: int has to cover
at least 16 bits' worth of values and long at least 32). More commonly you
see int and long both as 32 bits, or short and int both as 16 bits.
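For the curious, a quick throwaway program shows what a given compiler
actually uses (just a sketch, nothing AbiWord-specific):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* sizeof() is in bytes; CHAR_BIT gives bits per byte */
        printf("char : %lu bits\n", (unsigned long)(sizeof(char)  * CHAR_BIT));
        printf("short: %lu bits\n", (unsigned long)(sizeof(short) * CHAR_BIT));
        printf("int  : %lu bits\n", (unsigned long)(sizeof(int)   * CHAR_BIT));
        printf("long : %lu bits\n", (unsigned long)(sizeof(long)  * CHAR_BIT));
        return 0;
    }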
If you need an exact size you have to pick the type from out-of-band
information, i.e. #defines based on platform/compiler, or better, autoconf
tests based upon sizeof(type) with AC_CHECK_SIZEOF:
http://www.gnu.org/manual/autoconf-2.13/html_chapter/autoconf_4.html#SEC33
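Roughly what the resulting typedefs could look like once AC_CHECK_SIZEOF(int)
and AC_CHECK_SIZEOF(long) have dropped SIZEOF_INT / SIZEOF_LONG into
config.h (the UT_* names below are only illustrative):

    #include "config.h"   /* SIZEOF_INT, SIZEOF_LONG from AC_CHECK_SIZEOF */

    #if SIZEOF_INT == 4
    typedef int            UT_sint32;
    typedef unsigned int   UT_uint32;
    #elif SIZEOF_LONG == 4
    typedef long           UT_sint32;
    typedef unsigned long  UT_uint32;
    #else
    #error "no 32 bit integer type found for this platform"
    #endif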

C.

(p.s. In a perfect world we'd have the proposed C9X standard
http://lglwww.epfl.ch/~wolf/c/c9x_changes.html which includes
an <inttypes.h> header defining exact-width integer types.)
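With that header to hand it would be as simple as something like:

    #include <inttypes.h>

    int32_t  offset = -1;     /* exactly 32 bits, signed   */
    uint16_t flags  = 0xFFFF; /* exactly 16 bits, unsigned */

and no more guessing what an int happens to be on a given box.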


