See this response by Jack Klein (see original post):
The original C standard (ANSI 1989/ISO 1990) required that a compiler successfully translate at least one program containing at least one example of a set of environmental limits. One of those limits was being able to create an object of at least 32,767 bytes.
This minimum limit was raised in the 1999 update to the C standard to be at least 65,535 bytes.
No C implementation is required to provide for objects greater than that size, which means that they don't need to allow for an array of ints with more than (int)(65535 / sizeof(int)) elements.
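To make that arithmetic concrete, here is a minimal sketch of the largest int array a strictly portable C99 program can count on; the array name and the printout are illustrative:

    #include <stdio.h>

    int main(void)
    {
        /* C99 guarantees that at least one object of 65535 bytes is
           supported, so this is the largest int array a strictly
           portable program can rely on. */
        static int safe[65535 / sizeof(int)];

        printf("guaranteed element count: %zu\n",
               sizeof safe / sizeof safe[0]);
        return 0;
    }

With 4-byte ints that works out to 16,383 elements (65,532 bytes), comfortably inside the guaranteed limit.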
In very practical terms, on modern computers, it is not possible to say in advance how large an array can be created. It can depend on things like the amount of physical memory installed in the computer, the amount of virtual memory provided by the OS, and the number of other tasks, drivers, and programs already running and how much memory they are using. So your program may be able to use more or less memory today than it could use yesterday or will be able to use tomorrow.
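One way to see this in practice is to probe at run time. The doubling loop below is only a sketch; on systems that overcommit memory (for example, Linux with default settings), malloc may succeed for memory that cannot actually be used, so the number it prints can be optimistic:

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t size = 1;
        void *p;

        /* Double the request until malloc refuses or the size would
           overflow.  The result is only valid for this particular run. */
        while (size < SIZE_MAX / 2 && (p = malloc(size * 2)) != NULL) {
            free(p);
            size *= 2;
        }
        printf("last size malloc accepted this run: %zu bytes\n", size);
        return 0;
    }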
Many platforms place their strictest limits on automatic objects, that is, those defined inside a function without the 'static' keyword. On some platforms you can create larger arrays if they are static or allocated dynamically.
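A rough sketch of the difference between the three kinds of storage; the 16 MiB figure and all names are illustrative (typical default stack sizes are often 1 to 8 MiB):

    #include <stdlib.h>

    #define BIG (16 * 1024 * 1024)    /* 16 MiB, larger than many default stacks */

    static char static_buf[BIG];       /* static storage: typically allowed */

    int main(void)
    {
        /* char auto_buf[BIG]; */      /* automatic storage: on many platforms
                                          this would overflow the stack and crash */

        char *heap_buf = malloc(BIG);  /* dynamic storage: failure is reported
                                          via a NULL return instead of a crash */
        if (heap_buf == NULL)
            return EXIT_FAILURE;

        static_buf[0] = heap_buf[0] = 'x';  /* touch both so they are used */

        free(heap_buf);
        return 0;
    }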
Now, to provide a slightly more tailored answer: DO NOT DECLARE HUGE ARRAYS TO AVOID BUFFER OVERFLOWS. That's close to the worst practice one can think of in C. Rather, spend some time writing good code, and carefully make sure that no buffer overflow will occur. Also, if you do not know the size of your array in advance, look at malloc; it might come in handy :P
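For example, a minimal sketch along these lines (reading the element count n at run time is just an illustration) allocates exactly what is needed and checks for failure:

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t n;
        if (scanf("%zu", &n) != 1)
            return 1;

        /* Guard against overflow in the size computation, then allocate
           exactly what is needed and check that it worked. */
        if (n > SIZE_MAX / sizeof(int))
            return 1;

        int *a = malloc(n * sizeof *a);
        if (a == NULL) {
            fprintf(stderr, "out of memory\n");
            return 1;
        }

        for (size_t i = 0; i < n; i++)
            a[i] = (int)i;

        free(a);
        return 0;
    }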