Is there a limit to the number of #defines that the gcc and VC++ preprocessors can handle?

In discussing design possibilities for a project that has a very large number of constants and bit patterns to be defined, the question came up: how many #defines can a standard compiler handle? I assume it is a very large number, but we were curious to know if there is an actual upper bound.


For a "standard compiler":

5.2.4.1: "Translation limits"

The implementation shall be able to translate and execute at least one program that contains at least one instance of every one of the following limits

...

4095 macro identifiers simultaneously defined in one preprocessing translation unit

Note the slightly odd way of phrasing the requirement. Implementations could satisfy it by having a single "golden program" which they recognise and compile as a special case, although that would be akin to rigging benchmarks. In practice you can read the standard as saying that if your implementation imposes a limit other than available memory, then that limit should be at least 4095. Beyond 4095 you are relying on implementation-specific behavior to an extent.

Some compilers (Microsoft's, for instance) impose implementation limits lower than what the standard requires. These are listed somewhere on MSDN, I think, but possibly only for C++. As far as C goes, since I'm quoting C99, it might not be relevant to MSVC anyway.

For GCC and MSVC in particular, it shouldn't be too hard to test whether a given implementation imposes an arbitrary limit, perhaps easier than finding it documented :-) Auto-generate files containing nothing but long lists of #defines, and see what the preprocessor makes of them.
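
As a minimal sketch of such a generator (the file name, macro naming scheme, and count are arbitrary choices here, not anything GCC or MSVC prescribes):

#include <stdio.h>

/* Emit test_defines.c containing n macro definitions, then try
   "gcc -c test_defines.c" or "cl /c test_defines.c" and watch for
   preprocessor complaints. Raise n until something gives. */
int main(void)
{
    const long n = 100000;  /* comfortably past the 4095 minimum */
    FILE *f = fopen("test_defines.c", "w");
    if (f == NULL)
        return 1;
    for (long i = 0; i < n; i++)
        fprintf(f, "#define CONST_%ld %ldL\n", i, i);
    /* Reference the last macro so at least one definition is used. */
    fprintf(f, "long last_value = CONST_%ld;\n", n - 1);
    fclose(f);
    return 0;
}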


I have never heard of anyone running out. Ever.


The C preprocessor doesn't expand a #define until the macro is actually used. So in a typical implementation the only limit you might encounter is the memory needed to store them all, and that memory is at most roughly proportional to the size of the files the compiler reads.

(Well, you could also include the same files multiple times...)

You could make a preprocessing run explode by expanding deeply nested macros, I guess. Something like

#define EXP1(X) X X
#define EXP2(X) EXP1(X) EXP1(X)
#define EXP3(X) EXP2(X) EXP2(X)
.
.
#define EXP64(X) EXP63(X) EXP63(X)
EXP64(A)

should do the trick, since it gives you 2^64 copies of A, or so. AFAIR, these macro definitions are even within the bounds that the standard imposes.
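
A scaled-down version of the same ladder is survivable enough to actually run, if you want to watch the doubling happen (the file name and the gcc -E -P invocation are just one way to observe it):

/* explode.c: EXP8(A) expands to 2^8 = 256 copies of A.
   "gcc -E -P explode.c | wc -w" should report 256; each extra
   level doubles it, which is why the full 64-level ladder is
   hopeless in practice. */
#define EXP1(X) X X
#define EXP2(X) EXP1(X) EXP1(X)
#define EXP3(X) EXP2(X) EXP2(X)
#define EXP4(X) EXP3(X) EXP3(X)
#define EXP5(X) EXP4(X) EXP4(X)
#define EXP6(X) EXP5(X) EXP5(X)
#define EXP7(X) EXP6(X) EXP6(X)
#define EXP8(X) EXP7(X) EXP7(X)
EXP8(A)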
