Difference between Enum and Define Statements


What's the difference between using a define statement and an enum statement in C/C++ (and is there any difference when using them with either C or C++)?

For example, when should one use

enum {BUFFER = 1234}; 

over

#define BUFFER 1234   


enum defines a syntactical element.

#define is a preprocessor directive, executed before the compiler sees the code, and therefore is not a language element of C itself.

Generally enums are preferred as they are type-safe and more easily discoverable. Defines are harder to locate and can have complex behavior: for example, one piece of code can redefine a #define made by another, which can be hard to track down.
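
A minimal sketch of that hazard (the names here are hypothetical): a macro can be redefined out from under you, while an enumerator cannot be redeclared:

#define BUFFER 1234
#define BUFFER 512          /* most compilers only warn: "BUFFER" redefined */
                            /* all code after this point silently sees 512  */

enum { BUFFER_E = 1234 };
/* enum { BUFFER_E = 512 };    error: redeclaration of 'BUFFER_E' */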


#define statements are handled by the preprocessor before the compiler gets to see the code, so they're basically text substitution (actually a little more intelligent than that, with parameters and such).

Enumerations are part of the C language itself and have the following advantages.

1/ They have a type, and the compiler can type-check them (see the sketch after this list).

2/ Since they are available to the compiler, symbol information on them can be passed through to the debugger, making debugging easier.
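
A minimal sketch of point 1 (the Color type here is hypothetical): in C++ the compiler rejects a plain int where the enum type is expected, catching mixed-up arguments at compile time:

enum Color { RED, GREEN, BLUE };

void paint(Color c);

void demo() {
    paint(GREEN);    // OK
    // paint(2);     // error in C++: no implicit conversion from int to Color
}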


Enums are generally preferred over #define wherever it makes sense to use an enum:

  • Debuggers can show you the symbolic name of an enum's value ("openType: OpenExisting" rather than "openType: 2").
  • You get a bit more protection from name clashes, but this isn't as bad as it was (most compilers warn about redefinition).

The biggest difference is that you can use enums as types:

// Yeah, dumb example
enum OpenType {
    OpenExisting,
    OpenOrCreate,
    Truncate
};

void OpenFile(const char* filename, OpenType openType, int bufferSize);

This gives you type-checking of parameters (you can't mix up openType and bufferSize as easily), and makes it easy to find what values are valid, making your interfaces much easier to use. Some IDEs can even give you intellisense code completion!


Define is a preprocessor command; it's just like doing "replace all" in your editor: it replaces one string with another, and then the result is compiled.

Enum is a special kind of type. For example, if you write:

enum ERROR_TYPES
{
   REGULAR_ERR = 1,
   OK = 0
};

there exists a new type called ERROR_TYPES. It is true that REGULAR_ERR evaluates to 1, but converting a plain int back to this type should produce a warning or error (in C, if you configure your compiler to high verbosity; in C++, it requires an explicit cast).
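
A minimal sketch of that behavior in C++, reusing the ERROR_TYPES enum declared above:

int n = REGULAR_ERR;                           // fine: the enum converts to int implicitly
// ERROR_TYPES e = 1;                          // error in C++: int does not convert to ERROR_TYPES
ERROR_TYPES e = static_cast<ERROR_TYPES>(1);   // an explicit cast is required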

Summary: the two look alike, but when using an enum you benefit from type checking, while with defines you simply replace strings of code.


It's always better to use an enum if possible. Using an enum gives the compiler more information about your source code, a preprocessor define is never seen by the compiler and thus carries less information.

When implementing a set of modes, for example, using an enum makes it possible for the compiler to catch missing case statements in a switch.
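
A minimal sketch of that (the Mode enum is hypothetical; gcc and clang enable the relevant -Wswitch warning as part of -Wall):

enum Mode { MODE_READ, MODE_WRITE, MODE_APPEND };

void handle(Mode m) {
    switch (m) {              // warning: enumeration value 'MODE_APPEND'
    case MODE_READ:  break;   //          not handled in switch
    case MODE_WRITE: break;
    }
}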


enum can group multiple elements in one category:

enum fruits { apple = 1234, orange = 12345 };

while #define can only create unrelated constants:

#define apple 1234
#define orange 12345


#define is a preprocessor command, enum is in the C or C++ language.

It is always better to use enums over #define for cases like this. One reason is type safety. Another is that when you have a sequence of values, you only have to give the first value in the enum; the others get consecutive values automatically.

enum {
  ONE = 1,
  TWO,
  THREE,
  FOUR
};

instead of

#define ONE 1
#define TWO 2
#define THREE 3
#define FOUR 4

As a side note, there are still some cases where you may have to use #define (typically for some kind of macro, e.g. if you need to construct an identifier that contains the constant), but that's macro black magic, and very, very rarely the way to go. If you go to these extremities you should probably use a C++ template (but if you're stuck with C...).
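
A minimal sketch of that kind of black magic (all names here are hypothetical): the token-pasting operator ## builds an identifier out of the constant's value, which no enum can do:

#define CONCAT_(a, b) a##b
#define CONCAT(a, b)  CONCAT_(a, b)   /* extra level so WIDTH expands first */
#define WIDTH 640

void CONCAT(handler_, WIDTH)(void);   /* declares: void handler_640(void); */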


If you only want this single constant (say for buffersize) then I would not use an enum, but a define. I would use enums for stuff like return values (that mean different error conditions) and wherever we need to distinguish different "types" or "cases". In that case we can use an enum to create a new type we can use in function prototypes etc., and then the compiler can sanity check that code better.


Besides everything already written, there is one point that has been stated but not shown, and it is interesting. E.g.

enum action { DO_JUMP, DO_TURNL, DO_TURNR, DO_STOP };
//...
void do_action( enum action anAction, info_t x );

Considering action as a type makes things clearer. Using a define, you would have written:

void do_action(int anAction, info_t x);


For integral constant values I've come to prefer enum over #define. There seem to be no disadvantages to using enum (discounting the minuscule disadvantage of a bit more typing), and you have the advantage that enum can be scoped, while #define identifiers have global scope that stomps on everything.

Using #define isn't usually a problem, but since there are no drawbacks to enum, I go with that.

In C++ I also generally prefer enum to const int even though in C++ a const int can be used in place of a literal integer value (unlike in C), because enum is portable to C (which I still work in a lot).


If you have a group of constants (like "Days of the Week") enums would be preferable, because it shows that they are grouped; and, as Jason said, they are type-safe. If it's a global constant (like version number), that's more what you'd use a #define for; although this is the subject of a lot of debate.


In addition to the good points listed above, you can limit the scope of enums to a class, struct or namespace. Personally, I like to have the minimum number of relevant symbols in scope at any one time, which is another reason for using enums rather than #defines.
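
A minimal sketch of that (the File class is hypothetical): the enumerators live inside the class and must be qualified outside it:

class File {
public:
    enum Mode { Read, Write };
};

File::Mode m = File::Read;   // OK: qualified access
// Mode m2 = Read;           // error: neither name exists at global scope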


Another advantage of an enum over a list of defines is that compilers (gcc at least) can generate a warning when not all values are checked in a switch statement. For example:

enum state_t {
    STATE_ONE,
    STATE_TWO,
    STATE_THREE
};

...

switch (state) {
case STATE_ONE:
    handle_state_one();
    break;
case STATE_TWO:
    handle_state_two();
    break;
}

In the previous code, the compiler is able to generate a warning that not all values of the enum are handled in the switch (assuming state is declared with the enum type, e.g. enum state_t state). If the states were done as #defines, this would not be possible.


Enums are mostly used for enumerating some kind of set, like the days of the week. If you need just one constant number, a const int (or double, etc.) would definitely be better than an enum. I personally do not like #define (at least not for the definition of constants) because it does not give me type safety, but you can of course use it if it suits you better.
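
For instance, a minimal sketch of that preference (the name is hypothetical):

const int BUFFER_SIZE = 1234;   // typed and scoped; in C++ it is also a compile-time constant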


Creating an enum creates not only literals but also the type that groups these literals: this adds semantics to your code that the compiler is able to check.

Moreover, when using a debugger, you have access to the values of enum literals. This is not always the case with #define.


While several answers above recommend using enum for various reasons, I'd like to point out that using defines has an actual advantage when developing interfaces: you can introduce new options and let software use them conditionally.

For example:


    #define OPT_X1 1 /* introduced in version 1 */
    #define OPT_X2 2 /* introduced in version 2 */

Then software that can be compiled against either version can do


    #ifdef OPT_X2
    int flags = OPT_X2;
    #else
    int flags = 0;
    #endif

With an enumeration, this isn't possible without a run-time feature-detection mechanism.


Enum:

1. Generally used for multiple related values.

2. An enumerator has a name and a value; the names must be distinct, but the values may repeat. If no value is given, the first enumerator is 0 and each subsequent one is one greater than the previous, unless a value is explicitly specified (see the sketch after this list).

3. Enums have a type, and the compiler can type-check them.

4. They make debugging easier.

5. Their scope can be limited to a class.

Define:

1. Generally used when only a single value is needed.

2. It simply replaces one string with another.

3. Its scope is global; it cannot be limited.

Overall, prefer enum.
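
A minimal sketch of point 2 under Enum (the Status names here are hypothetical):

enum Status {
    PENDING,        /* 0: numbering starts at zero by default */
    ACTIVE,         /* 1 */
    RUNNING = 1,    /* values may repeat; names may not */
    DONE            /* 2: continues from the previous value */
};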


There is little difference. The C Standard says that enumerations have integral type and that enumeration constants are of type int, so both may be freely intermixed with other integral types, without errors. (If, on the other hand, such intermixing were disallowed without explicit casts, judicious use of enumerations could catch certain programming errors.)

Some advantages of enumerations are that the numeric values are automatically assigned, that a debugger may be able to display the symbolic values when enumeration variables are examined, and that they obey block scope. (A compiler may also generate nonfatal warnings when enumerations are indiscriminately mixed, since doing so can still be considered bad style even though it is not strictly illegal.) A disadvantage is that the programmer has little control over those nonfatal warnings; some programmers also resent not having control over the sizes of enumeration variables.
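
A minimal sketch of those points in C (with hypothetical names): enumeration constants intermix freely with int, and enums obey block scope:

enum color { RED, GREEN, BLUE };    /* values assigned automatically: 0, 1, 2 */

int brightness = GREEN + 1;         /* enumeration constants have type int: no cast needed */

void f(void) {
    enum shade { LIGHT, DARK };     /* block scope: not visible outside f */
    enum shade s = DARK;
}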
