Are typedef and #define the same in C?


I wonder if typedef and #define are the same in C?


typedef obeys scoping rules just like variables, whereas #define stays valid until the end of the compilation unit (or until a matching #undef).
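For instance, a minimal sketch of the scoping difference (the names my_len and LEN_MACRO are made up for illustration):

void demo(void)
{
    typedef long my_len;    /* my_len is visible only inside demo() */
    #define LEN_MACRO long  /* LEN_MACRO stays defined until the end of the file (or a #undef) */

    my_len a = 1;
    LEN_MACRO b = 2;
    (void)a; (void)b;       /* silence unused-variable warnings */
}

/* my_len is out of scope here: "my_len c;" would not compile,
   but LEN_MACRO still expands to long: */
LEN_MACRO c;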

Also, some things can be done with typedef that cannot be done with #define.
For example:

typedef int* int_p1;
int_p1 a, b, c;  // a, b, c are all int pointers

#define int_p2 int*
int_p2 a, b, c;  // only the first is a pointer, because int_p2
                 // is replaced with int*, producing: int* a, b, c
                 // which should be read as: int *a, b, c
typedef int a10[10];
a10 a, b, c;  // a, b and c are all arrays of 10 ints

typedef int (*func_p) (int);
func_p fp;  // fp is a pointer to a function that
            // takes an int and returns an int


No.

#define is a preprocessor token: the compiler itself will never see it.
typedef is a compiler token: the preprocessor does not care about it.

You can use one or the other to achieve the same effect, but it's better to use the proper one for your needs:

#define MY_TYPE int
typedef int My_Type;

When things get "hairy", using the proper tool makes it right:

#define FX_TYPE void (*)(int)
typedef void (*stdfx)(int);

void fx_typ(stdfx fx); /* ok */
void fx_def(FX_TYPE fx); /* error */


No, they are not the same. For example:

#define INTPTR int*
...
INTPTR a, b;

After preprocessing, that line expands to

int* a, b;

Hopefully you see the problem; only a will have the type int *; b will be declared a plain int (because the * is associated with the declarator, not the type specifier).

Contrast that with

typedef int *INTPTR;
...
INTPTR a, b;

In this case, both a and b will have type int *.

There are whole classes of typedefs that cannot be emulated with a preprocessor macro, such as pointers to functions or arrays:

typedef int (*CALLBACK)(void);
typedef int *(*(*OBNOXIOUSFUNC)(void))[20]; 
...
CALLBACK aCallbackFunc;        // aCallbackFunc is a pointer to a function 
                               // returning int
OBNOXIOUSFUNC anObnoxiousFunc; // anObnoxiousFunc is a pointer to a function
                               // returning a pointer to a 20-element array
                               // of pointers to int

Try doing that with a preprocessor macro.


#define defines macros.
typedef defines types.

That said, here are a few differences:

With #define you can define constants that are available at compile time. Such constants can be used with #ifdef to check how the code is being compiled, and to specialize certain code according to compile parameters.
You can also use #define to declare miniature find-and-replace macro functions.
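A small sketch of both uses, with made-up names:

#include <stdio.h>

#define MAX_BUFFER 256              /* compile-time constant */
#define SQUARE(x)  ((x) * (x))      /* miniature find-and-replace macro function */

#ifdef DEBUG_BUILD                  /* specialize code according to compile parameters, */
#  define LOG(msg) puts(msg)        /* e.g. when built with -DDEBUG_BUILD               */
#else
#  define LOG(msg) ((void)0)
#endif

int main(void)
{
    char buf[MAX_BUFFER];
    (void)buf;
    LOG("debug build");
    printf("%d\n", SQUARE(3));      /* expands to ((3) * (3)) */
    return 0;
}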

typedef can be used to give aliases to types (which you could do with #define as well), but it is safer because it avoids the blind find-and-replace behaviour of #define.
Besides that, you can use forward declarations with typedef, which lets you declare a type that will be used even though it hasn't been fully defined yet in the file you're writing.
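For example, a minimal sketch of such a forward declaration (the struct and function names are illustrative):

typedef struct Node Node;      /* forward declaration: Node names a (still incomplete) struct type */

Node *make_node(int value);    /* Node can already be used in interfaces here... */

struct Node {                  /* ...while the full definition can come later, or in another file */
    int   value;
    Node *next;
};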


Preprocessor macros ("#define"s) are a lexical replacement tool à la "search and replace". They are entirely agnostic of the programming language and have no understanding of what you're trying to do. You can think of them as a glorified copy/paste mechanism -- occasionally that's useful, but you should use it with care.

Typedefs are a C language feature that lets you create aliases for types. This is extremely useful for making complicated compound types (like structs and function pointers) readable and manageable (in C++ there are even situations where you must typedef a type).

You should always prefer language features over preprocessor macros when that's possible! So always use typedefs for types, and constant values for constants. That way, the compiler can actually interact with you meaningfully. Remember that the compiler is your friend, so you should tell it as much as possible. Preprocessor macros do the exact opposite by hiding your semantics from the compiler.
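For instance, a sketch of that preference, with illustrative names:

typedef unsigned int color_t;         /* instead of: #define color_t unsigned int */
static const double gravity = 9.81;   /* instead of: #define GRAVITY 9.81         */
enum { max_retries = 5 };             /* an integer constant the compiler fully understands */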


They are very different, although they are often used to implement custom data types (which is what I am assuming this question is all about).

As pmg mentioned, #define is handled by the pre-processor (like a cut-and-paste operation) before the compiler sees the code, and typedef is interpreted by the compiler.

One of the main differences (at least when it comes to defining data types) is that typedef gives the compiler a named type it actually knows about, whereas a #define has already been textually replaced before the compiler runs. For example,

#define defType int
typedef int tdType;

defType x;
tdType y;

Here, the compiler sees variable x simply as an int, while variable y is an int declared through the alias tdType. Note that a typedef does not create a distinct type: a function taking a parameter of type tdType will still accept a plain int without complaint, just as one taking defType would. What you do gain is that the typedef name is kept by the compiler and carried into its diagnostics and debug information, so the intent behind the type is not lost.
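A small sketch of that alias behaviour, with a hypothetical function takes_td:

typedef int tdType;

void takes_td(tdType t) { (void)t; }

void caller(void)
{
    int plain = 42;
    takes_td(plain);   /* accepted without complaint: tdType and int are the same type */
}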

Also, some debuggers have the ability to handle typedefs, which can be much more useful than having all custom types listed as their underlying primitive types (as it would be if #define was used instead).


No.
typedef is a C keyword that creates an alias for a type.
#define is a pre-processor instruction that performs a textual replacement before compilation. By the time the compiler sees the code, the original "#defined" word is no longer there. #define is mostly used for macros and global constants.


AFAIK, No.

typedef helps you set up an "alias" to an existing data type, e.g. typedef char chr;

#define is a preprocessor directive used to define macros or general pattern substitutions, e.g. #define MAX 100 substitutes all occurrences of MAX with 100.


As mentioned above, there is a key difference between #define and typedef. The right way to think about it is to view a typedef as a complete, "encapsulated" type. This means you cannot add other type specifiers to it after you have declared it.

You can extend a macro typename with other type specifiers, but not a typedef'd typename:

#define fruit int
unsigned fruit i;   // works fine

typedef int fruit;
unsigned fruit i;   // illegal

Also, a typedef'd name provides the type for every declarator in a declaration.

#define fruit int *
fruit apple, banana;

After macro expansion, the second line becomes:

int *apple, banana;

Here apple is a pointer to an int, while banana is a plain int. In comparison, a typedef like this:

typedef char *fruit;
fruit apple, banana;

declares both apple and banana to be the same. The name on the front is different, but they are both pointers to a char.


Another reason to use typedef (which has only been mentioned briefly in other answers, and yet I think is the entire reason typedef was created) is to make debugging easier when using libraries that have custom types. As an example, consider a type mismatch: both snippets below produce a compile-time diagnostic saying that a char cannot be compared to a string, but they word it differently.

typedef char letter;
letter el = 'e';
if(el == "hello");

The above code will print something like: the variable "el" of type "letter" (aka "char") is not compatible with type "char *".

#define letter char
letter el = 'e';
if(el == "hello");

This code will instead print: the variable "el" of type "char" is not compatible with type "char *".

This may seem silly because I'm defining "letter" as "char", but in more complex libraries this can be extremely confusing, because pointers to objects like buttons, windows, sound servers, images, and lots of other things are defined as unsigned char *, and with the #define approach they would only ever show up in diagnostics as exactly that, rather than under their meaningful names.


As everyone said above, they aren't the same. Most of the answers indicate that typedef is more advantageous than #define, but let me add one point in favour of #define:
when your code is extremely big and scattered across many files, #define can sometimes help readability - you can simply run the preprocessor over the code to see the actual type definition of a variable at the place of its declaration itself.
