I tried
printf("%d, %d\n", sizeof(char), sizeof('c'));
and got 1, 4 as output. If the size of a character is one, why does 'c'
give me 4? I guess it's because it's an integer. So when I do char ch = 'c';,
is there an implicit conversion happening, under the hood, from that 4-byte value to a 1-byte value when it's assigned to the char variable?
In C, 'a' is an integer constant (!?!), so 4 is correct for your architecture. It is implicitly converted to char for the assignment. sizeof(char) is always 1 by definition. The standard defines the unit as bytes, where a byte in C is exactly the width of a char and holds at least 8 bits (CHAR_BIT).
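A minimal sketch to check both points, assuming a platform where int is 4 bytes (the variable name ch is just for illustration; %zu, not %d, is the matching format specifier for a sizeof result):

#include <stdio.h>

int main(void)
{
    char ch = 'c';  /* the int constant 'c' is implicitly converted to char here */

    /* sizeof yields a size_t, so %zu is the correct format specifier */
    printf("sizeof(char) = %zu\n", sizeof(char)); /* always 1 by definition */
    printf("sizeof('c')  = %zu\n", sizeof('c'));  /* sizeof(int): 4 here */
    printf("sizeof(ch)   = %zu\n", sizeof(ch));   /* 1: ch has type char */
    return 0;
}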
The C standard says that a character literal like 'a' is of type int, not type char. It therefore has (on your platform) sizeof == 4. See this question for a fuller discussion.
It is the normal behavior of the sizeof operator (See Wikipedia):

- For a datatype, sizeof returns the size of the datatype. For char, you get 1.
- For an expression, sizeof returns the size of the type of the variable or expression. As a character literal is typed as int, you get 4.
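A short example of both cases, assuming a typical platform with a 4-byte int and an 8-byte double:

#include <stdio.h>

int main(void)
{
    double d = 0.0;

    printf("%zu\n", sizeof(double)); /* datatype: typically 8 */
    printf("%zu\n", sizeof d);       /* expression: the type of d, also 8 */
    printf("%zu\n", sizeof 'a');     /* expression: 'a' has type int, e.g. 4 */
    return 0;
}

Note that the parentheses are only required around type names; for an expression such as d, plain sizeof d is legal.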
This is covered in ISO C11 6.4.4.4 Character constants, though it's largely unchanged from earlier standards. That states, in paragraph 10:
An integer character constant has type int. The value of an integer character constant containing a single character that maps to a single-byte execution character is the numerical value of the representation of the mapped character interpreted as an integer.
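To see that rule in action, here is a small sketch assuming an ASCII execution character set, where 'A' maps to code 65:

#include <stdio.h>

int main(void)
{
    printf("%d\n", 'A');     /* 65: the character constant is already an int */
    printf("%d\n", 'A' + 1); /* 66, i.e. the code for 'B'; plain int arithmetic */
    return 0;
}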
According to the ANSI C standard, a character constant such as 'c' has type int, which is why sizeof('c') reports the size of an int. Separately, a char is promoted to int in contexts where integers are used, for example when it is passed to a variadic function like printf. A char is always exactly 1 byte by definition; what is implementation-defined, based on the platform and compiler, is how many bits that byte contains (CHAR_BIT, at least 8).
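A small sketch of that promotion, assuming ASCII so that 'c' has the value 99:

#include <stdio.h>

int main(void)
{
    char ch = 'c';

    /* As a variadic argument, ch undergoes the default argument
       promotions and reaches printf as an int. */
    printf("%d\n", ch); /* 99 on an ASCII system */
    printf("%c\n", ch); /* %c also receives an int and prints it as a character */
    return 0;
}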