size_t size = sizeof(int);
printf("%d\n", size);
int i;
for (i = 0; i < size; i++) {
printf("%d ", i);
}
The above code (compiled with gcc) outputs
4
0 1 2 3
size_t size = sizeof(int);
printf("%d\n", size);
int i;
for (i = -1; i < size; i++) {
printf("%d ", i);
}
This code (with i initialized to -1) prints only 4; nothing is printed from the loop.
size_t size = sizeof(int);
printf("%d\n", size);
int i;
for (i = -1; i < (int) size; i++) {
printf("%d ", i);
}
Adding a cast makes the code run fine again. The output is
4
-1 0 1 2 3
What's going wrong in the second code? Why doesn't printf go wrong anywhere?
i < size

When i is signed and size is unsigned, i is converted to unsigned before the comparison is performed. This is part of what are called the usual arithmetic conversions. When -1 is converted to an unsigned type, the result is the largest value representable by that type, so i < size is false when i is -1, for any value of size.
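To make the conversion visible, here is a small standalone program (my own illustration, not taken from the question) that prints the value -1 takes on after conversion to size_t, along with the results of the comparison with and without the cast:

#include <stdio.h>

int main(void) {
    size_t size = sizeof(int);
    int i = -1;

    /* i is converted to size_t (unsigned) before the comparison,
       so -1 becomes the largest value a size_t can hold
       (e.g. 18446744073709551615 on a typical 64-bit system). */
    printf("%zu\n", (size_t)i);

    printf("%d\n", i < size);        /* 0: the huge unsigned value is not < 4 */
    printf("%d\n", i < (int)size);   /* 1: a plain signed comparison, -1 < 4 */
    return 0;
}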
When you use i < (int)size instead, both operands of < have type int, so no conversion needs to be performed, and since both operands are signed you get the expected result.
size_t is an unsigned type. Casting size to int makes the comparison signed on both sides, so it works as expected.
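As an aside on the printf question: %d expects an int, while size has type size_t, so the matching conversion specifier is %zu (C99); passing a size_t to %d is technically undefined, it just happens to print 4 here. A version of the loop that uses %zu and keeps the comparison signed might look like this sketch (not the original code):

#include <stdio.h>

int main(void) {
    size_t size = sizeof(int);

    /* %zu is the conversion specifier that matches size_t */
    printf("%zu\n", size);

    /* Casting size to int keeps the comparison between two signed values,
       so starting i at -1 works as expected. sizeof(int) easily fits in an int,
       so the cast loses nothing here. */
    for (int i = -1; i < (int)size; i++) {
        printf("%d ", i);
    }
    printf("\n");
    return 0;
}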