INT_SIZE_LENGTH for _itow() is 20, why?

In my old Visual Studio 6 I found that _itow() looks basically like this:

#define INT_SIZE_LENGTH 20

wchar_t* _itow(wchar_t* buf, int i, int radix)
{
   char abuf[INT_SIZE_LENGTH];
   _itoa(i, abuf, radix);     /* convert into a narrow temp buffer first */
   // convert_to_wide_char(); /* then widen abuf into buf */
   return buf;
}

Now, notice the define INT_SIZE_LENGTH. Why is this set to 20?

Worst case for an int32 in decimal should be -2147483648, right? And that is only 11 characters, plus the \0. (My own buffer, in the call to _itow, is only 13 long. I thought that was sufficient.)

(Even a 64-bit integer would need at most 20 characters in decimal: 20 digits for the largest unsigned value, or a sign plus 19 digits for the most negative signed one, so 21 with the \0. But this is the routine for 32-bit integers.)
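
To double-check those decimal worst cases, a quick sketch (assuming a C99 compiler; <inttypes.h> supplies the INT32_MIN/INT64_MIN limits and the PRId format macros, and the buffer here is just scratch space):

#include <stdio.h>
#include <inttypes.h>

int main(void)
{
    char buf[32];  /* generous scratch space for both cases */
    /* "-2147483648" -> 11 characters, 12 with the '\0' */
    printf("int32 min: %d chars\n",
           snprintf(buf, sizeof buf, "%" PRId32, INT32_MIN));
    /* "-9223372036854775808" -> 20 characters, 21 with the '\0' */
    printf("int64 min: %d chars\n",
           snprintf(buf, sizeof buf, "%" PRId64, INT64_MIN));
    return 0;
}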

I feel like I am missing something here. Any ideas gratefully received.

(I looked at the code in Visual Studio 2008, where it was completely rewritten. So I guess the VS6 code is not that good.)


Probably because it can emit non-decimal numbers: if the radix is less than 10, the number of digits grows. On the other hand, that would imply that INT_SIZE_LENGTH should be 33 (32 binary digits plus the \0), to support binary output.
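
A quick portable check of that binary worst case (the loop is just my illustration, not CRT code; MSDN documents that _itoa emits a minus sign only for radix 10, so other radices effectively see the value as unsigned):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Non-decimal radices get no sign, so the worst case in
       base 2 is the bit pattern of -1, i.e. UINT32_MAX. */
    uint32_t v = UINT32_MAX;
    int digits = 0;
    while (v) { v /= 2; digits++; }
    printf("%d digits + 1 for the '\\0' = %d\n", digits, digits + 1);  /* 32 + 1 = 33 */
    return 0;
}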


MSVC is buggy; big surprise. A correct length (for arbitrary base support) would be sizeof(inttype) * CHAR_BIT / log2(minbase) + 2, where minbase is the minimum base you need to support and the +2 covers the sign and the terminating \0. Round the logarithm down, of course.
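
As a sketch of that formula (assuming a 32-bit int; the ILOG2_* constants and the ITOA_BUF_LEN name are made up for illustration, since the preprocessor cannot call log2() itself):

#include <stdio.h>
#include <limits.h>

/* floor(log2(minbase)), precomputed by hand */
#define ILOG2_BASE2   1   /* log2(2)  = 1 */
#define ILOG2_BASE10  3   /* log2(10) = 3.32..., rounded down */

/* digits for the widest value, plus sign, plus '\0' */
#define ITOA_BUF_LEN(type, ilog2_minbase) \
    (sizeof(type) * CHAR_BIT / (ilog2_minbase) + 2)

int main(void)
{
    /* 32/1 + 2 = 34: covers 32 binary digits, a sign slot, and the '\0' */
    printf("base 2:  %zu\n", ITOA_BUF_LEN(int, ILOG2_BASE2));
    /* 32/3 + 2 = 12: covers "-2147483648" and the '\0' */
    printf("base 10: %zu\n", ITOA_BUF_LEN(int, ILOG2_BASE10));
    return 0;
}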
