Why is char being converted into a ushort instead of an int in this decompilation?

https://www.devze.com 2023-03-07 16:54 Source: Web
When I write a for loop as below, it works fine:

for (Char ch = 'A'; ch < 'Z'; ch++)
{
    Console.WriteLine(ch.ToString());
}

I thought that the compiler converts the Char type to int, but when I looked at the decompiled code, this is what I saw:

for (char i = 65; i < 97; i = (ushort)i + 1)
{
    Console.WriteLine(i.ToString());
}

Can someone please explain why the compiler did not change the datatype of i from non-numeric to numeric?

--EDIT-- Added decompiler screenshot



To answer the question title,

Why is char being converted into a ushort instead of an int in this decompilation?

Chars are 16-bit, and so are unsigned shorts. There simply isn't a need to convert to any larger-ranged type. The decompiler you used was probably working based on that.
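You can verify this correspondence directly; a minimal sketch:

```csharp
using System;

class CharSize
{
    static void Main()
    {
        // char and ushort occupy the same 16 bits and cover the same range.
        Console.WriteLine(sizeof(char));        // 2
        Console.WriteLine(sizeof(ushort));      // 2
        Console.WriteLine((int)char.MaxValue);  // 65535
        Console.WriteLine(ushort.MaxValue);     // 65535
    }
}
```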

To answer your edited question,

Can someone please explain why the compiler did not change the datatype of i from non-numeric to numeric?

It's precisely because chars, while they have corresponding numeric character codes, aren't themselves the same as numeric types. You can cast an integer to a char, but they're not the same thing. Consequently, ((char) 65).ToString() is not the same as ((int) 65).ToString().
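A short sketch illustrating that distinction:

```csharp
using System;

class CharVsInt
{
    static void Main()
    {
        // Same underlying value 65, but ToString formats them differently:
        Console.WriteLine(((char)65).ToString()); // "A"
        Console.WriteLine(65.ToString());         // "65"

        // The cast relates them, but the types remain distinct:
        Console.WriteLine((char)65 == 'A');       // True
    }
}
```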


For the record, .NET Reflector 7 decompiles your code to this:

for (char ch = 'A'; ch < 'Z'; ch = (char) (ch + '\x0001'))
{
    Console.WriteLine(ch.ToString());
}

No sign of any integers anywhere according to Reflector. The code is almost identical to what you originally wrote.

If you want to look at what's really happening, look at the IL.
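One related detail you can observe at the C# level, which is what those casts in the decompiled output are reconstructing: arithmetic on char operands is performed in int, so the increment has to be cast back. A small sketch:

```csharp
using System;

class CharArithmetic
{
    static void Main()
    {
        char ch = 'A';

        // ch + 1 is typed int: C# promotes char operands to int in arithmetic.
        var sum = ch + 1;
        Console.WriteLine(sum.GetType()); // System.Int32
        Console.WriteLine(sum);           // 66

        // Assigning the result back to char requires an explicit cast:
        ch = (char)(ch + 1);
        Console.WriteLine(ch);            // B
    }
}
```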


A char can be implicitly converted to ushort, int, uint, long, ulong, float, double, or decimal.
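A quick sketch of those conversions; note that the reverse direction is narrowing and needs an explicit cast:

```csharp
using System;

class Conversions
{
    static void Main()
    {
        char c = 'A';

        // All implicit -- no cast required:
        ushort us = c;
        int i = c;
        long l = c;
        double d = c;

        // int back to char is narrowing and requires an explicit cast:
        char back = (char)i;

        Console.WriteLine($"{us} {i} {l} {d} {back}"); // 65 65 65 65 A
    }
}
```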
