According to the Java standard, the short and char types are both 2 bytes wide, so when I write something like:

char ch = 'c';
short s = ch;

I get an error saying "possible loss of precision". What am I missing here?
char is unsigned, short is signed. So while both are 2 bytes long, they use the sixteenth bit for different purposes.

The range of the char type is 0 to 2^16 − 1 (0 to 65,535). The short range is −2^15 to 2^15 − 1 (−32,768 to 32,767).

Because char is unsigned, half of its range of values is too big to be represented as a short (and, symmetrically, char cannot represent any of the negative values a short can). That is why the compiler warns about possible loss of precision and requires an explicit cast.
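A minimal sketch of the behavior (the class name is my own): casting 'c' to short is safe because its code point (99) fits, but casting a char above 32,767 reinterprets the top bit as a sign bit.

```java
public class CharShortDemo {
    public static void main(String[] args) {
        char ch = 'c';
        // short s = ch;        // compile error: possible loss of precision
        short s = (short) ch;   // explicit cast; 'c' is 99, which fits in short
        System.out.println(s);  // prints 99

        char big = '\uFFFF';    // 65535, outside short's positive range
        short truncated = (short) big;
        System.out.println(truncated); // prints -1: top bit becomes the sign bit
    }
}
```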