I inherited some code and can't figure out one piece of it:
byte[] b = new byte[4] { 3, 2, 5, 7 };
int c = (b[0] & 0x7f) << 24 | b[1] << 16 | b[2] << 8 | b[3];
Can anyone tell what's happening here?
Basically, it converts the lower 31 bits of a 4-byte array into an integer using a big-endian conversion; the & 0x7f masks off the top bit of the first byte, which is why only 31 bits survive and the result is always non-negative.
So a byte array of { 0, 0, 0, 1 } would be converted to 1, a byte array of { 0, 0, 1, 0 } would be converted to 256, and so on.
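If it helps to see it run, here is a minimal C# sketch (the helper name ToInt31 is my own, not from the original code) that wraps the expression and checks those examples:

using System;

class BigEndianDemo
{
    // Big-endian conversion of the lower 31 bits of a 4-byte array.
    static int ToInt31(byte[] b)
    {
        return (b[0] & 0x7f) << 24 | b[1] << 16 | b[2] << 8 | b[3];
    }

    static void Main()
    {
        Console.WriteLine(ToInt31(new byte[] { 0, 0, 0, 1 }));    // 1
        Console.WriteLine(ToInt31(new byte[] { 0, 0, 1, 0 }));    // 256
        Console.WriteLine(ToInt31(new byte[] { 3, 2, 5, 7 }));    // 50464007
        Console.WriteLine(ToInt31(new byte[] { 0xff, 0, 0, 0 })); // 2130706432 -- top bit masked off
    }
}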
It does this through a mixture of bitwise operators:
& is bitwise "and"
| is bitwise "or"
<< is the left shift operator
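A quick illustration of each on small values (just a sketch; the binary literals need C# 7.0 or later):

int andBits = 0b0110 & 0b0011; // 0b0010 = 2: keeps bits set in both operands
int orBits  = 0b0110 | 0b0011; // 0b0111 = 7: keeps bits set in either operand
int shifted = 0b0001 << 3;     // 0b1000 = 8: shifting left by n multiplies by 2^n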
Haven't done bit math in a while, so, for fun:
[extra parentheses to show order of operations]
((b[0] & 0x7f) << 24) | (b[1] << 16) | (b[2] << 8) | b[3]
(b[0] & 0x7f) << 24 = 0000 0011 0000 0000 0000 0000 0000 0000
b[1] << 16          = 0000 0000 0000 0010 0000 0000 0000 0000
b[2] << 8           = 0000 0000 0000 0000 0000 0101 0000 0000
b[3]                = 0000 0000 0000 0000 0000 0000 0000 0111
Now OR these together, and you get:
0000 0011 0000 0010 0000 0101 0000 0111 = 50,464,007
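And the same steps in code, if you want to double-check the intermediate values (the hex column is mine, added for readability):

int t0 = (3 & 0x7f) << 24;  // 50331648 = 0x03000000
int t1 = 2 << 16;           // 131072   = 0x00020000
int t2 = 5 << 8;            // 1280     = 0x00000500
int t3 = 7;                 // 7        = 0x00000007
int c  = t0 | t1 | t2 | t3; // 50464007 = 0x03020507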