Background: I am receiving an array as a char* as part of a socket session. We then have to match tokens (HTTP headers) out of it. In the code we create a UBYTE* and take the value from the char array after casting it to UBYTE*. Later we pass the same UBYTE pointer to another function which accepts char*, after casting it back to char*.

The problem is that this works in the release build but not in the debug build (with -g and a different optimization level). What's more, adding a few prints in debug mode hides the problem.

So my question here is: what is the difference between a UBYTE pointer (which is essentially an unsigned char*) and a char pointer? Changing UBYTE to char solves my problem in all modes, but I don't have any explanation for it. Any thoughts?
There is nothing wrong with casting between char* and unsigned char*. If you're getting unexpected behavior which varies depending on optimization levels, there's certainly a bug in your code, but it probably has little to do with discarding signedness in the cast.
Aside from that, UBYTE is a pretty ridiculous typedef, since there exists a standard C type, uint8_t, which is identical and defined in stdint.h.
Perhaps you could explain why you thought you had to use an unsigned char in the first place? And what does "doesn't work" mean, exactly?
void*, char* and unsigned char* have different semantics, and you should use them accordingly:

- void* points to unspecific data with which you can't do anything unless you cast it to some real type
- char* unfortunately has two different meanings: either a text string, or unspecific data which may be addressed (patched) at a low (byte) level
- signed char and unsigned char are small-width integers on which you want to perform arithmetic