Something that has piqued my interest is Objective-C's BOOL type definition. Why is it defined as a signed char (which can cause unexpected behaviour if a value too large to fit in one byte is assigned to it, since the truncated result may be zero) rather than as an int, the way C treats boolean expressions (much less margin for error: zero is false, any non-zero value is true)?
The only reason I can think of is that the Objective-C designers were micro-optimising storage, because a char uses less memory than an int. Can someone please enlighten me?
Remember that Objective-C was created back in the 1980's, when saving bytes really mattered.
As mentioned in a comment, as long as you stick with the values YES and NO, everything will be fine.