Xcode - Debugger does not show correct value

Let's try this:

#include <math.h>                  // for round()
#import <Foundation/Foundation.h>  // for NSLog

float test = 3.56;
float roundedVal = round(test * 10.0f) / 10.0f;
NSLog(@"%f", roundedVal);

Why does NSLog show 3.600000 but the debugger 3.5999999? Which is the correct value I can count on?


A float is not very precise, and you can't expect it to be displayed identically from one "view" to the next; what you see depends on a variety of factors. Read What Every Computer Scientist Should Know About Floating-Point Arithmetic.

It boils down to this: use double if you really want precision.


The debugger shows the correct value. You can see this with a simplified version of the code above:

float test = 3.5999999;
NSLog(@"%f", test);

In this case, you get the same results as described above: the log states 3.600000 while the debugger states 3.5999999. In all cases, the debugger has the more accurate value compared to NSLog. Digging a bit deeper, we can see that NSLog is slightly massaging the float value: the %f specifier rounds its output to six decimal places.

In reality, you should probably use a double here to maintain the precision you are looking for.


There is no "correct value". The expected value 3.6 does not have an exact binary representation, so the value the computer uses internally is an approximation. When the value is printed, it has to be converted back to decimal. In your case, the binary representation is in fact slightly less than 3.6; NSLog happens to round it up, while the debugger rounds it down.

Using double instead of float, as the accepted answer suggests, does not change anything about this: double is also a binary format, so it cannot represent 3.6 exactly either; it only shrinks the error.
