I am writing just these four lines in viewDidLoad and getting unexpected output:
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad {
[super viewDidLoad];
float a = 15.264;
NSLog(@"FLoat is %f",a);
float b = 10/60;
NSLog(@"FLoat is %f",b);
}
The first NSLog displays the correct value that was assigned to a, but b displays 0.000000. Why?
If you want floating-point literals, write them in the form x.xf, e.g. float b = 10.0f/60.0f;
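Applied to the original line, the fix would look like this (the comments describe standard C arithmetic behavior):

float b = 10.0f / 60.0f;   // both operands are float literals, so the division happens in floating point
NSLog(@"Float is %f", b);  // prints 0.166667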
You are assigning the result of an integer division to a float, meaning the result is truncated before being assigned. In other words, 10/60, which would be 0.1666..., yields the result 0, which is then assigned to b. So b has the value zero.
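A minimal sketch (a standalone Foundation command-line program rather than the asker's view controller, purely for illustration) that makes the truncation visible:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        int   quotient  = 10 / 60;        // integer division: 0.1666... truncates to 0
        float truncated = 10 / 60;        // same integer division; only the resulting 0 is converted to 0.0f
        float divided   = 10.0f / 60.0f;  // floating-point division keeps the fraction
        NSLog(@"quotient  = %d", quotient);   // 0
        NSLog(@"truncated = %f", truncated);  // 0.000000
        NSLog(@"divided   = %f", divided);    // 0.166667
    }
    return 0;
}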
Since both operands are integers, the result of your division is an integer (0). If you want a floating-point result, use float b = 10.0/60.0 or even float b = 10.0/60, because at least one of the operands has to be a floating-point value.
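For example, any of these give 0.166667, since one floating-point operand is enough to promote the whole division:

float b1 = 10.0 / 60;       // left operand is a double literal, so the division is done in double, then stored in a float
float b2 = 10 / 60.0f;      // right operand is a float literal
float b3 = (float)10 / 60;  // an explicit cast also works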
Try float b = 10.0/60.0; because plain 10/60 returns an int.