So I have this code:
p.Value = 1;
decimal avg = p.Value * 100 / 10000;
string prntout = p.Key + " : " + avg.ToString();
Console.WriteLine(prntout);
But the program prints out 0 instead of 0.01. p.Value is an int. How do I fix that?
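For context, here is a minimal reproduction. The type of p isn't shown in the question, so Entry below is a hypothetical stand-in for whatever pair type has a string Key and a settable int Value:
using System;

class Entry
{
    public string Key { get; set; }
    public int Value { get; set; }
}

class Program
{
    static void Main()
    {
        var p = new Entry { Key = "score" };
        p.Value = 1;

        // Every operand here is an int, so the whole expression is integer arithmetic:
        decimal avg = p.Value * 100 / 10000;   // 1 * 100 = 100, then 100 / 10000 = 0

        string prntout = p.Key + " : " + avg.ToString();
        Console.WriteLine(prntout);            // prints "score : 0"
    }
}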
Change one of the literals into a decimal:
decimal avg = p.Value * 100m / 10000;
Now, to explain why this works:
Let's process the original line one operation at a time, substituting 1 for p.Value:
decimal avg = 1 * 100 / 10000; // int multiplication
decimal avg = 100 / 10000; // int division, remainder tossed out
decimal avg = (decimal) 0; // implicit cast
By changing 100 to 100m, it's now:
decimal avg = 1 * 100m / 10000; // decimal multiplication
decimal avg = 100m / 10000; // decimal division
decimal avg = 0.01m;
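A quick sketch of that difference side by side, using a local int variable as a stand-in for p.Value:
using System;

class Demo
{
    static void Main()
    {
        int value = 1;                           // stand-in for p.Value

        decimal before = value * 100 / 10000;    // all int operands: truncates to 0
        decimal after  = value * 100m / 10000;   // 100m promotes the multiplication and division to decimal

        Console.WriteLine(before);               // 0
        Console.WriteLine(after);                // 0.01
    }
}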
The expression p.Value * 100 / 10000
uses only integer operands, so it is evaluated under integer division rules and the fractional part is discarded.
Change one (or more) of the operands to a decimal and it will perform as expected:
p.Value * 100 / 10000m
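A minimal sketch of that variant, again with a stand-in variable for p.Value. Note that the multiplication still happens in int here; only the division is promoted to decimal:
using System;

class DivisorDemo
{
    static void Main()
    {
        int value = 1;                 // stand-in for p.Value

        // value * 100 is still int multiplication (result 100),
        // but dividing by 10000m converts that result to decimal before dividing:
        decimal avg = value * 100 / 10000m;

        Console.WriteLine(avg);        // 0.01
    }
}
If p.Value could ever be large enough for p.Value * 100 to overflow an int, promote an earlier operand instead (for example 100m).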
Try changing this:
decimal avg = p.Value * 100 / 10000;
to
decimal avg = Convert.ToDecimal(p.Value) * 100m / 10000m;
Your original line used all integers.
If p.Value is an integer, you will lose the fractional part on this line:
decimal avg = p.Value * 100 / 10000;
So you can do this:
decimal avg = (decimal)p.Value * 100 / 10000;
Hope it helps.