Taking Logarithms of relatively small numbers in different languages/architectures/operating systems


In Java I run:

System.out.println(Math.log(249.0/251.0));

Output: -0.008000042667076265

In C# I run: <- fixed

Math.Log(x/y); // where x, y are almost assuredly 249.0 and 251.0 respectively

Output: -0.175281838 (printed out later in the program)

Google claims:

Log(249.0/251.0)

Output: -0.00347437439

And Mac OS claims about the same thing (the first difference between Google and Snow Leopard is at about 10^-8, which is negligible).

Is there any reason that these results should all vary so widely, or am I missing something very obvious? (I did check that Java and C# both use base e.) Even mildly different values of e wouldn't account for such a big difference. Any suggestions?

EDIT:

Verifying on Wolfram Alpha seems to suggest that Java is right (or that Wolfram Alpha uses Java's Math for logarithms...) and that my C# program doesn't have the right input, but I am disinclined to believe this because taking e^(Google result) - 249/251 gives me an error of 0.0044, which is pretty big in my opinion, suggesting that there is a different problem at hand...
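A minimal C# sketch of that check (assuming the Google figure quoted above and the exact ratio 249.0/251.0; the class name is just for illustration):

class CheckGoogleFigure
{
  static void Main()
  {
      double googleResult = -0.00347437439;  // Google's reported value
      double ratio = 249.0 / 251.0;

      // e^(Google's result) misses 249/251 by roughly 0.0045...
      System.Console.WriteLine(System.Math.Exp(googleResult) - ratio);

      // ...while 10^(Google's result) recovers it almost exactly,
      // i.e. the Google figure behaves like a base-10 logarithm.
      System.Console.WriteLine(System.Math.Pow(10.0, googleResult) - ratio);
  }
}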


You're looking at logarithms with different bases:

  • Java's System.out.println(Math.log(249.0/251.0)); is the natural log (base e)
  • C#'s Math.Log(x, y) gives the log of x in the base specified by y
  • Google's Log(249.0/251.0) gives the base-10 log

Though I don't get the result you do from C# (Math.Log(249.0, 251.0) == 0.998552147171426).
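For comparison, a minimal C# sketch (assuming x = 249.0 and y = 251.0, as in the question) that reproduces each of the three reported figures from its base:

class LogBases
{
  static void Main()
  {
      double x = 249.0, y = 251.0;

      // Natural log (base e): matches Java's -0.008000042667076265
      System.Console.WriteLine(System.Math.Log(x / y));

      // Base-10 log: matches Google's -0.00347437439
      System.Console.WriteLine(System.Math.Log10(x / y));

      // Log of x in base y: the two-argument overload, 0.998552147171426
      System.Console.WriteLine(System.Math.Log(x, y));
  }
}

Notably, none of these produces -0.175281838.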


You have a mistake somewhere in your C# program between where the log is calculated and where it is printed out. Math.Log gives the correct answer:

class P
{
  static void Main()
  {
      System.Console.WriteLine(System.Math.Log(249.0/251.0));
  }
}

prints out -0.00800004266707626
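If it helps narrow down where the value goes wrong, a hedged debugging sketch (the names x, y and LogOfRatio are hypothetical, since the original program isn't shown) that prints the inputs immediately before the call:

class Debugging
{
  // Hypothetical stand-in for wherever the program computes the log.
  static double LogOfRatio(double x, double y)
  {
      // Show the actual arguments right before the call, so a wrong or
      // reassigned input is visible next to the result.
      System.Console.WriteLine("x = {0}, y = {1}", x, y);
      return System.Math.Log(x / y);
  }

  static void Main()
  {
      System.Console.WriteLine(LogOfRatio(249.0, 251.0));  // -0.00800004266707626
  }
}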
