If there are 2 number systems, e.g. the decimal number system with 10 symbols (0-9) and hexadecimal with 16 symbols (0-9, A-F), can I conclude that with hexadecimal I will be able to retain greater accuracy than with the decimal number system, due to the higher number of symbols?
Edit1: Sorry, I was asking only from the point of view of a computer; it may be for a written calculation or anything.
Accuracy is related to non-integer (real) numbers. It depends on:
- Radix (decimal, hex)
- Number of digits
For the same number of digits, hex can be more accurate.
Simplest example: radices decimal and binary, with one digit after the point. In decimal you can go from 0.0 to 0.9 in steps of 0.1; in binary only from 0.0 to 0.1 (0.5 in decimal), in steps of 0.5. Decimal is 5 times more accurate.
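For instance, a minimal Python sketch (the helper name `one_digit_fractions` is just illustrative) enumerating every value representable with a single digit after the point:

```python
# Fractions representable with one digit after the point in a given base:
# k / base for k = 0 .. base-1, so the step size is 1/base.
def one_digit_fractions(base):
    return [k / base for k in range(base)]

print(one_digit_fractions(10))  # 0.0, 0.1, ..., 0.9 -> step 0.1
print(one_digit_fractions(2))   # 0.0, 0.5           -> step 0.5
print(one_digit_fractions(16))  # step 1/16 = 0.0625, the finest of the three
```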
They are two different bases for representing numbers. Neither offers greater accuracy than the other as they are simply different ways of representing a numerical value. Decimal is base 10 whereas hexadecimal is base 16. You could as easily represent any decimal or hex value in octal (base 8) or binary (base 2).
Precision is a better description of what I'm guessing you are referring to. The precision at which a number is stored dictates the accuracy of the representation of that number.
32-bit precision (single precision) is less precise than 64-bit precision (double precision) because there are fewer bits in which to store the significand.
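One way to see this, assuming Python and its struct module to force a value through the 32-bit format:

```python
import struct

x = 0.1
# Round-trip x through IEEE 754 single (32-bit) and double (64-bit) precision.
single = struct.unpack('f', struct.pack('f', x))[0]
double = struct.unpack('d', struct.pack('d', x))[0]

print(f"{single:.20f}")  # 0.10000000149011611938 (24-bit significand)
print(f"{double:.20f}")  # 0.10000000000000000555 (53-bit significand)
```

Both are approximations of 0.1; the double-precision one is simply much closer.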
No. 0x0f = 15 (base 10). Neither has greater accuracy; one just has more characters. And remember these representations are for our benefit: your computer sees everything as binary.
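A quick Python illustration of the point:

```python
# Different notations, same value -- the stored bits are identical.
print(0x0f == 15)          # True
print(hex(255), bin(255))  # 0xff 0b11111111
```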
You cannot say anything about accuracy unless you know what you are trying to represent.
For example, if you are trying to represent things measured in hundredths, e.g., pennies, decimal is more accurate than hex for the same number of digits (2). You cannot represent 0.10 exactly in hex (or binary), but you can in decimal.
See the General Decimal Arithmetic web page for more details.
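As a small sketch of the pennies example, using Python's decimal module (which follows that specification), contrasted with binary floating point:

```python
from decimal import Decimal

# Ten dimes in binary floating point don't quite make a dollar...
print(sum([0.10] * 10))             # 0.9999999999999999
# ...but in a decimal representation 0.10 is exact, so they do.
print(sum([Decimal('0.10')] * 10))  # 1.00
```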