Is there any benefit to structuring boolean expressions like:
if (0 < x) { ... }
instead of
if (x > 0) { ... }
I have always used the second way, putting the variable as the first operand and using whichever comparison operator makes sense. Lately, though, I have read code that uses the first method, and after getting over the initial weirdness I am starting to like it a lot more.
Now I have started to write all my boolean expressions using only < or <=, even if that means the variable isn't the first operand, as in the example above. To me it seems to increase readability, but that might just be me :)
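For instance, a range check written both ways (the bound of 10 here is just for illustration):
if (x > 0 && x <= 10) { ... }   /* conventional operand order: mixes > and <= */
if (0 < x && x <= 10) { ... }   /* only < and <=, so it reads like 0 < x <= 10 */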
What do other people think about this?
Do whatever is most natural for whatever expression you are trying to compare.
If you're wondering about other operations (like ==), there are previous topics comparing the orderings of operands for those comparisons (and the reasons why).
It is mostly done to avoid the problem of writing = instead of == in if conditions. To keep things consistent, many people use the same ordering with the other operators as well. I do not see any problem in doing it.
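For example, keeping the constant on the left across different operators (the values here are made up):
if (0 == x) { ... }    /* constant first: a typo like (0 = x) fails to compile */
if (0 < x) { ... }     /* the same constant-first ordering kept for <          */
if (10 >= x) { ... }   /* ...and for >=, even though (x <= 10) is more usual   */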
Use whatever 'reads' best. One thing I'd point out is that when I'm testing whether a value is within bounds, I try to write it so the bounds are on the 'outside', just as they would be in a mathematical expression:
So, to test that (0 < x <= 10):
if ((0 < x) && (x <= 10)) { ... }
instead of
if ((0 < x) && (10 >= x)) { ... }
or
if ((x > 0) && (10 >= x)) { ... }
I find this pattern makes it somewhat easier to follow the logic.
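A minimal sketch of that pattern pulled into a small helper (the name in_range and the bounds are made up for illustration):
#include <stdbool.h>

/* true when lo < x <= hi, written with the bounds on the 'outside' */
static bool in_range(int lo, int x, int hi)
{
    return (lo < x) && (x <= hi);
}

/* usage: reads left to right like the mathematical statement 0 < x <= 10 */
if (in_range(0, x, 10)) { ... }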
An advantage of putting the number first is that it can prevent the bug of using = when == is intended.
if ( 0 == x ) // fine: comparison
if ( 0 = x )  // compiler error: cannot assign to a constant
Compare that to the subtle bug:
if ( x = 0 ) // assignment, not comparison; most likely a typo
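A self-contained sketch of the same cases, showing the observable effect of the typo (plain C; the variable x and its value are made up):
#include <stdio.h>

int main(void)
{
    int x = 5;

    /* if (0 = x)  -- would not compile: cannot assign to a literal */

    if (x = 0) {                 /* compiles, but assigns 0 to x and tests the result */
        puts("never reached");
    }
    printf("x is now %d\n", x);  /* prints 0: the typo silently clobbered x */
    return 0;
}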
To be honest, it's unusual to write expressions with the variable on the right-hand side, and as a direct consequence of that unusualness, readability suffers. Coding conventions have intrinsic value merely by virtue of being conventions; people are used to code being written in particular standard ways, x >= 0 being one example. Deviating from simple norms like these should be avoided without good cause.
The fact that you had to "get over the initial weirdness" should perhaps be a red flag.
I would not write 0 < x, just as I would not use Hungarian notation in Java. When in Rome, do as the Romans do. The Romans write x >= 0. No, it's not a huge deal; it just seems like an unnecessary little quirk.