I want to check if a reference type is null. I see two options (_settings is of reference type FooType):
if (_settings == default(FooType)) { ... }
and
if (_settings == null) { ... }
Do these two behave any differently?
There's no difference. The default value of any reference type is null.
MSDN's C# reference page for the default keyword: https://msdn.microsoft.com/en-us/library/25tdedf5.aspx
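To see this concretely, here is a minimal sketch (FooType and _settings are taken from the question; the Demo wrapper is just for illustration):

    public class FooType { }

    public class Demo
    {
        private FooType _settings = null;

        public void Check()
        {
            // For any reference type, default(FooType) is null,
            // so these two conditions are always equivalent.
            bool viaDefault = _settings == default(FooType);
            bool viaNull = _settings == null;

            System.Console.WriteLine(viaDefault == viaNull); // True
        }
    }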
Now that you no longer need to pass the type to default (the target-typed default literal, introduced in C# 7.1), default is preferred:
- It is just as readable.
- It can be used for both value types and reference types.
- It can be used in generics.
if (_settings == default) { ... }
Also, after calling
obj = enumerable.FirstOrDefault();
it makes more sense to test against default rather than null; otherwise the method would have been named FirstOrNull. Value types don't have a null value, but they do have a default.
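A small sketch of that point (the array and variable names are just for illustration): an empty sequence of value types has no null to fall back to, so FirstOrDefault returns the type's default instead.

    using System.Linq;

    class FirstOrDefaultDemo
    {
        static void Main()
        {
            // Value types: there is no null, so an empty sequence
            // yields default(int), which is 0.
            int[] noNumbers = new int[0];
            int number = noNumbers.FirstOrDefault();
            System.Console.WriteLine(number == default); // True (number is 0)

            // Reference types: default and null are the same thing,
            // so either check reads the same.
            string[] noStrings = new string[0];
            string text = noStrings.FirstOrDefault();
            System.Console.WriteLine(text == default); // True (text is null)
        }
    }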
There is no difference, but the second one is more readable. The best place to use default is when you deal with generics, where code like return default(T); is common.
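For example, a minimal sketch of that pattern, using a hypothetical GetOrDefault extension method:

    using System.Collections.Generic;

    static class LookupExtensions
    {
        public static TValue GetOrDefault<TKey, TValue>(
            this IDictionary<TKey, TValue> source, TKey key)
        {
            // default(TValue) is null for reference types
            // and the zero/empty value for value types.
            return source.TryGetValue(key, out TValue value) ? value : default(TValue);
        }
    }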
They are not different, but I think
if (_settings == null) { ... }
is clearer.
My understanding is that they are not different; the distinction only matters when you are dealing with value types.
I would definitely go with the explicit check against null, because if the type of the _settings field ever changes, a check written against default(FooType) has to change with it. At minimum it would require a change to that code, breaking the open/closed principle.
if (_settings == null) { ... }
This IMO is safer and cleaner.
As has been mentioned, there is no difference... but you might want to use default(<type>) anyway, to handle the cases where it's not a reference type. Typically this only comes up in generics, but it's a good habit to form for the general case.
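Note that with an unconstrained type parameter T the == operator is not available, so the usual way to compare against the default is EqualityComparer<T>.Default. A minimal sketch, using a hypothetical IsDefault helper:

    using System.Collections.Generic;

    static class GenericChecks
    {
        // == is not defined for an unconstrained T,
        // so the comparison goes through EqualityComparer<T>.
        public static bool IsDefault<T>(T value)
        {
            return EqualityComparer<T>.Default.Equals(value, default(T));
        }
    }

    // IsDefault<string>(null) -> true  (default of a reference type is null)
    // IsDefault(0)            -> true  (default of int is 0)
    // IsDefault("hello")      -> false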