In most languages, such as C, C++, Ruby, Python, and Java, attempting to send a message to nil (or dereference a NULL pointer) will do one of two things: either your program will crash outright, or a runtime exception will be thrown, which, if unhandled, crashes your program anyway.
For example, in Ruby:
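A minimal sketch (`account` and `balance` are hypothetical names; the rescue is only there so you can see the failure without killing the program):

```ruby
# Calling a method on nil raises NoMethodError; if nothing catches it,
# the program terminates.
account = nil
begin
  account.balance  # nil doesn't respond to `balance`
rescue NoMethodError => e
  puts "Crashed with: #{e.class}"
end
```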
One of my favorite languages, Objective-C, does not follow this convention: it instead treats any message to nil as a no-op and happily continues running your program.
Either behavior will allow you to write quality software, so what are the design tradeoffs between them? Throwing an exception can be considered safer, because it might be dangerous to proceed if the programmer had not foreseen and explicitly handled the case where the account could not be found. On the other hand, Objective-C returns nil (or 0) as a convenience to the programmer. The assumption is that, if the programmer cares about handling the case where account is nil, he can check for it himself.
Suppose that you would like to display the amount of the most recent transaction in a person’s bank account. If the person doesn’t exist, or he doesn’t have a bank account, or the account has no transactions, then you don’t need to do anything. In most languages, you need to do explicit checking:
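Here's what that explicit checking might look like in Ruby, using hypothetical classes (`Person`, `BankAccount`, `Transaction`) standing in for a real model:

```ruby
# Hypothetical model classes for illustration.
Transaction = Struct.new(:amount)
BankAccount = Struct.new(:transactions)
Person      = Struct.new(:bank_account)

# Every link in the chain has to be guarded against nil by hand.
def latest_transaction_amount(person)
  return nil if person.nil?
  account = person.bank_account
  return nil if account.nil?
  transactions = account.transactions
  return nil if transactions.nil? || transactions.empty?
  transactions.last.amount
end

def display_latest_transaction(person)
  amount = latest_transaction_amount(person)
  puts "Most recent transaction: #{amount}" unless amount.nil?
end
```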
When messaging nil is allowable, the following code will run correctly even if person is nil:
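A sketch in Objective-C (`bankAccount`, `transactions`, and `displayAmount:` are hypothetical method names):

```objc
// If person, its bankAccount, or its transactions array is nil, each
// message in the chain is simply a no-op that returns nil, and a
// message to nil with a scalar return type (like amount) returns 0.
[self displayAmount:[[[[person bankAccount] transactions] lastObject] amount]];
```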
In my opinion, the latter is far more readable and elegant. And cases like this come up all the time: if you have an object, display some of its info to the user; if you have a parent view, ask it whether it's currently visible, and if you don't have a parent view, that's okay. Leveraging the ability to safely message nil helps keep your code simpler and easier to read.
Although there is a slight loss of implicit safety, I think it's well worth it for a smaller, more concise codebase. I would love to see other languages allow you to safely message nil as well.
Unfortunately, while some languages, such as Ruby, make it possible for you to monkey-patch this behavior in, I don't believe it's safe to change such a fundamental assumption in an existing codebase, or to defy the conventions of a language and its established community.
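For the curious, here is a sketch of what such a monkey-patch might look like in Ruby, using `method_missing` on `NilClass` so that any unknown message to nil quietly returns nil, mimicking Objective-C. Again, this is an illustration, not a recommendation:

```ruby
# Sketch only: silently changing this core assumption will mask bugs
# across an entire codebase. Don't ship this.
class NilClass
  def method_missing(name, *args, &block)
    nil  # swallow every unknown message, Objective-C style
  end

  def respond_to_missing?(name, include_private = false)
    true
  end
end

nil.balance  # => nil instead of raising NoMethodError
```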
What are your thoughts? Would you like to see more languages with this feature?