Chapter 10. Using Optional as a better alternative to null
This chapter covers
What's wrong with null references and why you should avoid them
From null to Optional: rewriting your domain model in a null-safe way
Putting optionals to work: removing null checks from your code
Different ways to read the value possibly contained in an optional
Rethinking programming given potentially missing values
Raise your hand if you ever got a NullPointerException during your life as a Java developer.
Keep it up if this is the exception you encounter most frequently. Unfortunately, we can't see
you at this moment, but we believe there's a very high probability that your hand is raised now.
We also guess you may be thinking something like “Yes, I agree, NullPointerExceptions
are a pain for any Java developer, novice or expert, but there's not much we can do about them,
because this is the price we pay for such a convenient, and maybe unavoidable, construct as
null references.” This is a common feeling in the (imperative) programming world; nevertheless,
it may be less the whole truth than a bias with solid historical roots.
A British computer scientist named Tony Hoare introduced null references back in 1965 while
designing ALGOL W, one of the first typed programming languages with heap-allocated records,
“simply because it was so easy to implement.” Despite his goal “to ensure that all use of
references could be absolutely safe, with checking performed automatically by the compiler,” he
decided to make an exception for null references, because he thought this was the most
convenient way to model the absence of a value. After many years he came to regret this decision,
calling it “my billion-dollar mistake.” We've all seen its effect: we examine a field of an object,
perhaps to determine whether its value is one of two expected forms, only to find that we're
examining not an object but a null pointer, which promptly raises that annoying
NullPointerException.
In reality, Hoare's statement could underestimate the costs incurred by millions of developers
fixing bugs caused by null references in the last 50 years. Indeed, the vast majority of the
languages[1] created in recent decades, including Java, have been built with the same design
decision, maybe for reasons of compatibility with older languages, or more probably, as Hoare
states, “simply because it was so easy to implement.” Let's start by looking at a simple example
to understand the problems with null.
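To make that concrete, here is a minimal sketch of the kind of code we mean; the Person, Car, and Insurance classes and the NullDemo driver below are illustrative names of our own, not types from any library.

class Insurance {
    private String name;
    public String getName() { return name; }
}

class Car {
    // May be null: the car might be uninsured.
    private Insurance insurance;
    public Insurance getInsurance() { return insurance; }
}

class Person {
    // May be null: the person might not own a car.
    private Car car;
    public Car getCar() { return car; }
}

public class NullDemo {
    // Looks harmless, but throws a NullPointerException as soon as any
    // link in the chain is null (a person without a car, or a car
    // without insurance).
    static String getCarInsuranceName(Person person) {
        return person.getCar().getInsurance().getName();
    }

    public static void main(String[] args) {
        // getCar() returns null here, so calling getInsurance() on that
        // result raises the NullPointerException this chapter is about.
        System.out.println(getCarInsuranceName(new Person()));
    }
}

The usual defensive fix is to wrap every access in a null check, which quickly buries the business logic; the rest of this chapter shows how Optional lets us express “this value may be absent” in the type itself instead.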
 