“Worse Is Better” 25 Years Later – Have We Gotten Lazy?


25 years ago, Richard Gabriel introduced the idea of two competing styles of design: worse-is-better vs right-thing. Here’s how he described the core philosophy of one. Can you tell which one it is?

  • Simplicity: Implementation simplicity is the most important consideration in a design.
  • Completeness: The design should cover only necessary situations.
  • Correctness: The design should be correct in all observable aspects.
  • Consistency: The design should be consistent as far as it goes. Consistency is less of a problem because you should always choose the smallest scope for the first implementation.

That philosophy is worse-is-better. Worse doesn't mean bad; it means merely worse than perfect, only good enough to succeed.

Here’s the competing right-thing philosophy:

  • Simplicity: It is more important for the interface to be simple than the implementation.
  • Completeness: All reasonably expected cases must be covered.
  • Correctness: Incorrectness is simply not allowed.
  • Consistency: The design must not be inconsistent. A design is allowed to be slightly less simple and less complete to avoid inconsistency.

When Gabriel wrote this, the Lisp community was in rough shape. Lisp machines were losing to general-purpose Unix machines, and Lisp itself was losing to C. He invented the worse-is-better caricature to explain how Right things can lose in the marketplace to obviously Worse things. Gabriel later developed worse-is-better into a fuller model of how software ideas gain acceptance.

Worse and Worse?

Over the last 25 years, the world has moved significantly further toward the worse-is-better model, so much so that many developers have never worked on a right-thing project. Some have never even used right-thing tools. Unix and the web, the poster children of worse-is-better, are now widely respected as good design. Agile, TDD, YAGNI, and many other popular modern development ideas embody a philosophy similar to worse-is-better.

Right-thing ideas are rare and sometimes seem feasible only for monopolists or academics. Apple abandoned worse-is-better Objective-C for right-thing Swift. Microsoft replaced worse-is-better DOS with right-thing Windows NT. Haskell and Clojure are successful right-thing languages, but they are perceived as difficult, academic languages (assuming anyone has any perception of them at all).

Unlike Paul Chiusano, I don't think worse-is-better is a dangerous meme or a cause of technical debt. Gabriel's observation would still be true without anyone giving it a name; it's how ideas succeed or fail in any market, not just software.

On the other hand, I completely agree with Chiusano that we've started to think small and take small risks. And given our large-scale slide toward worse-is-better, there's a real possibility that worse-is-better will simply degrade into bad, lazy design, and we will lose the ability to recognize the right-thing.

Worse-is-better only works when you understand the right-thing and rationally choose a compromise.

Learning to See the Right Thing

So even though I’m an agile, Unix, web developer, I highly recommend experiencing right-thing design. Pick a small system or side project and do everything the right-thing way. Don’t build a minimum viable product; go for fully complete and correct. Develop your ability to see the right-thing while practicing worse-is-better. I think it’s the only way to defeat Gabriel’s admonition:

“The future will be in the hands of the worst of our fruits.”
– Richard Gabriel

Sometimes it takes a tough man to make a tender chicken.