Developers are constantly making new tools that make it easier for us to get our own jobs done. There’s always a shiny new framework around the corner with a rich, well-documented API. There’s a faster, more idiomatic language that’s just about to hit its big 1.0 release. There’s a new software development practice, a new way to run tests, a new tool for team communication and code review. There’s a metric ton of failures, of course, but as a general trend, things just keep getting better — I can happily report that I have never looked a punch card in the eye.
But, unfortunately for us, there’s nothing that guarantees our users will have as wonderful an experience using a program as we had writing it. (And, perhaps fortunately, vice versa.) Joe from HR, logging into $ENTERPRISE WEB APP$ every morning, isn’t going to care whether the code is pristine. He’s not going to care whether the original developers used TDD. It could be an app hacked together from an unholy combination of PHP and bash scripting by one cowboy coder over the course of three sleepless, Mountain Dew-fueled nights. It could use XML-formatted text files to store persistent data. None of this is going to matter directly to the end user. He’s only going to care about whether or not it helps him do his job.
And in the end, that’s what really matters. We write software to be used, after all.
I’m not trying to say that these better tools, better practices, and newer frameworks are useless. Quite the opposite. They’re amazing. But their usefulness, first and foremost, is that they make it easier for us as developers to build better software. “Better software” is software that makes our users’ lives just a little easier. If it doesn’t meet that goal, it’s useless. Sometimes worse than useless.
Technically it “Works,” But…
My mother has been a practicing physician for over twenty years. She adopted Electronic Medical Records last year, and at the time she was pretty ecstatic. But the EMR software has doubled the time it takes her to record her notes. She has to go in and out of five or six different submenus to input any data on a patient chart. The menu groupings themselves aren’t sorted in any order, alphabetical or otherwise — input fields are an odd mix of freeform text and selects stuffed so full of options that they run off the page. It’s full of alert modals, tiny fonts, ambiguous controls, and just about every other design anti-pattern you could think of. Things that used to take all of thirty seconds for her to write down now take upwards of twenty minutes to input and save.
And the thing about this software is — as far as she’s told me, there are no bugs. She’s never reported it actually doing anything against its own logic, however nonsensical that logic may be. For all I know, the code is beautiful. For all I know, there’s a CI monitor somewhere that has cheerfully read “Tests 100% passing” in green text since the app was first deployed. (I seriously doubt this. I strongly suspect it’s a Waterfall app written in an outdated version of Java by a team of well-paid, miserable developers. But theoretically, the codebase could be absolutely pristine and the developers all well-rested and happy, and it wouldn’t make any difference to her.)
Start with the Users
The best silver bullet I have so far is actually sitting down next to a client while they’re trying to use your software to do their job, and watching what they do. That way you can see whether they struggle to find a certain feature, or would benefit from a new one. It’s easy to catch lots of tiny but time-saving things that way (form fields not autofocusing, for example). The practice has a formal name: a “go and see.” It hooks naturally into Agile and XP development practices — talk to the client, get your user stories sorted. Rough out some features. Put it in front of the user, get feedback. Go back and tweak what you initially thought you needed. Rinse, repeat.
I think it’s unfortunate that computer science programs don’t normally require any kind of interaction design coursework. It’s true that for some software it really doesn’t matter — command line tools, hardware drivers, kernel code — but the vast majority of the time, we’re building something that needs to speak to a human.