A User Interface Emergency

 

The Panic Button

Usability and human-centered design are, of course, paramount when creating digital systems. Make your apps and websites easy to click through without expecting your customer, your employee, or whoever your end user is to think hard about it. Consider the daily, digitally immersed, increasingly trying lives we all lead. And don’t wait for a catastrophe to make your design more user-friendly; we have enough to worry about already.

Earlier this month, residents across the entire state of Hawaii received an alert informing them that a ballistic missile strike was imminent. “SEEK SHELTER IMMEDIATELY. THIS IS NOT A DRILL,” the warning concluded.

While residents scrambled to find shelter, some abandoning their cars on the road, state employees frantically attempted to fix their mistake. Thirty-eight minutes later, follow-up messages were dispatched revealing the warning to be an error.

Though initial reports revealed that an employee had pressed the “wrong button,” this was more than a simple mistake. It was the failure of a design system, or rather a failure to implement an effective one. A screenshot of the system eventually surfaced, revealing an eyesore of a drop-down menu. Vibrant blue hyperlinks in plain text listed the following:
 

Amber Alert (CAE) - Kauai County Only
Amber Alert (CAE) Statewide
1. TEST Message
PACOM (CDW) - STATE ONLY
Tsunami Warning (CEM) - STATE ONLY
DRILL - PACOM (CDW) - STATE ONLY
Landslide - Hanna Road Closure
Amber Alert DEMO TEST
High Surf Warning North Shores

 

The state employee had been told to test the PACOM (CDW) system. Rather than selecting the sixth option, the one beginning with DRILL, the employee initiated the fourth option, sending the entire state into a panic.
 

How not to make mistakes

Though the mistake was simple, the real problem was how easy it was to make and how difficult it was to fix once it was in motion. There was little organization to the drop-down menu: tests and drills were mixed in with real warnings. And there was no option to cancel an alert once it had gone out, which is why the state spent the better part of an hour trying to inform people of the mistake.
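
To make the grouping problem concrete, here is a minimal sketch of how such a menu could be modeled. The names, fields, and option list are hypothetical and are not taken from Hawaii’s actual software; the point is simply that tagging each entry with an explicit mode lets the interface present drills and live alerts as separate groups, and that retraction becomes a first-class property of an option rather than an afterthought.

// Hypothetical data model: every menu entry declares whether it is a drill or a
// live alert, so the interface can keep the two visually and logically separate.
type AlertMode = "drill" | "live";

interface AlertOption {
  id: string;
  label: string;
  mode: AlertMode;
  retractable: boolean; // can a follow-up "false alarm" message be sent for this option?
}

const options: AlertOption[] = [
  { id: "pacom-cdw-drill", label: "DRILL - PACOM (CDW) - STATE ONLY", mode: "drill", retractable: true },
  { id: "pacom-cdw-live", label: "PACOM (CDW) - STATE ONLY", mode: "live", retractable: true },
];

// Render drills and live alerts as separate, clearly labeled groups
// instead of one flat list of look-alike hyperlinks.
const drills = options.filter((o) => o.mode === "drill");
const liveAlerts = options.filter((o) => o.mode === "live");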

And it really was a simple mistake. One employee. One click. No verification. No “are you sure you want to initiate this process?” message. Think about how many times you’ve accidentally clicked a popup, or sent an email before you were done writing it. Should the employee have been more careful? Yeah. But should an emergency warning system be designed so that a single careless click can reach an entire state unchecked? No.
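
A verification step would not need to be elaborate, and its friction can scale with the stakes of the action. Here is a rough sketch, again with hypothetical names and assuming a browser-based console rather than the state’s actual software: a drill gets a plain yes/no prompt, while a live statewide alert requires the operator to type an explicit phrase before anything goes out.

// Hypothetical confirmation gate: the amount of friction scales with the consequences.
// (Types are redeclared here so this sketch stands on its own.)
type AlertMode = "drill" | "live";

interface AlertOption {
  label: string;
  mode: AlertMode;
}

function confirmAndSend(option: AlertOption): boolean {
  if (option.mode === "drill") {
    // Low stakes: a plain yes/no confirmation is enough.
    return window.confirm(`Run DRILL: ${option.label}?`);
  }
  // High stakes: require a deliberate, typed acknowledgment before a live alert is sent.
  const typed = window.prompt(
    `You are about to send a LIVE alert to every device in the state:\n` +
      `${option.label}\n\nType SEND LIVE ALERT to continue.`
  );
  return typed === "SEND LIVE ALERT";
}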

So as news outlets and government officials scolded the state for its mistake, those of us who live our lives thinking about how human beings interact with technology simply shook our heads. We see these kinds of frustratingly archaic systems every day. And they need to be fixed, especially as we delegate more complex functions to simple digital tools.

Clearly, the system in place in Hawaii was not designed with usability in mind. It’s a simple menu of options, yet one that still makes the user think too hard. The increased cognitive load stemming from poor design choices puts a real strain on the people who use digital products.
 

An app for poor choices

An insightful blog post from all the way back in 2013 speaks to how prevalent, and how human, these design choices are. Everything we do takes cognitive effort. Make something hard to decipher, and you’re going to cause people to make bad choices. The classic example, referenced in the blog post, is a 1999 experiment that asked some participants to memorize a seven-digit number and others to memorize a three-digit number. After a period of time, participants were offered a choice between two snacks: chocolate cake and fruit. Those who had been asked to memorize the longer number tended to make the “negative” choice: they chose the cake.

The conclusion of this study, and of many subsequent ones, is that complex mental tasks take real energy. Pile enough decisions onto your brain, and you’re more likely to make choices that are worse for you, or just flat-out mistakes. Make your software complex and frustrating, and your users are going to make bad decisions.

We have come a long way since 1999, even since 2013, but human cognitive overload is more prevalent than ever. We’re inundated with social media and gamified apps, and we’re always connected to the internet. The news cycle doesn’t stop anymore. Everything is happening now. Even as app design becomes more human-oriented and user research becomes more refined, there’s a heck of a lot of digital water to wade through. And it can put us off our game.
 

Too little, too late

Back in Hawaii, the software that led to the incident is still there. It’s still a plain-text list of hyperlinks. It still looks like a Geocities site. They added a ‘False Alarm’ option at the top of the list, but did they actually make it any more usable or reduce the overall cognitive load? Nope, not by much.

An update - 2/1/2018

It has come to our attention that the employee who initiated the warning in Hawaii actually did so intentionally: he misheard the order to run the drill as an order to send the real warning. We were also made aware that the warning system does, in fact, include a message asking the user whether they intended to select the option they are about to initiate. Despite the intentionality of the decision and the safeguards in the software, the fact that such an important order could be so easily misinterpreted compels us to think that the system could be made much more user-friendly to prevent mistakes like this in the future.