3 UX Mistakes That Make Sites More Hackable

Drew Davidson of ÄKTA points out simple design improvements companies can make to prevent security mishaps.


Did you know that the URL bar in your browser is a potential security hole? I didn’t either. I barely look at the thing unless I’m punching in a search term. But according to Drew Davidson, vice president of design at ÄKTA, that thin strip of UI chrome is a little keyhole that a hacker can use to infiltrate a company’s website.


As Charles Eames famously said, “The details are not the details. They make the design.” Here are three subtle mistakes your company might be making in user-experience design that open you up to a breach.

1. The security features of your UI are a pain in the ass.

Wait a minute–aren’t fancy security measures like two-step verification all the rage now? (Just ask Google and Dropbox.) The counterintuitive truth, says Davidson, is that the trickier you make your site’s interface–even for a good cause, like protecting the user’s data–the more likely your user is to actively undermine it.

“Security policies that introduce too many steps are not effective,” Davidson explains, “because people will tend to do something imprudent–like setting a basic password–in order to make navigating the UI easier.”

Davidson cites a file-storage company (which he can’t name) as an example: “There’s literally 25 steps to go through before you can create an account.” This might make some sense if the company’s customers were only uploading sensitive information like medical records or Social Security numbers. But in reality, most of the users are just “using the software for Dropbox-like functionality, like storing resumes and photos,” Davidson says. The inappropriately Fort Knox-like UI design backfires as users cope by making their own data even less secure. It’s lose-lose.


2. Your user interface is full of peepholes into your backend systems.

Here’s where that URL bar can become a problem. “When you’re in a checkout process, many sites use different vendors to power that process,” Davidson says. “You can see the URL changing as you click through the checkout, and it can tell a hacker exactly which systems you’re using for which parts of your process, so they can infiltrate it that way.”
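One way a team might act on Davidson’s observation is to keep third-party checkout steps behind neutral paths on its own domain, so the visible URL never names the vendor. Here is a minimal sketch of that idea; the paths, vendor names, and URLs are all hypothetical, and a real site would do this with a reverse proxy rather than a lookup table.

```python
# Hypothetical mapping from neutral, customer-facing paths to the
# services that actually handle each checkout step. The vendor names
# and hostnames below are invented for illustration.
VENDOR_ROUTES = {
    "/checkout/cart":    "https://cart-service.internal.example/api/v1",
    "/checkout/payment": "https://payments-vendor.example/gateway",
    "/checkout/confirm": "https://order-service.internal.example/confirm",
}

def resolve_backend(public_path: str) -> str:
    """Return the internal endpoint for a public checkout path.

    The shopper only ever sees the neutral public path in the URL bar;
    which vendor powers each step stays server-side.
    """
    try:
        return VENDOR_ROUTES[public_path]
    except KeyError:
        raise ValueError(f"unknown checkout step: {public_path}")
```

The point of the indirection is simply that nothing in `/checkout/payment` tells an onlooker which payment processor sits behind it.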

Vendor names, software libraries, and even file and folder structures can be left hanging out in the open accidentally. Davidson says that this was how Edward Snowden got his hands on NSA files he wasn’t supposed to be able to access. The NSA’s software interface showed him exactly where to look for sensitive materials, even though he didn’t have access to actually open them. Armed with that information, Snowden was able to use the command line as a “back door.” The UI design technically prevented him from walking in the front door, but certainly helped him case the joint.
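The listing problem Davidson describes can be sketched in a few lines: instead of showing every file (greyed out or not), show only what the current user may actually open. This is my illustration of the principle, not how any NSA system works; the filenames, users, and ACL table are all made up.

```python
# Hypothetical access-control list: each filename maps to the set of
# users allowed to open it. All names here are invented.
ACL = {
    "staff_handbook.pdf":  {"alice", "bob", "carol"},
    "budget_2024.xlsx":    {"alice"},
    "incident_reports.db": {"alice", "bob"},
}

def listing_for(user: str) -> list[str]:
    """Return only the entries this user may open.

    A full directory listing advertises exactly which files exist and
    where, even to users who can't open them -- it lets them case the
    joint. Filtering the listing to accessible entries doesn't.
    """
    return sorted(name for name, readers in ACL.items() if user in readers)
```

With this scheme, a user like "carol" never learns that a budget spreadsheet exists at all, rather than merely being refused when she clicks it.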

3. No one at your company really knows how to use your backend software.

Why is it that Medium, Instagram, and Tumblr can make complicated functionality feel effortless, but most enterprise software makes even the simplest manipulations feel like torture? Davidson says that the simplest thing a company can do to make its software secure is to ensure that its employees know how to use it.

“Things like the role of administrators, making sure there’s a permissions system in place that is robust and alerts you when someone’s doing something they’re not supposed to be doing–almost all of these systems are extremely clunky and hard to use,” Davidson says. “It’s not clear who has access to what, and when, and for how long. It’s totally a UI problem: all the security engineering in the world isn’t going to prevent someone from checking the wrong box if it’s not clear to them what they’re doing.”
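The kind of system Davidson is gesturing at can be boiled down to two pieces: a clear role-to-permission table, and a denial path that makes noise. A minimal sketch, with invented roles and actions (a real system would page an administrator rather than just log):

```python
import logging

# Hypothetical role -> allowed-actions table; the names are illustrative.
PERMISSIONS = {
    "admin":  {"read", "write", "grant_access"},
    "editor": {"read", "write"},
    "viewer": {"read"},
}

log = logging.getLogger("access")

def attempt(user: str, role: str, action: str) -> bool:
    """Allow or deny an action, and make denials loud.

    The issue raised above is less the check itself than visibility:
    someone should be told when an out-of-policy action is attempted,
    not discover it in an audit months later.
    """
    allowed = action in PERMISSIONS.get(role, set())
    if not allowed:
        # In production this would trigger an alert, not just a log line.
        log.warning("DENIED: %s (%s) attempted %r", user, role, action)
    return allowed
```

Unknown roles deny by default, which is the "checking the wrong box" safeguard: an ambiguous configuration fails closed instead of open.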


Implementing these changes might be easier said than done, but each one acknowledges that security is a “people problem,” not just a technical one. Designing tools that let the people we trust with our data actually do their jobs–and don’t compel us to do them poorly ourselves–should be the starting point, not an afterthought. If hackers want in, they will almost surely find a way. But we don’t have to invite them in.

About the author

John Pavlus is a writer and filmmaker focusing on science, tech, and design topics. His writing has appeared in Wired, New York, Scientific American, Technology Review, BBC Future, and other outlets.