In WWII, Allied bombers were key to strategic attacks, yet these lumbering giants were constantly shot down over enemy territory. The planes needed more armor, but armor is heavy, so extra plating could only go where the planes were being hit the most.
A man named Abraham Wald, a Hungarian-born Jewish mathematician who had been locked out of university positions and ultimately fled Nazi persecution in Europe, was brought in to oversee the operation. He started with a simple diagram, the outline of a plane, and marked bullet holes corresponding to where each returning bomber had been shot. The result was an anatomy of common plane damage. The wings, nose, and tail were blackened with bullet holes, so those were the spots that needed more armor.
Or at least that’s what people thought, until Wald flipped conventional logic on its head. He said the military didn’t need to reinforce the spots that had bullet holes. They needed to reinforce the spots that didn’t have bullet holes.
Because the planes that had been shot in those bullet-free zones never made it home to be counted. A bomber shot through the wing could still limp back and land on Wald's diagram. One shot through the cockpit couldn't.
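The inference can be sketched with a toy simulation. Every number below is invented for illustration (the section names, survival rates, and hit counts are assumptions, not figures from Wald's work): hits land uniformly across the airframe, but planes hit in vulnerable spots rarely return, so the holes counted on survivors cluster in the survivable zones.

```python
import random

random.seed(0)

SECTIONS = ["wings", "nose", "tail", "engines", "cockpit"]
# Hypothetical per-hit survival rates: wings shrug off damage, cockpits don't.
SURVIVAL = {"wings": 0.95, "nose": 0.9, "tail": 0.9, "engines": 0.4, "cockpit": 0.3}

observed = {s: 0 for s in SECTIONS}   # holes counted on returning planes only
true_hits = {s: 0 for s in SECTIONS}  # holes on every plane, lost or not

for _ in range(10_000):
    # Each bomber takes 1-5 hits, spread uniformly across the airframe.
    hits = [random.choice(SECTIONS) for _ in range(random.randint(1, 5))]
    for h in hits:
        true_hits[h] += 1
    # The plane comes home only if it survives every hit it took.
    if all(random.random() < SURVIVAL[h] for h in hits):
        for h in hits:
            observed[h] += 1

# True hits are roughly equal per section, but the surviving sample
# drastically under-counts engine and cockpit hits: those planes sank
# out of the dataset before anyone could chalk their outlines.
for s in SECTIONS:
    print(f"{s:8s} true={true_hits[s]:6d} observed={observed[s]:6d}")
```

Reading the `observed` column alone, wings look like the danger zone; comparing it against `true_hits` reveals the missing planes, which is exactly the gap Wald reasoned his way across.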
Wald’s paper on the topic works through the problem with groundbreaking statistics, but the eureka moment, inferring that the results revealed the opposite of the obvious conclusion, is something we can all appreciate. Maybe that’s why the Wald story was first told to us by Nate Bolt, research manager at Facebook, who’d himself first heard the story at the DCOG HCI Lab at UC San Diego.
“When they taught it to me, my mind was fucking blown,” Bolt says. “You can’t armor the bombers everywhere or they can’t fly. Right off the bat it’s a design problem a lot of people can relate to.”
For Bolt, then an undergrad, the story sank in immediately. Now, he uses it as an occasional source of grounding, a sort of parable that balances looking at both the small and big pictures of data.
“Everyone’s done data collection where you gather all the metrics, and it’s so easy to make bad inferences with data,” he explains. “The thing I apply most out of it is … there’s a creative part of understanding quantitative data that requires a sort of artistic or creative approach to research. Design research, especially, isn’t a sterile laboratory.”
Bolt approaches Facebook’s research by mixing big data and individual use cases. On one hand, the team has all the anonymized trend-tracking information you could imagine. On the other, it arranges opt-in, real-time trials in which real users are paid to share their Facebook browsing experiences remotely.
“It’s really common that we’ll have a hypothesis about why people are doing something, a behavior we might have noticed statistically, that when we observe people one on one, may or may not hold up,” he says. “Understanding the difference between what we see in [big] data and one on one is always important.
“It’s easy to get stuck in one side or the other. If you do a lot of exploratory behavioral observation of humans, you can get stuck in that method; if you only do a lot of big system analysis you can lose the human touch.”
For Bolt, it’s a “constant circle” of questioning old hypotheses, an ongoing challenge to make sure that Facebook’s decisions are accounting for every last bullet hole–even the ones the data doesn’t immediately see.
“Facebook could always be better,” Bolt admits. “Any design or interface we’re working on, we want to be collecting data on it, inspiring people internally to be able to create better versions of it. And research is one of the fundamental ways that happens.”