"The modern man who tattoos himself is either a criminal or a degenerate." So wrote the Austrian modernist architect Adolf Loos, linking tattoos to crime in the 1910s. (The solution, according to Loos? Clean, pure modernism.) In the century since, tattoos have shaken off a good deal of that wrongheaded stigma. According to Pew, one in four millennials has at least one tattoo.
Oddly enough, tattoos are now playing a significant role in modern law enforcement—thanks largely to the advent of computer vision. This month, an investigation by the Electronic Frontier Foundation revealed that the FBI is investing in research into systems that can identify and semantically analyze tattoos en masse.
The purpose of this technology? To create a tool for the police that can not only recognize tattoos but analyze their meaning to create a network of contextual links between potentially millions of tattoos. One commonly cited scenario for how it might be used: If a surveillance camera didn't record a perpetrator's face, but it did catch a part of his or her tattoo, this technology could theoretically link that tattoo to a list of possible suspects. Or, investigators could use it to search for other examples and context about a particularly inscrutable tattoo.
But questions about whether such a database impinges upon freedom of speech—and the rights of the prisoners whose photos populate it—are rife.
Beginning in 2014, the FBI worked with the National Institute of Standards and Technology (NIST) to create a competition and program called Tatt-C, built around a huge database of photos containing 15,000 tattoos drawn from prisoners all over the country. Plenty of law enforcement agencies have kept track of tattoos and cataloged their meanings (Russian prison tattoos, for example), but Tatt-C was different because it aspired to use tattoos as genuine biometric data through computer vision, which can process huge amounts of image data for patterns that would take human investigators months to parse.
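Matching tattoos at this scale depends on reducing each image to a compact numerical fingerprint that machines can compare quickly. The sketch below is a toy illustration in pure NumPy—not NIST's or any vendor's actual algorithm—using a simple "average hash": downsample the image into a grid, threshold each cell against the mean, and compare the resulting bit strings. A noisy re-photograph of the same pattern lands much closer, in Hamming distance, than an unrelated image. All the names and data here are hypothetical.

```python
import numpy as np

def average_hash(image, hash_size=8):
    """Block-average a grayscale image down to hash_size x hash_size,
    then threshold each cell against the mean, yielding a compact
    binary fingerprint of the image's overall pattern."""
    h, w = image.shape
    # Crop so the image divides evenly into blocks, then average each block.
    cropped = image[: (h // hash_size) * hash_size, : (w // hash_size) * hash_size]
    small = cropped.reshape(
        hash_size, h // hash_size, hash_size, w // hash_size
    ).mean(axis=(1, 3))
    return (small > small.mean()).flatten()

def hamming_distance(a, b):
    """Number of differing bits; smaller means more similar images."""
    return int(np.count_nonzero(a != b))

# Synthetic stand-ins for tattoo photos: a base pattern, a slightly
# noisy re-photograph of it, and an unrelated pattern.
rng = np.random.default_rng(0)
base = rng.random((64, 64))
noisy_copy = np.clip(base + rng.normal(0, 0.05, base.shape), 0, 1)
unrelated = rng.random((64, 64))

d_same = hamming_distance(average_hash(base), average_hash(noisy_copy))
d_diff = hamming_distance(average_hash(base), average_hash(unrelated))
# The noisy copy differs in only a few bits; the unrelated image in ~half.
print(d_same, d_diff)
```

Production systems use far richer features (learned embeddings, keypoint descriptors), but the pipeline is the same in spirit: fingerprint once, then compare millions of fingerprints cheaply.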
A group of six private companies and research organizations used the Tatt-C database to develop algorithms that could accurately recognize and analyze the images. The winning team was MorphoTrak, a private biometrics company best known for developing the first automated fingerprint matching system in the 1970s. Its algorithm identified tattoos with 96.3% accuracy, recognizing the same tattoo over time as a prisoner ages—and matching a small piece of a tattoo to the greater whole. Other entrants developed systems that could group tattoos with similar semantic content, like matches among dozens of Tweety Bird tattoos, or praying hands holding rosaries.
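That partial-to-whole matching can be illustrated with the classic template-matching approach: slide the fragment across the full image and score each position by normalized cross-correlation. The following is a hypothetical pure-NumPy toy, not MorphoTrak's method, but it shows how a small crop—say, the sliver of a tattoo caught on CCTV—can pinpoint its place in a larger photo.

```python
import numpy as np

def match_template(image, patch):
    """Slide `patch` over `image` and return the top-left offset with
    the highest normalized cross-correlation score (max possible: 1.0)."""
    ih, iw = image.shape
    ph, pw = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-9)
    best_score, best_pos = -np.inf, None
    for y in range(ih - ph + 1):
        for x in range(iw - pw + 1):
            window = image[y:y + ph, x:x + pw]
            w = (window - window.mean()) / (window.std() + 1e-9)
            score = float((p * w).mean())  # correlation coefficient
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

rng = np.random.default_rng(1)
full_tattoo = rng.random((40, 40))          # stands in for a full tattoo photo
partial = full_tattoo[12:20, 25:33].copy()  # a cropped fragment of it
pos, score = match_template(full_tattoo, partial)
print(pos, round(score, 3))  # recovers the fragment's true offset, (12, 25)
```

Real matchers must also handle rotation, skin distortion, and lighting, which is why the Tatt-C entrants used more robust features—but the underlying question is the same: where in the whole does this piece fit best?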
It’s a startling example of how computer vision can be applied in the real world—not just as a novel way to sort through all your vacation photos, but as a way to create a living database that links millions of people together without their consent. It’s treading on totally new ground, as far as the rights of prisoners and freedom of speech are concerned. Here’s how the EFF explains it:
Imagine that you’ve been arrested or sentenced to prison, and law enforcement officers made you lift your shirt, roll up your sleeves, and pull up your pant legs to take photos of your tattoos. You’d be outraged to learn that those images were not only shared with scientific researchers without your consent, but handed out to a large number of third parties—including private companies—and possibly published online.
This research is irresponsible and is a serious threat to our privacy and First Amendment rights. Researchers experimented with tattoos that contained religious imagery and personal information, without thinking through the ethical and constitutional issues at stake. Not only that, NIST researchers failed to provide safeguards for the prisoners subject to the research, despite ethical rules requiring greater oversight.
The Tatt-C competition was just the beginning of the project. Next, the FBI and NIST plan to publish an API, and are asking developers working on tattoo recognition algorithms to get in touch during this next phase of development. The EFF, meanwhile, is calling for the project to be shut down, citing the unethical treatment of prisoners on the part of both the government and the researchers involved in the project.
Computer vision and privacy issues have been closely linked for years—after all, one of Google's earliest public-facing applications of the technology used it to parse house numbers in Street View. Today, it's analyzing much more than our addresses: It's analyzing our bodies.