
Don’t Let Your Kids Use Apps

It’s not even about screen time anymore. It’s about privacy.

[Photo: Anucha Sirivisansuwan/Getty Images]

That harmless-looking kids’ app, with the cute cartoon characters and funny noises, is really a monster in disguise. That’s according to a new study of 5,855 children’s apps in the Google Play store, which found that most of them collected personally identifiable data and transmitted it over the internet. From there, that data could be shared with advertisers; it is not erasable, and it could follow a child as a targeted profile into their teens and adulthood.


The abuses weren’t just by nameless developers, either. Even an app from Disney, named Where’s My Water? Free, transmitted router information that could help identify a minor.

(Disney has since responded, stating that its data transmission does not violate the law, and that it has “strict data collection and use policies for Disney apps.”)

[App Icon: Disney]

How is this possible? Don’t we have a law to protect our children’s privacy? We do. It’s called Coppa, and it mandates commonsense rules: apps need parental permission before collecting a child’s data, and whoever collects that data cannot share it with anyone else.

Coppa is far from perfect, but developers simply ignored its rules or didn’t work hard enough to comply with them in the first place. And Google didn’t bother to properly police those developers.

Until these privacy issues are sorted out, the most obvious answer might be the hardest for many parents, myself included, to consider: Don’t let your kids use apps. Full stop.

The Invisible Problem With Screen Time

I’ve written about design and technology for 13 years, and it’s been remarkable how quickly the conversation around children and screen time has shifted. Apple launched the iPhone and iPad in 2007 and 2010, respectively, tantalizing parents and kids alike with the glow of the screen. OMG, kids learned how to use tablets faster than adults! Who would have thought! Soon, parents were uploading videos of their kids tapping on TVs as if the screens were touch-sensitive, and smartphones and tablets became essential salves for screaming car trips and airplane rides.


Then we began to worry about screen time. As we watched our own adult relationships suffer, rather than flourish, because of screens, we thought the same might be true for our children. In 1999, the American Academy of Pediatrics adopted a no-screens guideline for children under two years of age. The working assumption: Physical play and human interaction are better for brain development than digital replacements. But by 2015, the AAP threw an informal huddle and declared that iPads were probably fine for infant use, before releasing amended “kinda fine . . . but still wait until about 18 months” guidelines in 2016. Either way, the problem was addressed at last.

In the wake of Facebook’s Cambridge Analytica scandal, however, we’re realizing another, less visible consequence of screen time: a loss of privacy. When your fingers touch a screen, your privacy leaves your hands, period. Even if you trust an app, if that app connects to an unscrupulous third party (by adding extra analytics or advertising software via an SDK), your information can end up anywhere, used for any purpose, in perpetuity.

[Photo: Halfpoint/iStock]

What’s frustrating is that we have laws for this; they’re just not working. Coppa ultimately depends on apps to police themselves: developers have the option to submit their own code for FTC privacy-compliance testing 180 days before sending an app to market. As researchers found, that’s just not happening. Part of the problem is that while Coppa penalties are serious–up to $41,484 per violation (and yes, that scales all the way down to the individual user, meaning that even small infractions across large user bases can cost billions)–the FTC has a poor track record of enforcing them.

Without real penalties, why should we be so naive as to assume developers will self-police? If we can’t count on the world’s most valuable companies, like Facebook or Google, to protect our data, why should we think a free-to-play quiz app from a nameless developer would burn six months on certification to protect privacy any better?

Google does have a Designed for Families program that is said to be mandatory for apps that are “primarily child-directed” (yet, confusingly, not mandatory for apps aimed at “kids and families”). The program certifies Coppa-compliant apps. But that doesn’t change the fact that, according to this study, most children’s apps in Google’s own store are collecting data that puts children’s privacy at risk.

We asked Google’s press office what happened here and were offered this stock response:

“We’re taking the researchers’ report very seriously and looking into their findings. Protecting kids and families is a top priority, and our Designed for Families program requires developers to abide by specific requirements above and beyond our standard Google Play policies. If we determine that an app violates our policies, we will take action. We always appreciate the research community’s work to help make the Android ecosystem safer.”
Google declined to answer any of our follow-up questions on the topic.

Though to be clear, this isn’t just about a majority of children’s apps in the Google Play store. Facebook, too, has a poor track record with privacy, which becomes more worrisome when you look at its addictive children’s app. The problem also extends to children’s smartwatches, which are sold to parents for safety and security but have the same lax privacy standards as bad apps.

It’s about Google/YouTube’s failure to police children’s content, and its propensity to recommend more and more extreme videos to impressionable minds. It’s about the hypocritical enablers, too, like Apple, which routinely brags about its protection of your privacy but still has all sorts of strong-arming permissions that let its own app developers collect more data than they should. Like Google, Apple helps advertisers target you with a unique Advertiser Identifier on your phone. Apple also uses its Shazam technology to help third-party advertisers identify what show you’re watching on your TV by listening through certain apps that, yes, the New York Times found were available even in Apple’s sacred App Store.

It’s a sad state of affairs when the best corporate attempt to protect privacy in the last three years is a Samsung phone that just doesn’t connect to the internet.

The bottom line is that no one in the tech industry is free from blame on the issue of privacy. No one in the U.S. government seems to have a fix (though the sweeping privacy legislation that Europe plans to roll out next month, known as GDPR, seems promising). And until all parties step up, the most prudent option for parents is simply to keep our kids away from apps. PBS cartoons playing over an antenna have never seemed so comforting.

This article has been updated with a comment from Disney, along with a clarification about the scope of its data collection.


About the author

Mark Wilson is a senior writer at Fast Company. He started Philanthroper.com, a simple way to give back every day.
