Before any more laws aimed at protecting young people’s online privacy get passed, I wish lawmakers could spend more time with kids using social media – kids of both sexes and various ages, if only their own children or grandchildren – and less time reacting to constituents’ concerns and news reports about kids in social media. Certainly not all they’d observe and discuss with kids would be positive, but they’d get a much more accurate picture of what they’re trying to protect and regulate, and we might get better (and fewer) laws.
I say “fewer” because – here at the beginning of a whole new, more user-driven media environment, where regulatory power is becoming more distributed and concerns about the impacts are naturally running high – there’s too much pressure to regulate reactively. Might lawmakers slow down and learn more about the realities of this new media environment and its users? Of course the legislation is well-intended; the sponsors of, for example, California’s just-passed “eraser button” law and the federal Do Not Track Kids (DNTK) Act just reintroduced on Capitol Hill want kids to be able to take back content they’ve posted and don’t want advertisers to exploit minors’ data. But the state law and the proposed federal law don’t reflect an understanding of today’s media and are just as likely to jeopardize our children’s privacy and other rights as to protect them. Here’s a sampler of the problems with these laws:
- Collecting more, not less, kid data. By requiring sites and apps to protect kids of certain ages and, in the case of the California law, determine where they live, the legislation means services will have to capture more data on young users than ever, actually reducing their privacy and increasing the potential of exposure to identity theft and other crimes.
- Not addressing bullying. For constitutional reasons, neither law requires social media services to delete unwanted content that other people post about a minor. The young people these laws aim to protect can only delete (or request deletion of) the content they themselves post. And yet backers of the California law say it protects users from cyberbullying and other forms of harassment. It doesn’t; in fact, one unintended consequence of a law that doesn’t allow a targeted person to delete harassing content is that it does allow harassers to delete the evidence of their own harassing behavior.
- Nasty content can still go viral. Because the legislation can only help users delete their own original content – not any reposts or copies posted by others in the same or other sites – it can’t help stop something from getting passed along or going viral, which is part of what makes cyberbullying cyberbullying.
- Making compliance (and protection) harder. Both the new state law and the proposed federal one raise the age level in the Children’s Online Privacy Protection Act, but differently – the federal law adding ages 13 through 15 and the California one ages 13 through 17 – creating two more protected groups, protected differently from each other and from the original one (kids under 13). All by itself, COPPA created the unintended consequence of millions of kids under 13 lying about their age so they could use social media services (most with their parents either helping or looking the other way – see this). So does adding more complexity really protect kids better? How should services directed at 13-, 14- and 15-year-olds be different from those directed at people who are 16 or 17?
- Other age-related questions: Should a US law be based, for example, on the claim that all people under the age of 16 “lack the cognitive ability to distinguish advertising from program content and to understand that the purpose of advertising is to persuade them,” as the DNTK Act claims? [An award-winning teacher told me once that “nothing is too sophisticated for a 12-year-old mind.”] Laws aimed at protecting children seem less focused than laws aimed at protecting everybody on striking a balance between protecting the subjects’ privacy and protecting their rights. Shouldn’t laws meant to protect children factor in their rights of free expression, participation and association? Should such laws have a chilling effect on the exercise of those rights, as well as on the media properties that enable children to exercise them?
- Age of deletion unclear. The California law doesn’t make clear whether the people it’s intended to protect can delete their content only while they’re still minors or anytime in their lives, such as when they’re, say, 45 and running for office (some people call erasing information from the public domain “censorship,” others “revisionism” – see this).
- The don’t-ask-don’t-protect effect. The California law addresses only sites “directed at” people under 18 or sites that knowingly have minors on them, which gives social media sites and services an incentive either to ignore users’ ages or to bar minors altogether. If they don’t ask for ages (to avoid liability for having actual knowledge of minors on their sites), they can’t provide special protections for minors. If they bar minors, minors are very likely to find workarounds or lie about their ages, just as they have on sites that bar people under 13 (see this about a study on the increased risk such workarounds can spell for minors).
- Who’s deleting what from where? There are lots of data layers that the laws don’t address – e.g., the data on servers scattered around the country and globe, what appears on Web pages and what turns up in search engines. Although an embarrassing post “will, theoretically, be deleted from the page, there are no stipulations [in the California law] requiring deletion of the actual data on the servers … [which] may or may not be in California,” the tech news site Gizmodo reports. “Web sites with users in California won’t necessarily have their servers based in the state.” As for search engines, a request that they delete a person’s content only means that the links in Google, Bing, etc. will be deleted, which makes the content harder to find but doesn’t delete it (see the first sketch after this list).
- Very little additional control. These laws may give users a little more control of their data in some sites (which don’t already provide for deletion and which come under state or US law), but not on the global Internet. “Eraser button regulations may change user experience for some users in a way that makes them feel they have more control over information they share, but they are far from effective at truly ‘erasing’ information from the Internet,” says the Center for Democracy & Technology in Washington.
- False sense of security. By suggesting that they give kids more control over their content, these laws give kids the impression that they don’t have to think before posting or manage their reputation online. They give parents the impression that their kids will be less vulnerable or have greater control over their content in social media. And they suggest that top-down solutions can control bottom-up (user-driven) media, easing users’ sense of responsibility at a time when regulation is increasingly distributed and self-regulation is needed more than ever.
- State laws/borderless medium. Of course some of the backers of the California law want it to be a model for other states and/or federal law, but the Internet is a medium using data that flows across all kinds of borders, between servers in countries all over the world. If legislation is a solution, it needs to reflect that reality. According to the Center for Democracy & Technology, the DNTK Act does embrace some international privacy protection principles but only for 13-to-15-year-olds. Why only that small age group?
- Disappearing kid-friendly content. Like any law that increases the cost of starting and operating services for minors, such as COPPA, these laws have a chilling effect on creating great services for children, which reduces the number of high-quality options for children provided by legitimate, law-abiding businesses. Regulation like COPPA, the proposed Do Not Track Kids Act and the new California law basically penalizes conscientious businesses while putting no dent in the number of noncompliant services kids can go to, whether operating illegally in the US or legally but less responsibly outside the US.
- Greater restriction, more workarounds. Laws that lead to fewer options and more restrictions for kids in legitimate services increase the possibility of kid-created workarounds and the likelihood of young people going “underground” (to lower-quality, potentially riskier sites and services), where there’s no one moderating or guiding what’s going on.
- Protections already in place. Calling for legislation like this suggests that protections aren’t already in place. Responsible social media services – the most popular ones (and the ones held most accountable by the public) – already allow users to “erase” their comments and accounts (Facebook, for example, allows users to delete any post or photo they’ve ever made, and see this resource at Google.com). Web browsers allow users to turn off tracking (see the second sketch after this list). And of course there are privacy settings in services and devices and help sections on how to set them. It would be better to have laws aimed at making these protections consistent across all social media services, platforms and countries, and for users of all ages.
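To make the delink-vs.-delete point from the “Who’s deleting what from where?” item above concrete, here’s a minimal sketch in Python. Everything in it – the origin_servers and search_index structures and the delink() helper – is a hypothetical stand-in for illustration, not any real search engine’s internals:

```python
# Hypothetical stand-ins: content lives on origin servers, which may or
# may not be in California...
origin_servers = {
    "server-42.example.com": {"/posts/991": "an embarrassing post"},
}

# ...while a search engine keeps only links (pointers) to that content.
search_index = {
    "embarrassing post": ["server-42.example.com/posts/991"],
}

def delink(query: str) -> None:
    """Honor a removal request the way a search engine does: drop the
    index entry so the content is harder to find, leaving the data
    itself untouched on whatever server holds it."""
    search_index.pop(query, None)

delink("embarrassing post")

print(search_index.get("embarrassing post"))                  # None: the link is gone
print(origin_servers["server-42.example.com"]["/posts/991"])  # the content is still there
```

Deleting the data itself would take action by whoever operates that origin server – which is exactly the jurisdictional gap Gizmodo describes.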
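And here’s what the browser-side “turn off tracking” signal mentioned in the last item looks like to a site: browsers with Do Not Track enabled send a “DNT: 1” header with each request. The header is real; the should_track() helper below is a hypothetical sketch of a site that chooses to honor it, since honoring the signal is voluntary on the site’s part:

```python
def should_track(request_headers: dict) -> bool:
    """Return False when the visitor's browser asks not to be tracked,
    i.e. when it sent the Do Not Track header ("DNT: 1")."""
    return request_headers.get("DNT") != "1"

# A browser with Do Not Track turned on:
print(should_track({"DNT": "1", "User-Agent": "ExampleBrowser/1.0"}))  # False
# A browser that expresses no preference:
print(should_track({"User-Agent": "ExampleBrowser/1.0"}))              # True
```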
So add up the misdirected requirements, ambiguous language, increased collection of kids’ data and unintended consequences in these pieces of legislation, along with the chilling effect they have on kid-serving businesses, and what we get is laws that could penalize kids more than protect them.
Legislation that protected consumers in the mass-media era in which most lawmakers grew up can be extremely problematic for users in today’s media environment – where content is the content of people’s lives, intertwined with other people’s lives, changing in real time and flowing across all possible borders. Regulation aimed at protecting the youngest users of today’s media has to carefully, thoughtfully factor in their use of it at least as much as their elders’ concerns about their use of it. But start with what the Center for Democracy & Technology suggests: Consider how much more workable online privacy laws would be for young people too if aimed at protecting everybody’s privacy, not just kids’.
Related links
- “Do Not Track Kids Bill Revives Minors’ Online Privacy Debate” in the Center for Democracy & Technology blog (November 2013)
- “California Eraser Button Law Passes,” by Adam Thierer at George Mason University’s Mercatus Center (September 2013)
- “Kids, Privacy, Free Speech & the Internet: Finding the Right Balance,” a working paper by Adam Thierer (August 2011)
- “Senator Markey’s Do Not Track Kids Act of 2013 Raises the Question: What’s the Point of COPPA?” by Ben Sperry, associate director of the International Center for Law & Economics, in the TruthontheMarket blog
- Lawmakers, you may find the title of this about-to-be-released, must-read book comforting: It’s Complicated: The Social Lives of Networked Teens, by danah boyd. If you don’t have any young social media users to talk with anywhere in your lives (hard to imagine), you’ll learn a lot through a careful read of that book, hopefully the minute it becomes available (February 25, 2014). Senator Markey (a sponsor of the DNTK Act), the first stops on danah’s book tour are in your home state and Washington, D.C.
- About a national task force’s 2010 report to Congress, “Youth Safety on a Living Internet” (and why we called it that)
- “French Archivists Say No To Proposed ‘Right To Be Forgotten’” (June 2013)
- About an Argentinean pop star, not kids, but illustrating what concerns archivists, historians and others about allowing people to delete parts of the public record: In both the Stanford Law Review and The Atlantic, legal scholar Jeffrey Rosen highlights the story of pop star Virginia Da Cunha, who, on finding that racy photos she’d posed for as a young person were turning up in Web searches, sued Google and Yahoo to get them “taken down.” “An Argentinean judge, invoking a version of ‘the right to be forgotten,’ sided with Da Cunha, fined Google and Yahoo, and ordered them to delink all sites with racy pictures that included her name.” Because of the complexity of that task, Yahoo simply deleted all links to anything referring to Da Cunha from its Argentinean search site. Further down, Rosen wrote, “The right to be forgotten [in Europe] gives people the right to demand the removal of embarrassing information that others post about them, regardless of its source, unless Google or Facebook can prove to a European regulator that the information is part of a legitimate journalistic, literary, or artistic exercise. This would transform Facebook and Google from neutral platforms into global censors and would clash directly with the principle, embedded in U.S. free-speech law, that people can’t be restricted from publishing embarrassing but truthful information. As a result, the right to be forgotten may precipitate the Internet Age’s most dramatic conflict between European conceptions of privacy and American conceptions of free speech…. However the international legal battles are resolved, the impulse to escape your past on the Internet is an understandable one.”
- On whether the Internet can be “a space where we’re free to analyze the data but not free to abuse the data” in a February 2011 article in The Atlantic by John Hendel: “In Europe a right to be forgotten trumps the memory of the Internet”
- “The Right to Be Forgotten Across the Pond,” by Meg Leta Ambrose and Jef Ausloos in the Journal of Information Policy (2013)
- “COPPA has likely increased minors’ risk: Study” in NetFamilyNews (November 2012)
- “The ‘minimum age’ & other unintended consequences of COPPA” in NetFamilyNews (September 2012)
- “What does ‘safe’ really look like in a digital age?” in NetFamilyNews (September 2012)
- “Cyberbullying: The view from behind a kids’ Web site” in NetFamilyNews (October 2011)
- ConnectSafely’s Tips for Getting Cached Content Removed