A grabbier headline for this post might be “Screens are watching us back,” but that would be too much like the scary news headlines parents are already subjected to. More importantly, it wouldn’t do justice to all that this important new book – Datafied Childhoods, by Profs. Giovanna Mascheroni in Italy and Andra Siibak in Estonia – offers us. It provides….
- Preparation not only for handling the new tech phase in which we find ourselves (more on that in a moment) but also…
- A reality check on where we are with tech, parenting, digital literacy education and, maybe most importantly, how to think about them (for example, “To speak of ‘children’s internet culture’ … runs the risk of overlooking the diversity of children and the heterogeneity of their mobile and online practices.”)
- An update on youth digital practices and thinking that suggests the goal of teaching children about datafication should be their competency, not our control (e.g., instead of treating them as “data objects,” treat them as “data owners” and “partners in discussion about what data is collected, for whom and for what purposes,” so they can “exercise their agency” as data owners, the authors write).
- Background for designing digital literacy education that’s useful to children and respectful of their lived experience (e.g., “We adopt a non-media-centric yet child-centered approach that, in line with the developments in the sociology of childhood, recognizes children as active agents and interpreters of their own social worlds,” Mascheroni and Siibak write).
- Context: what the latest research shows us about kids’ lived experiences and digital practices, as well as their contexts – home, school and social circles.
Clearly, digital literacy education needs an upgrade – both in content and approach. On the content side, it still needs to teach children what they can control: what they can do to optimize their digital practices, privacy and safety. But it also needs to teach them about what they can’t control: how the technology works; what data, algorithms, machine learning, AI, etc. are; how life has become “datafied”; and what the implications of datafied life are for them.
Education professor Neil Selwyn at Monash University explained it well when he wrote last week in the Parenting for a Digital Future blog that “the most significant digital technologies during the 2020s, are likely not to be [those] ‘used by’ people. Instead [they] … are likely to be technologies that are ‘used on’ people” (emphases mine).
The second part of the upgrade – how we approach digital literacy and online safety education – is to make what we teach more child-centric than tech- or media-centric. To be sure it’s relevant and meaningful to its intended beneficiaries, it needs to incorporate their own practices and perspectives about digital media.
This new tech phase
So what does this new tech phase we’re moving into look like? I won’t go into any depth because that would be a book, not a blog post. So two paragraphs….
The new phase has five pieces to it: besides the two we hear about most – 1) Web3 and 2) the metaverse, which I’ll come back to in a moment – there are 3) unprecedented regulatory scrutiny and action around the world, 4) a growing public discussion about content moderation and the rights of both moderators and users and 5) a noticeable lack of education about the datafication of life on Earth and the roles that business and different types of governments play in it. At the moment, tech and user experience appear to be evolving on two parallel paths, each with its own alphabet soup of terms: Web3 (“the decentralized Web” or “dWeb,” which includes but is now broader than “defi,” or decentralized finance, “crypto” and things like “smart contracts” – see the toy sketch below – “tokens” and “NFTs”) and “the metaverse,” which goes by many different descriptions and analogies but usually involves “VR/AR/MR” (virtual, augmented and mixed reality) or the catchall “XR” (extended or cross reality). But they aren’t separate paths, actually. It’s more like they represent the tech and human ends of this equation: a more immersive and embodied experience of media at the human end and decentralizing technology at the tech end (though the decentralizing tech doesn’t necessarily spell decentralized power or governance, smart people at Data & Society are saying, so stay tuned).
All that points to ever more individualized, decentralized, diversified 1:1, 1:group and group:group interaction, transactions and spaces that build on the peer-to-peer revolution represented by Napster, BitTorrent and Skype (invented here in Estonia) some 20 years ago – even as it’s all getting more geographically chunked up into China’s Internet, Russia’s Internet and what might be called the free Internet that we’re talking about here. From a parent’s perspective, you might think: Discord meets Roblox, ever more immersive, embodied and ubiquitous. Or Minecraft meets Second Life, if you remember that virtual world where, 15+ years ago, people, “brands” and governments all over the real world were building a presence (see this interview with its founder, Philip Rosedale).
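Since “smart contract” is probably the most opaque term in that alphabet soup, here’s a toy sketch of the idea – hypothetical, and written in Python purely for readability (real smart contracts run on blockchains such as Ethereum and are typically written in languages like Solidity). The point is just that a smart contract is code that enforces an agreement’s terms itself, with no bank or platform in the middle:

```python
# Toy illustration only: a "smart contract" is code that enforces an
# agreement's terms automatically. Real ones run on a blockchain
# (e.g., Ethereum, usually written in Solidity); this Python class
# just sketches the concept of an escrow with no middleman.

class EscrowContract:
    def __init__(self, buyer: str, seller: str, amount: float):
        self.buyer = buyer
        self.seller = seller
        self.amount = amount
        self.confirmations = set()   # parties who have approved release
        self.released = False

    def confirm(self, party: str) -> None:
        """A party signals that the agreed-upon terms have been met."""
        if party not in (self.buyer, self.seller):
            raise ValueError("unknown party")
        self.confirmations.add(party)
        self._maybe_release()

    def _maybe_release(self) -> None:
        # The contract enforces its own terms: funds move only once
        # both parties have confirmed -- no bank or platform decides.
        if {self.buyer, self.seller} <= self.confirmations and not self.released:
            self.released = True
            print(f"Released {self.amount} from {self.buyer} to {self.seller}")


deal = EscrowContract("Ana", "Ben", 25.0)
deal.confirm("Ana")
deal.confirm("Ben")   # prints: Released 25.0 from Ana to Ben
```

That self-executing quality is what makes the decentralization possible – and it’s also what raises the governance and safety questions flagged above.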
Moving away from media-centric
Which points to why, more than ever, we need to take the child-centric, not media-centric, approach Mascheroni and Siibak model. It also explains why there is so much talk now, at least in Europe, of children’s right to have a say in what’s being designed and decided for them. “It’s time to put children’s voices into all the debates and for their voices to be heard,” Prof. Sonia Livingstone said, referring to Article 12 of the UN Convention on the Rights of the Child.
All the platforms, tools and play spaces – as well as data brokers and a whole lot of other businesses we’ve never heard of – are capturing the data we and our children are leaving on them, and applying machine-learning AI to make better and better guesses about how we like to use the technology and what advertising and other content we like to engage with … and, well, capture more data. And so on.
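To picture that loop in the simplest possible terms, here’s a purely illustrative sketch – the categories, scores and update rule are invented, not any platform’s actual system. Every interaction becomes a data point, the data sharpens the system’s guess about what will hold a child’s attention, and every new recommendation generates still more data:

```python
# Purely illustrative sketch of the data -> prediction -> more-data loop.
# The categories, scores and update rule are invented; real recommendation
# systems are vastly more complex, but the feedback loop is the same shape.

import random
from collections import defaultdict

interest_scores = defaultdict(float)   # the platform's evolving "guess" about one user

def record_interaction(category: str, engaged: bool) -> None:
    """Every tap, like or skip becomes a data point that updates the profile."""
    interest_scores[category] += 1.0 if engaged else -0.2

def recommend(categories: list[str]) -> str:
    """Show whatever the profile currently predicts will hold attention."""
    return max(categories, key=lambda c: interest_scores[c])

categories = ["gaming", "fitness", "fashion", "news"]
for _ in range(20):
    shown = recommend(categories)
    engaged = random.random() < 0.7        # stand-in for the child's response
    record_interaction(shown, engaged)     # ...which feeds the next guess

print(dict(interest_scores))   # the profile that advertisers and data brokers value
```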
Which brings us back to this important book. It has eight chapters, from “The Datafication of Everything” (doesn’t this make the “screen time” discussion sound pretty “dark ages”?) to “Datafied Futures,” with the chapters in between discussing identity production; mediatized parenting, homes and schools; and how young people’s peer networks are datafied. It explains phenomena such as “automated peer pressure,” “transcendent parenting,” “device-ification of mothering” and “privacy boundary turbulence.” The authors cite Prof. Veronica Barassi’s argument that “everything has become onlife,” not just online or offline.
But the book is not about children as victims of tech. While providing a clear-eyed view, it moves us past the supremely unhelpful moral panic that has defied decades of scholarship. Datafied childhood is complicated. For good or ill (or perhaps good and ill), children, like us, love the convenience end of the privacy-convenience spectrum, the authors show us. Kids are unaware of the surveillance potential because they’re no more educated than we are about how “data capitalism” affects privacy. They feel empowered by instant, everywhere access to their friends, fun, inspiration and information about themselves and their world. They, like us, use apps and technologies for self-improvement and physical and mental wellbeing – and, in keeping with their developmental stage, for identity production. The authors cite a survey finding that 52% of 11-18 year-olds said they use tech tools to regulate their bodies – tracking sleep, calorie intake, mood, exercise, heart rate, menstruation, etc.
‘Marii’ and ‘Heleen’
“All through the centuries individuals have practiced ‘technologies of the self’,” the authors write, quoting philosopher Michel Foucault (1988). Then they cite Prof. Jill Walker Rettberg writing in 2014 that smartphones could be seen as real-time diaries and, at the beginning of Chapter 3, tell the story of Marii and Heleen, who got their first smartphones at age 7 and began “(un)consciously writing and self[ie]-shooting themselves into being, sharing bits and pieces about their daily lives, their likes, and dislikes, sharing data that is seemingly irrelevant to anyone outside their immediate family, sharing the everydayness of their lives and their identities.”
And all this becomes data used by people and entities our children don’t even know exist. Think about their future (or present) avatars in virtual reality spaces when you read “embodiment” in what Mascheroni and Siibak write here: “Sometimes, without any conscious agency from the user, and often without us (fully) acknowledging it, our phones are generating and materializing inherent dimensions of human embodiment and practices leading to the creation of human data assemblages.” Technology, media and adolescent development have become a total mashup.
Self-monitoring too
There’s a lot of talk about social comparison on Instagram, for example, but what about “self-imposed social comparison” and comparing self to self, or self-tracking? “On the one hand,” the authors write, “gamified self-improvement apps evoke a certain kind of agency – that of an active subject willing to succumb to self-governance while striving for self-realization and self-management. On the other hand … as this constant and willing self-surveillance is embedded within a series of gamified techniques that nudge the individual into self-discipline, the self-tracker is turned into a docile body, that is, a body ‘that can be subjected, used, transferred, and improved,’” they add, quoting Foucault. And who wrote the algorithm that decides where one’s weight or fitness should be? And how does that work for adolescents in India, Kenya or the Philippines?
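To make that question concrete, here’s an invented-but-representative sketch of the kind of logic a self-tracking app might bury in its code. The numbers below are illustrative, not from any real product; the point is that somebody chose them, they carry assumptions (adult-derived BMI cut-offs, a step goal someone decided was “ideal”), and the nudges built on top of them are exactly the gamified self-discipline the authors describe:

```python
# Invented example of hard-coded "norms" in a hypothetical self-tracking app.
# The numbers are illustrative, not from any real product; note that the BMI
# cut-off below comes from adult reference data and doesn't fit adolescents,
# whose BMI is normally assessed with age- and sex-specific percentiles.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / (height_m ** 2)

def daily_step_goal(recent_avg: int) -> int:
    # Classic gamified nudge: always a bit more than you managed lately,
    # capped at a round number somebody decided was "ideal".
    return min(int(recent_avg * 1.1), 10_000)

def feedback(weight_kg: float, height_m: float, steps_today: int, recent_avg: int) -> str:
    goal = daily_step_goal(recent_avg)
    msg = "Streak kept! +50 points" if steps_today >= goal else "Streak lost - try harder tomorrow!"
    if bmi(weight_kg, height_m) > 24.9:          # adult cut-off, questionable for teens
        msg += " Tip: log your meals to reach your goal weight."
    return msg

print(feedback(weight_kg=66, height_m=1.58, steps_today=7200, recent_avg=8000))
# -> "Streak lost - try harder tomorrow! Tip: log your meals to reach your goal weight."
```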
Think too about what the authors call “the performative nature of self-tracking.” Children have always performed for parents and other loved ones – drawing pictures, building, swimming, acting, singing, dancing, writing, often calling for their parents’ attention. Now they do so in digital spaces, but also for the commercial benefit of businesses known and unknown. Are we talking with them enough about what that means for the privacy they can and can’t control and about how algorithms feed on and learn from all that we share?
Agency for safety and privacy
This book shows us that the solution to pervasive datafication can’t be more adult control and less child agency. Mascheroni and Siibak’s work is, to my mind, the next milestone in our journey away from technopanic and toward approaches to working with children that factor in their views and lived experiences online as well as offline. Previous such milestones looked at four key perspectives on youth+digital: youth (Hanging Out, Messing Around and Geeking Out, Mizuko Ito, ed., the culmination of more than two dozen researchers’ three-year study of teens’ digital media practices at home, at school and in after-school programs, 2009 and 2019 editions); parenting (the 2020 book Parenting for a Digital Future, by Drs. Sonia Livingstone and Alicia Blum-Ross); online safety and moral panic (the 2016 book Framing Internet Safety, by Prof. Nathan Fisk); and online safety and the “control paradigm” (the 2019 book Young People in Digital Society: Control Shift, by Prof. Amanda Third et al.).
And while Control Shift called out the control and surveillance external to kids, Datafied Childhoods illuminates the paradoxically internal and external monitoring, performance and identity production that is only increasing in the phase we’re moving into – to the unqualified benefit of the companies behind the apps, games and services kids use online. Children and young people are doing their normative developmental work in a datafied world that isn’t educating them enough about how that world works. This needs to change. Datafied Childhoods helps.
Related links
- Digital media provides an “infrastructure” for young people’s interactions with peers and the identity development involved – I’d say a parallel third infrastructure to family life and school life. For insights into the family life part, see my review of Parenting for a Digital Future, a book that I believe helps parents factor in this third infrastructure rather than fear it.
- “The Unnerving Rise of Video Games that Spy on You: Players generate a wealth of revealing psychological data—and some companies are soaking it up” in Wired
- “The Age of AI,” a documentary series on YouTube
- Why teens preferred the metaverse to phone calls, FaceTime, etc. during the pandemic: For one thing, “seeing virtual bodies as avatars created a sense that they were co-located in the same physical space,” found Divine Keetle-Maloney in his PhD research (touched on here, with links to more; here are his own highlights from his dissertation)
- “The metaverse will fuel massive innovation (and Facebook isn’t the metaverse)”: This article in VentureBeat, by a writer who is, yes, very invested in AR (Faisal Galaria, CEO of Blippar), may be a bit aspirational, but it also has a good grasp of tech history and is one of the first I’ve seen that points to how Web3 and the metaverse intersect: “The real promise of the metaverse is new data-rich experiences and services that are faster, better, and cheaper, whether that’s in finance, virtual socialization, business meetings, healthcare, [etc.].” All those use cases will be decentralized, Galaria argues. “In fact, decentralization is not just a feature of an open metaverse, it’s a core tenet that will avoid bottlenecks and enable interoperability that traverses walled gardens” (the way the first manifestations of peer-to-peer – Napster, Kazaa, BitTorrent and even Skype – did some 20 years ago). “The next generation of peer-to-peer services will enable even greater participation of users who, in a decentralized metaverse, can work directly with each other and trust the network for things like money transfer and social media rather than relying on a centralized operator or service. We’re looking at you, Meta and Facebook.” Right, but note that he says nothing about user safety.
- This thoughtful, pioneering conversation about user safety and the “decentralized Web” is just one sign of how many people are thinking about datafication and data ownership right now. The participants are Tim Lordan, co-founder of the Decentralized Future Council; Charlotte Willner, executive director of the Trust & Safety Professional Association; Janet Haven, executive director of Data & Society; and Alex Feerst, CEO of Murmuration Labs.
- “The creator of Second Life has a lot to say about all these new ‘metaverses’,” in PCGamer.com
- For a brief glimpse of all the things children have learned about safety and wellbeing to date, read “Risks, Opportunities and Risky Opportunities,” by Drs. Leslie Haddon and Sonia Livingstone
- Human effects on tech (we usually hear about it the other way around): “Internet ‘algospeak’ is changing our language in real time…. To avoid angering the almighty algorithm, people are creating a new vocabulary” in the Washington Post
- What I’ve written about the metaverse in recent months: “The metaverse and the Meta part” and “Online safety for 2022: 8 things we need to see”