Six years ago Prof. Gillian Hadfield at the University of Toronto wrote in QZ.com, “What we don’t hear nearly enough is the call to invent the future of regulation.” It seemed the insanely fast development pace of multiple kinds of tech was becoming almost too hot for regulators to touch. Hadfield proposed “super-regulation,” which she said would get governments “out of the business of legislating the ground-level details.” She was proposing that “a new competitive market of private regulators” be established so governments could instead be in the business of making sure regulation operated in the public interest.
That was the best idea for tech regulation I’d seen in my 20+ years in the child online safety space, I wrote back then. But that was then; this is now.
Working in silos for children’s safety and wellbeing – even in a new regulatory class, or across a whole industry, across academic disciplines or across NGOs – is no longer enough. Each stakeholder group working for children’s wellbeing needs more than its own vantage point. Child online safety is too complex a problem. For one thing, neither party to the traditional arrangement – government and business – gets regular exposure to the expertise of the intended beneficiaries of their work: young people. We innovate better when we understand the needs and interests of those we’re innovating for.
We also innovate better when we understand each other’s needs and constraints – in this case, those of our fellow adult parties to minors’ online safety: platforms, regulators, researchers and the helper organizations that can provide the much-needed offline context that youth share with them and that platform moderators never have.
“Evidence suggests that productive and effective regulation is built on the cooperation of stakeholders, the identification of shared purpose and desired outcomes, the creation of incentives and continuous learning from data and feedback,” write Ioanna Noula and Tijana Milosevic in their response to the European Commission’s call for evidence for its guidelines on protecting minors under the Digital Services Act.
The operative phrase there, in the face of constant technological change, is “continuous learning” – or, I suggest, continuous co-learning by all the stakeholder groups. This doesn’t typically happen in traditional top-down or adversarial regulatory arrangements, yet it’s exactly what conditions of constant change demand. So what creates the right conditions for continuous co-learning? A regulatory sandbox.
“We propose that the use of regulatory sandboxes already encouraged in privacy and AI-related EU regulation is going to be particularly advantageous for rights-based, participatory implementation of the DSA, including Art. 28, and the development of solutions that will advance children’s best interests in the digital world,” Drs. Noula and Milosevic write.
Regulatory sandboxes are not yet well-known, but they are definitely a growing trend. In more than 50 countries there are regulatory sandboxes in operation or being tested in the fields of fintech (financial technology), healthcare, transportation and artificial intelligence, according to the European Parliamentary Research Service.
A regulatory sandbox would take both online safety and regulation thereof to the next level for three reasons:
- It’s a “safe” place for all the experts in the space – platform Trust & Safety policymakers and workers, regulators, youth and researchers – in terms of both regulation and collaboration. A regulatory sandbox offers, first, a break or reprieve from regulatory scrutiny or action while sandboxing is happening, so participants can freely innovate and test products and services for greater safety. Second, it offers “safety” in the sense of agreed-upon respect for all perspectives and the dignity of each participant – professional and psychological safety under a rule that everyone’s expertise and perspectives have value in the process.
- Its aim is continuous co-learning. So it’s a way to keep up with rapid change in tech, its effects and its young users’ practices. Participating regulators get to learn about new safety technologies, features and procedures before or as they emerge; businesses can understand and work regulatory compliance into product and service design (“safety by design”); all adult parties can continuously learn about potential impacts on youth from youth as their thinking and practices evolve; participants both under 18 and 18+ can stay current on quantitative research; and the academic participants get continuous inputs for qualitative research from young people themselves.
- It’s rights-based. Among the rights of participation that the 35-year-old UN Convention on the Rights of the Child guarantees everybody under 18 is the right to have a say in all matters affecting them. Nearly every country on the planet has ratified the CRC (the United States is the lone exception), so an implementation of the DSA, for example, that did not include input from Internet users under 18 would be out of compliance with it. For more than two decades, online harms to children have been defined largely by adults. “We argue,” write Noula and Milosevic in their comment, “that in addition to the focus on children’s right to privacy, safety and security, the [DSA implementation] guidelines should give due weight to children’s right to participation and freedom of expression.”
Almost two years ago, at the annual gathering of the Family Online Safety Institute in Washington, Australian eSafety Commissioner Julie Inman Grant and colleagues announced a new Global Online Safety Regulators Network aimed, I think, at counterbalancing the global scope of social media platforms (on GOSRN’s webpage you’ll find position statements on the need for regulatory coherence and coordination and on regulation with regard to human rights). I believe a regulatory sandbox for young Internet users’ care and redress, one that starts with implementing what the DSA requires, could become a valuable tool for a global regulatory network.
It’s never easy to get something unprecedented up and running, but conditions are ripe for the Child Online Redress sandbox, and founder Ioanna Noula and our team and advisers in seven countries and counting will persevere. Because it’s time to collaborate and innovate for safety safely, not in the old non-productive, adversarial, “grown-ups”-only way (at least when we act like so-called grown-ups). It’s time for safety by co-design.