A flash of insight occurred the other day as I was listening to a conversation between two remarkable lawyers: Vivek Maru, founder of Namati, the nonprofit organization behind the Global Legal Empowerment Network of more than 3,000 organizations and 13,043 individuals in 170+ countries, and Preeta Bansal, who was general counsel and senior policy advisor to the federal Office of Management and Budget during the Obama administration.
In that flash, four themes converged in my head: social media’s future, safety and justice on global platforms, offline law and justice, and content moderation. I think about content moderation a lot, partly because it’s vital to people’s online safety and partly because I know so many great people working in all parts of the field, from the making and enforcement of platform rules and content policy, to research about the work and its effects, to supporting content moderators as clinicians, directors of professional associations, educators and conveners.
The insight was that content moderators could be, probably are, “community paralegals” for global digital justice, helping to make the platform legal systems work better for billions of Internet users around the world. That is because…
- Social media platforms are legal systems, global digital legal systems that interact with governments and offline legal systems worldwide (see the sidebar below for an explanation)
- Offline social justice and injustice obviously spill onto social media platforms, and vice versa
- Neither the platforms nor offline laws and law enforcement can fully protect people from online harm and injustice, and many governments struggle to understand how digital legal systems work, work against them or even exploit them (one reason why Filipina journalist Maria Ressa won the 2021 Nobel Peace Prize)
- Top-down online protection is increasingly inadequate, partly because of all the layoffs of recent months, which demonstrates the need for distributed, grassroots (or peer-to-peer) justice online, i.e., the digital version of Namati’s Legal Empowerment Network that “puts the power of law into the people’s hands” with the help of grassroots justice workers or “community paralegals.”
I know that more than 130,000 tech workers have been laid off since last November 1. I know many, among them content moderators, were laid off in extremely inhumane ways, which is one reason why I had to resign from Twitter’s Trust & Safety Council last month.
So this may seem like terrible timing to be floating this idea. But could this possibly be a source of inspiration for people in the field – the idea of helping to create a digital version, or arm, of a global network dedicated to justice for Internet users all over the world who don’t have access to justice on the platforms they use? Maybe some of you are already forming digital legal aid groups for colleagues. Possibly some will be inspired to bring their deep knowledge of platform moderation to helping – or empowering – Internet users from the outside. Would content moderators doing the nearly impossible, often thankless, job of keeping people safe online (or possibly laid off from doing so) find inspiration in the idea of being digital “community paralegals”? It seems to me content moderators are the natural standard bearers, teachers, organizers, user advocates and “bridge between the law [community standards] and real [everyday] life” online as well as offline.
“What community paralegals are is organizers who are deeply rooted in their places” in those 170 countries, Vivek Maru said – organizers deeply committed to finding practical solutions to intractable problems. The offline Legal Empowerment Network is as global as social media platforms are. But now, thanks to the Internet, we have (digital) interest communities as well as geographical ones, and the former need community paralegals and TS&J (trust, safety and justice) workers too.
Does this sound too blue sky or only logical? By definition, social justice – civility and safety (online and offline) – must be equitable and so needs to be bottom-up as well as top-down. It’s increasingly distributed. And a growing number of people in and out of tech are calling for it online – people in professional associations such as TSPA and #TSCollective, on decentralized platforms such as Mastodon and in organizations such as All Tech Is Human. I have a feeling many more are quietly advocating and working for user legal empowerment behind the social apps and platforms as well.
Long and short, I’m testing the waters. This is just a thought, an inspiration. Its only power lies in whether it resonates with others from multiple perspectives, especially experts in social media content moderation and policy. I believe any movement forward has to be at least cross-disciplinary, and ideally cross-sector as well. You too?
[SIDEBAR:] By way of explanation…
So in case you’re looking for a little more detail on the above thinking:
Social media platforms are many things, among them digital legal systems – global ones. You might call them digital justice systems, though – not unlike with national justice systems around the world – many, many Internet users feel they’re not being served well by them, even if responsible platforms are at least working to serve their users better.
To explain the analogy a little further: Social media companies have platform-wide rules, systems for enforcing them, and “judges” (usually called content moderators) trying to ensure they’re enforced correctly and, if not, dealing with users’ “appeals.” One company, Meta, even has a global “court of appeals” of sorts called the Oversight Board, now an independent organization, which considers a tiny proportion of those appeals and advises Facebook’s “justice system” on content policy. Meta and the Oversight Board want the Board to serve other platforms as well as Facebook (the needed cross-industry piece).
Also global social institutions
All that is only part of the justice work needed, because the social media platforms have become global social institutions as well as corporations – a historically unprecedented situation that challenges governments as well as platforms. Yet these particular social institutions act only as companies beholden to shareholders or, as now in Twitter’s case, a multi-billionaire owner.
Social justice in social media
We might consider what social justice in social media looks like. I can’t presume to paint that whole picture, because it takes the perspectives of all the social groups who participate in these spaces. But, at baseline, I believe it includes, and must mirror, the safety, equity and justice everybody deserves but doesn’t always get in our offline lives. It requires education about people’s rights and, possibly, organization and mobilization to protect them. So grassroots offline legal empowerment needs to be mirrored online.
Content moderators and others as justice workers
One of these stakeholder groups is actually the one most qualified to do this work in digital spaces: content moderators. They are the people who understand how platform “legal systems” work because they are enforcing their communities’ rules and policy. These are stakeholders embedded in online communities, and I know many of them care greatly about the safety and wellbeing of the people in those communities (social media users). (Of course, content moderators also need their own rights, including workplace ones, upheld.) But platform workers aren’t the only stakeholder group whose help is needed. Users of all ages need to be educated – the work that many of us on the outside have been doing for some time – and there are certainly other stakeholder groups, such as informed policymakers who see that this is an ecosystem of care of which they are just a part. All parts of the ecosystem are informed about and supportive of one another’s roles.
Front end, back end
Justice for social media users must include how their data is handled at the back end as well as how their content is handled at the front end. Rules enforcement, particularly harm mitigation, has to be equitable. So everyone involved in this work – the people who develop platform rules, the people who enforce them (content moderators) and the people who write the algorithms that contribute to content moderation – could contribute greatly to grassroots platform justice supported by “community paralegals.”
Maru’s vision
What Vivek Maru saw 20+ years ago in Sierra Leone, where there were few lawyers and they were inaccessible to most people, was a great need for what he called “community paralegals” to help people in their local communities understand and “turn the law from an abstraction or a threat” into something they could leverage for justice where they lived. These paralegals don’t assist lawyers but rather the people with the problems. They find practical solutions to intractable problems. “They demystify law, break it down into simple terms, and then help people look for a solution.” I believe there are digital community paralegals operating right now in the form of Internet helpline workers in Australia, Europe, New Zealand and the UK, and of the NGOs the platforms call “trusted partners,” who have offline context for the digital harm they escalate to the platforms.
Behind the platforms
As for the many workers behind the platforms and at outsourcing companies, so many of them have the qualities, skills and commitment of community paralegals, and they deserve the same kind of inspiration that members of the Global Legal Empowerment Network draw on. Some might help users obtain justice from within the digital legal systems (the platforms); others might work from professional communities such as #TSCollective and TSPA; others, such as All Tech Is Human, might work from their role of fostering cross-sector business+government+academia collaboration; and still others might be engaged in digital social experiments such as Mastodon.
Even governments+platforms not enough
Why are digital community paralegals so needed? Because governments and civil society around the world are still trying to figure out what to do about harm incited, spread and organized online. Some governments are corrupt, exploiting platforms and harming citizens, while platforms lack the understanding, and possibly the will, to stop the distant harm (e.g., the digital injustice that led to genocide in Myanmar in the last decade). I believe, and I don’t think I’m alone, that these global corporations, even with the resources some of them have, cannot provide people and communities worldwide with the justice they deserve online – not at the scale at which they operate. For example, one content moderation expert recently said, “In any given day [a platform’s content moderation team] could get 20,000 appeals in an hour.” And that’s just appeals. Way back in 2016, CNN reported that Facebook got 4x that in reports of abuse (these statistics are hard to come by).
Where things are headed
At the end of last year, I wrote about where I believe the algorithmic side of content moderation is headed. This is about where the human side is likely to go. The toolkit for safety and justice online is not only growing and diversifying; people everywhere are recognizing the need for a multi-pronged approach. I suspect the experts in this space, the original Trust & Safety workers, will help make global digital justice and safety more and more distributed – outward and from the bottom up – through education, advocacy, organization and mobilization, as offline “community paralegals” have been doing for a long time now. I welcome your thoughts.
Related links
- Another description of “community paralegals” is that they’re grassroots legal aides who use their training in basic law and their skills in mediation, organizing, education and advocacy to help individuals and communities find concrete solutions to instances of local injustice (the UN estimates that some 4 billion people in the world “fall outside the protection of the law,” according to H. Abigail Moy, director of the Global Legal Empowerment Network). They’re the “bridge between the law and real life.” Here is Namati’s guide to their work.
- “People can’t improve their lives without exercising their rights,” Namati founder Vivek Maru said in his 2017 TED Talk (here’s the Jan. 28 conversation with him at ServiceSpace.org). They need to know what their rights are before they can exercise them. This includes children. We need to teach them about their rights under the UN Convention on the Rights of the Child, including their digital rights as described by General Comment 25, adopted by the UN Committee on the Rights of the Child in 2021.
- Never enough: “Today, 15,000 workers, the overwhelming majority of them employed by third-party vendors, police Facebook’s main platform and its Instagram subsidiary. About 10,000 people scrutinize YouTube and other Google products. Twitter, a much smaller company, has about 1,500 moderators.” That was in a June 2020 report from New York University’s Stern Center for Business and Human Rights, “Who moderates the social media giants?” – covered in MIT Technology Review as “Facebook needs 30,000 of its own content moderators, says a new report”
- Early investigative reporting: “The Secret Rules of the Internet,” the award-winning article on the field of content moderation by my friends Catherine Buni and Soraya Chemaly. It was published by The Verge about seven years ago, when people outside the field had barely heard of it.
- Seminal scholarship: Two years later, in 2018, Yale University Press published a pioneering scholarly book on the subject, Custodians of the Internet, by Tarleton Gillespie (see this review)
- About an extraordinary gathering of lawmakers trying to figure out how to define and regulate social media platforms in 2018
Thanks to the beautiful volunteers at ServiceSpace.org who hosted the conversation with Vivek Maru that inspired this.