As a parent or educator, have you ever wondered what bullying and harassment look like from the other side of your kids’ favorite online spaces – to the people who run them, moderate the action, and get those “abuse reports” users click on for all kinds of reasons? Moderators and community managers have an indispensable perspective to bring to the public discussion about cyberbullying, so I’m posting here key insights I’ve gained from leading experts in the moderation field.
Not all the negative behavior we see in kids’ online games and virtual worlds is cyberbullying. In fact, very little of it is. Moderators tell me about 90% of the abuse reports they get from kid users are “false positives” – testing the system, attention-seeking, acts of boredom, etc.
‘Abuse reports’ of all sorts
What gets reported is a mix of behaviors, whether or not they violate a site’s Terms of Service. Community management expert Izzy Neis wrote in an email that “the majority of reports we receive fall in these categories” (I’m both quoting her and folding in some things I’ve learned generally):
- “Testing the ‘Report’ button” just to see if or how it works
- Using the button to pick on “clueless kids” by reporting peers who haven’t done anything wrong (typically the reporters are just playing the victim, acting out of either boredom or social aggression)
- Using the abuse report button to create drama (sometimes to distract the moderators, sometimes to power-trip a bit or to see how worked up peers or moderators might get)
- Reporting peers with “different sensibilities” or values or just differences (“Hey she said omg – and g stands for god, and we’re not allowed to say that word! He’s using it wrong.” or “Crap is a bad word” or “I don’t like his avatar, it’s ugly.”)
- Reporting trolls or spammers (“He won’t stop following me” or “She keeps saying moo and won’t stop” or “She’s in my virtual house and won’t leave.”)
Izzy adds that “users reporting real issues, such as danger, abuse, or other users talking about suicide or truly harassing others come in very small spurts, and Moderators have to weed [through] all the false-positives to get to these.” But all the weeding is worth it, because small problems can escalate in-world, raising the costs to both users and companies – and sometimes there are real-life dangers, at home or school, behind kids’ cries for attention.
The soil of in-world bullying
Here’s the ground from which the “bullying” (either claims of it or actual bullying) arises, and not just among young people:
- “Misunderstanding”: a given where a user group still growing up and in the midst of social development is concerned. Kids act out when they don’t yet know how to articulate their feelings. Misunderstanding causes reactions out of context, and that in turn can create a hypersensitivity that causes reactions out of proportion. Avatars can be clunky – unlike facial expressions and body language, they can’t convey subtleties, such as when an otherwise mean comment is meant as humor. So misunderstanding happens.
- “Action and reaction”: In an online game, “you’re missing the physicality of sports, fights, and arguments,” Izzy wrote. “When one user ‘causes’ something, the only ‘reaction’ they have is an animated reaction. I’d be very interested in the research between the brain acknowledging an action on the screen [without the option or cue of a] physical response [like a frown, gesture or sad expression], therefore having to add force in the text/words they choose to accompany the animated reaction. I wouldn’t be surprised if there is a very interesting correlation there.”
- “Energy spikes: Bored kids (i.e., the majority of those spending large amounts of time online in virtual worlds, etc.) want a spike in adrenaline [to lose] their boredom,” Izzy wrote. “In fact, I’d go as far as saying that – within the realm of online games and worlds – the majority of ‘jerks’ or ‘bullies’ are just really bored people (not just kids) that might have had a bad day or a boring day, and they’re looking for some sort of dramatic fix – something to get their socially developing brains challenged and fired up. And if they don’t get the fix they’re looking for, they keep pushing and pushing until something happens.”
- “Attention: A lot of the more ‘dangerous’ users who talk about abuse and suicide are often looking for attention, calling for help. But there are also a certain number of users who use these words ‘suicide,’ ‘kill myself,’ ‘kill me,’ and other scary terms to get attention.” Izzy finds that “a lot of times hiding behind an online fantastical identity makes a person think such words don’t have the same ‘911’ effect that people outside of the community might associate with them if looking in from the outside. I remember a very popular kid broadcaster site that had a forum with tons of kids who competed in posts for who had the worst/most/extreme form of cancer. None of them had cancer – but it was a competition for attention within the small community” (which shows how important it is for moderators, parents – everybody – not to take what they see in online communities too literally, or at least not to react too reflexively).
Of course moderators “still have to research the conversations and report the users’ claims to the parents or authorities” if they have identifying information and privacy law allows them to do so. The context of community incidents often gets trumped by the context of societal concerns, Izzy wrote (e.g., “saying ‘suicide’ in a community can be like saying ‘bomb’ in an airport – it’s just as questionable and just as concerning!”).
Mirroring offline life
But let’s look at context a little more closely. What moderators see in games and virtual worlds is often “replicated behaviors experienced on the playground, during play dates, in school lunchrooms, etc.,” Izzy wrote. These behaviors aren’t unique to online spaces. The values children absorb at home and school – but especially home – are replicated online too, for good or bad.
“For as much as we hear about cyberbullying and negative interactivity online, we see amazing [positive] interactivity as well,” Izzy pointed out, “kids sticking up for each other, bonding, kids learning how to socially navigate situations, and kids exploring social development in a fantastical environment (where they don’t have to hold their true identity responsible). Kids aren’t allowed to spend time roaming the neighborhood as they could decades ago, so now they often do their play-time role play online. Fights may happen – and the education needs to come from the conversation between the parent and child about Netiquette and self-protection and being able to forgive, understand, empathize, learn, move on, improve.” That work we do with our children simply can’t exclude their online activities and behavior, and these digital spaces provide great opportunities for them to practice respect and civility.
The industry’s piece
As for what social media companies can do: consumer education and product development grounded in research about child development, online risk, and social media use. Companies also need to train their staff and build tools, policies, and environmental conditions based on that research – products and services that inform, educate, protect, and teach users to protect themselves and their peers. “Context and understanding are two elements I try to instill in my moderation staff,” Izzy wrote. “Moderation’s goal is to stick to the [online game’s or world’s] Terms of Service/Code of Conduct, and provide a healthy experience overall, while doing our best to address minor issues and respect context and conduct.”
All these insights point to a new media reality: safety is a shared proposition. It’s maintained collaboratively by users communicating and socializing in real time; by companies creating the atmosphere, values, story line, and safety tools of online environments; and by the influencers behind users in offline life – parents, educators, friends, co-workers, and anybody else who helps shape who users are offline as well as online.