Remember “Blue Whale”? Almost five years ago, when I was getting to the bottom of that murky hoax, it wasn’t yet understood as one. It was being called a “suicide game,” and those two words were quite literally scaring parents around the world.
I was looking all over the Web for reliable sources and found my best one – still one of the world’s top experts on the subject, I believe – to be Georgi Apostolov, a former journalist, media literacy expert and head of Bulgaria’s Safer Internet Center. His goal was to stop the panic from taking over his country. There was major concern about misinformation and disinformation in many countries – maybe even more then than now – and this was “fake news” too, a particularly dark example of it. There’s a cultural and geographical element to deconstructing a viral hoax, and Apostolov’s understanding of Russian media, politics and culture, as well as of the news media and social media, was hugely helpful. I’ll come back to those elements in a moment, but first….
Gaining ground
Fast-forward to this week and the report “Exploring effective prevention education responses to dangerous online challenges,” commissioned and released by TikTok, and it’s clear we’ve come a long way. Sensationalist news reports and hoaxes can now be seen better for what they are, thankfully, and in this report we have a very complete picture, from multiple perspectives, of all that has been learned about this social problem to date.
I’m biased because I got to participate in the project, but my bias can’t possibly discount the multi-country study TikTok commissioned or the input from the adolescent development and clinical psychiatry experts consulted: Drs. Gretchen Brion-Meisels in the US and Richard Graham in the UK. The study, which covered both hoaxes and challenges (ranging from fun and harmless to risky to dangerous), surveyed 10,000 teens, parents and educators in Argentina, Australia, Brazil, Germany, Italy, Indonesia, Mexico, the UK, the US and Vietnam.
It was affirming to see the finding that most kids just watch rather than take part in online challenges: only 21% of respondents (aged 13-19) said they’d participated in an online challenge of any type, “whether choosing to post it or not.” And, though 2% is of course too many, only 2% have taken part in a challenge they considered risky and dangerous, and “only 0.3% have taken part in a challenge they categorized as really dangerous” (please see the report for much more).
Those 9 things
So back to the elements – what makes new media hoaxes so challenging….
- They’re not new or unique to new media – remember the “Truth or Dare” game or chain letters? So there’s something enduring about dares, challenges and hoaxes that no amount of content moderation or education can totally overcome.
- They’re global, and the authorities or institutions that tend to address them are typically local or national and not accustomed to looking to expertise in other countries for models of intervention and prevention education.
- They trigger and feed on universal fears of child victimization, especially when suicide is an element and when concerns about social media are heightened, which makes them perfect for clickbait.
- They have cultural and political components. If it behooves the news media or even the government of a country to exaggerate a threat to distract citizens or mobilize them against an enemy, a hoax can come in handy or be spread intentionally and rapidly, even by the authorities themselves. And well-meaning NGOs and subject matter experts often feel pressure to share their expertise before they have a handle on the media dynamics at play.
- Well-intentioned people spread them too – to warn others off – but in doing so they extend the reach of the hoax or dangerous challenge and, however unintentionally, of its harmful effects.
- Technology plays a role, both in spreading hoaxes and in incentivizing their human spreaders. Hoaxes are spread not just by people but also by algorithms optimized for user engagement, which recommend grabby, sensationalist or shocking content. And clickbait creators get rewarded financially by social media systems (see this BBC story). Technology can also help, of course. For example, TikTok’s countermeasures include banning hashtags about a hoax, detecting keywords referring to it and deleting content, blocking search results, serving popups with help info, tagging videos to make viewers aware that the content isn’t 100% safe, seeking local NGO support (like a Safer Internet Center) where a hoax is going viral, etc. (There’s a simplified sketch of how that kind of keyword-based detection and response might fit together right after this list.)
- They can morph and become harmful. They’re hoaxes – not real and therefore not harmful at the start – but they can become harmful as they’re spread. Social norms research on how perception affects behavior helps explain why. Take bullying, for example. Social norms researchers conducted a study in five middle schools where young people thought bullying was rampant in their schools but it actually wasn’t. When the students found out that most of their peers didn’t engage in bullying, bullying went down further still, in direct proportion to the changed perception. So if we tell kids the truth – that most of their peers don’t engage in dangerous behavior, which we know from this new study – many are less likely to engage in it themselves.
- Repeating a lie can give it an “illusion of truth,” psychologists tell us. People start to believe the lie, so spreading it – whether with good or bad intentions – is harmful. It is “critically important to not repeat falsehoods, even while we attempt to debunk them, lest we legitimize lies by reiteration itself,” Scientific American reports.
- Addressing hoaxes calls for multiple types of expertise. Getting suicide prevention experts’ help with a hoax about a suicide “game,” for example, is great but not enough – the subject matter expert doesn’t necessarily understand the media factors. Which types of expertise are needed depends on the content of the hoax. Media literacy is always needed, as is knowledge of child and adolescent development and clinical psychology. For Blue Whale, sociocultural knowledge – such as of Russia’s very high teen suicide rate – was needed in addition to suicide prevention expertise.
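Since the “technology” item above is the most concrete of the nine, here’s a minimal, purely illustrative sketch in Python of how the keyword-and-hashtag side of such countermeasures could fit together. It is emphatically not TikTok’s actual system: every name in it (HOAX_TERMS, Post, mentions_hoax, moderate_post and the response labels) is a hypothetical stand-in for the steps described above – detect watched terms, then suppress the content in search and point viewers to help resources.

```python
# Illustrative sketch only – not any platform's real moderation system.
# All names here (HOAX_TERMS, Post, moderate_post, etc.) are hypothetical.
from dataclasses import dataclass

# Hypothetical watchlist of terms associated with a known hoax.
HOAX_TERMS = {"example hoax", "examplehoaxchallenge"}

@dataclass
class Post:
    text: str
    hashtags: list[str]

def mentions_hoax(post: Post) -> bool:
    """Return True if the post's text or hashtags match any watched term."""
    text = post.text.lower()
    tags = {t.lower().lstrip("#") for t in post.hashtags}
    compact_terms = {term.replace(" ", "") for term in HOAX_TERMS}
    return any(term in text for term in HOAX_TERMS) or bool(tags & compact_terms)

def moderate_post(post: Post) -> list[str]:
    """Map a detection to the kinds of responses described in the list above:
    remove or limit the content, keep it out of search, point viewers to help."""
    if not mentions_hoax(post):
        return ["allow"]
    return [
        "remove_or_downrank",   # delete the content or limit its distribution
        "block_in_search",      # keep the hashtag/keyword out of search results
        "show_safety_popup",    # serve a popup with help information instead
    ]

if __name__ == "__main__":
    demo = Post(text="Have you tried the example hoax yet?",
                hashtags=["#ExampleHoaxChallenge"])
    print(moderate_post(demo))  # ['remove_or_downrank', 'block_in_search', 'show_safety_popup']
```

Real platform systems are of course far more sophisticated – machine-learned classifiers, human reviewers, appeals processes – but this basic detect-then-respond flow is the general idea the list item refers to.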
One final thing: The way we address hoaxes and harmful challenges needs to be respectful of young people’s intelligence. I was delighted to see that the global study found that only 3% of teens believed a hoax they’d recently seen was 100% real (a lower share than among the parents and teachers surveyed), that the vast majority of teens don’t participate in risky challenges and that they want more information to help them assess the level of risk. Let’s give them more information not just on how risky behaviors can hurt their health. Let’s also give them more information on all of the above – on how challenges and hoaxes work. A great way to inoculate them against the dangers of manipulation is to focus less on trying to remove all risk and more on supporting their (completely developmentally appropriate) risk assessment with solid information as it emerges. That way, they grow their resilience and help each other – as well as us – reduce harm.
Related links
- The BBC’s thorough coverage of the TikTok report and its in-depth 2019 “origin story” on the Blue Whale phenomenon – wonderful 20/20 hindsight
- Here’s an example of a news story on the TikTok report that’s really a commentary. It’s good to keep up the pressure on social media to do better, but I feel it’s also important to acknowledge where the platforms are making improvements and to be aware of the news industry’s business model too – of how that model incentivizes news outlets to grow readership/viewership by confirming consumers’ biases and fears.
- Example of sensationalist Blue Whale coverage by Russia’s Izvestia, known to be a pro-Kremlin news outlet (March 2017)
- The comprehensive independent report TikTok commissioned and released yesterday: “Exploring effective prevention education responses to dangerous online challenges” and TikTok’s blog post about the project
- An article in the Journal of Pediatric Psychology that showed not only that “exposure to peer social norms that favored risk taking predicted a significant increase in risk taking” but also that “communicating social norms against risk taking was effective to decrease risk taking … among school-age children.”
- An interview with Georgi Apostolov, “Here’s Who Stopped Bulgaria’s Blue Whale Hoax,” by Italian journalist, author and game designer Andrea Angiolini (you may need Google Translate to read it)
- About the Momo hoax that surfaced a year after Blue Whale went global (scroll to the bottom for my first post) and all my Blue Whale posts (also in reverse chronological order)