It could help increase the visibility of the very content people want deleted. Here, in a guest post for NetFamilyNews, is an account by Maureen Kochan, our director of community at ConnectSafely.org, of how that happens:
By Maureen Kochan
Most Facebook users have come across questionable content on the site at some point. Chances are they reported it and moved on. But sometimes a page or group is so offensive that an organized campaign springs up to get it taken down. When it comes to reporting bad content on Facebook, though, more reports might not be better.
Take one example that came to our attention recently. A user contacted us about a Facebook page that she and many others wanted removed. In her message, she acknowledged that the campaign was probably making the page all the more visible: a lot of people were visiting it to view and discuss the content, and to report it.
Facebook ultimately took the page down, though by then it had 227,000 “likes.” Of course no one can know how much the “anti” campaign increased the page’s reach, but the saying “any press is good press” came to mind as I followed what happened with the offending page. There is no question that public outrage over a piece of content in social media only increases the attention it gets.
A similar situation came up several days later. An acquaintance sent us an online petition aimed at getting Facebook to remove a pro-dog-fighting group. By the time the petition, which contained a disturbing picture of an injured dog, got to me, Facebook had already removed the group. But the name of the group – and the disturbing image – lived on through the petition, which wasn’t hosted on Facebook. By the time I saw it, the petition had been signed 134,000 times with 334,000 shares. (The petition closed several days later with 259,000 signatures and 589,000 shares.)
So the people behind this group and page got even more mileage from the reactions to their exploits – in the case of the pro-dog-fighting group, long after their stuff was removed from Facebook. And it’s worth noting that, while Facebook removed the content in both cases, that doesn’t always happen. A Facebook page or group can be offensive without violating the site’s terms of use (e.g., when Facebook considers the content free speech). So piling on reports may not get the page or group removed and instead may help it reach a wider audience.
Certainly using social media to stage a public protest against something in social media seems logical, but we might consider whether the outcome will differ from how protest worked in the past, when demonstrations typically targeted something that couldn’t be instantly clicked to or conveniently viewed on the very same page. Being able to see what’s being protested can be informative, but it also means the viewer is giving attention to – essentially giving power to – the offending content. That may have been partly true in the past, but not to the same degree as now.
And then there’s the view of people who grew up with social media. Here’s what an Australian high school student recently told ConnectSafely co-director Anne Collier when asked whether pages depicting violence, hate, misogyny, etc. should be taken down: “Nobody’s forcing you to look at or like that page. You can block it too, so that you never have to look at it if you don’t want to.” Another student told her, “Free speech is important,” adding that it’s better to let people “express their displeasure” with and on an offensive page than to require a service to delete it. A third said that, if a page promotes violence, it should be taken down, but cautioned adults to remember that this isn’t just a social-media thing: “People say awful things to each other in person, and [online] is just another place where that happens.” Helpful perspective from the people who know social media well.