On the social Web, where content is not just communication but behavior as well, safety is a shared experience. A single user can't guarantee it, no matter how well optimized their privacy settings and practices are, nor can a site – not when people can tag, copy, forward, and instantly mass-distribute photos of and info about each other. That's why education about civility, citizenship, and safety practices is essential. It's also why Facebook's new, more social approach to safety intervention is simply practical. Since Facebook can't fix problems arising from offline relationships, it's sending a vital educational message instead: safety is about humanity, not technology.
So, as announced at the White House bullying-prevention summit yesterday, Facebook is rolling out an innovative approach to its abuse-reporting system, what it calls “social reporting.” Here’s how it’ll work (eventually worldwide): If people want to report a photo, for example, and click on “Report” under it, they’ll get a pop-up window asking if the photo is about them. If it is, they can choose “I don’t like this photo” or “This photo is harassing or bullying me.” If they pick the latter, they’ll then have the option to block the person who posted the nasty photo or (the new part) “Get help from a trusted friend.” If they choose that second option, Facebook lets them forward the photo (with a message they can write) to someone they think should know about it or help them deal with it. They’ll be able to send the message and photo either via email to someone outside of Facebook or via in-site messaging to a fellow FB member. This will work in Profiles, Groups, Pages, and Events, reports PC Magazine, and “Facebook will also add the option to notify a trusted source – like a parent or teacher – of bad behavior.” You’ll find screenshots illustrating all this at Facebook.
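The flow described above is essentially a short decision tree. A minimal sketch of that branching logic, in Python – all names, fields, and return strings here are my own illustrative assumptions, not Facebook's actual implementation or API:

```python
# Hypothetical model of the "social reporting" decision flow described above.
# Every identifier and outcome string is an assumption for illustration only.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Report:
    content_id: str
    is_about_reporter: bool          # "Is this photo about you?"
    reason: str                      # "dislike" or "harassment"
    action: Optional[str] = None     # "block" or "get_help" (the new option)
    trusted_contact: Optional[str] = None  # email address or fellow member

def route_report(report: Report) -> str:
    """Return which handling path a report takes in this sketch of the flow."""
    if not report.is_about_reporter:
        return "standard abuse-report queue"
    if report.reason == "dislike":
        return "'I don't like this photo' flow"
    # Harassment/bullying aimed at the reporter: two options in the post.
    if report.action == "block":
        return f"block the poster of {report.content_id}"
    if report.action == "get_help":
        # The new "social" path: forward the content plus a message to a
        # trusted friend, via email off-site or in-site messaging.
        return (f"forward {report.content_id} to trusted contact "
                f"{report.trusted_contact}")
    return "prompt the reporter to choose block or get help"
```

For example, `route_report(Report("photo42", True, "harassment", "get_help", "parent@example.com"))` would take the new trusted-friend path, while the same report with `action="block"` stays on the old self-help path.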
If this seems like a fairly simple addition to an abuse-reporting system, it is. It's a first baby step that does two things: 1) it sends a message that needs to sink in – that blocking someone or deleting an account on a Web site rarely resolves relational problems in the "real world" – and 2) it's an indicator of where we're all headed on and with the social Web. Author, educator, and pundit Clay Shirky summed it up in a keynote this week with a simple quote from Microsoft Research sociologist Marc Smith: "More people pooling resources in new ways" (I would add "for social good" or "for everybody's benefit"). Smith said that's the seven-word summary of the history of civilization; Shirky said it also sums up civilization's future. I think it's true for the future of online wellbeing too. In his talk, Shirky offered lots of illustrations (if you tune in for nothing else, hear his opening description of how women organized to protect themselves from threats of violence from the "morality police" in Mangalore, India), so I urge you to listen. This is the online piece of how we're going to reduce bullying (the White House summit this week covered a whole lot of offline pieces).
Facebook's social abuse-reporting is a first step toward designing online communities of all sorts so that safety isn't just a negotiation among users and between users and the site, but is in everybody's interest – so that safety is simply part of our social capital.
Related links
- “How the Net industry can help get us all to Online Safety 3.0”
- On Shirky’s illustration of how eBay improved things vastly by implementing its reputation system at CMSWIRE.com