Hateful content surfaces on Facebook despite moderators' efforts
by Kevin Tran on Jan 2, 2018, 10:20 AM
Decisions made by Facebook’s human content moderators to weed out hate speech are often inconsistent, causing some offensive content to wrongly remain on the platform, according to investigative journalism group ProPublica. The group found that Facebook’s content reviewers often make different calls on similarly offensive content, and that reviewers don’t always abide by the company’s own rules on hate speech. ProPublica analyzed 900 posts as part of the investigation and presented Facebook with 49 posts that users had flagged as offensive but that remained online. Facebook admitted its content reviewers made a mistake on 22 of the posts, but defended its decisions on 19 others.

Wrongly hosting hate speech on Facebook is problematic for two key reasons:

- It hurts Facebook’s broader goal of fostering communities on the platform. Facebook aims to have 1 billion users join “meaningful groups” within five years, up from 100 million in July 2017, a goal that is part of Facebook’s mission to “bring the world closer together.” However, negative and hateful posts can make it more difficult to connect with others on the platform, potentially discouraging users from joining meaningful groups.
- It can lead to hate speech-related fines from countries outside the US. Germany passed a law in June 2017 that requires social media companies to remove hateful content in a timely manner, fining companies that fail to do so up to $59 million. Additionally, UK Prime Minister Theresa May has suggested implementing similar fines for companies that fail to remove hateful content in the UK.
However, Facebook places a high priority on minimizing the amount of hateful content on the platform. Facebook deletes 66,000 hate speech posts each week, according to Facebook VP Justin Osofsky. Additionally, the company plans to double its safety and security team, which includes content reviewers and other employees, to 20,000 people in 2018, per ProPublica. More content reviewers give flagged posts a better chance of getting a second opinion, helping to reduce the amount of hateful content on Facebook. Meanwhile, Facebook isn’t the only major social platform where offensive content slips through filters. On Twitter, users affiliated with the alt-right movement used the platform to stoke racial tensions and organize a neo-Nazi rally in Charlottesville, Virginia, in August 2017, according to Recode. Policing hateful content remains a high-priority issue for multiple platforms.

The Digital Trust Report, a report from BI Intelligence, examines consumers’ perceptions of major social platforms. It rates Facebook, YouTube, Instagram, Twitter, Snapchat, and LinkedIn on security, community, user experience, and content authenticity and shareability. These insights help brands and marketers make informed decisions about where to spend their marketing and branding dollars. All of the information in this survey comes from our proprietary BI Insiders panel, made up of more than 15,000 specially selected and recruited Business Insider readers. The panel is designed to be a leading indicator of what’s next in digital: the panelists are business and tech savvy, they have buying power, and they’re highly engaged. The survey revealed some fascinating insights into how millennials and decision makers view today’s most popular social media platforms. Here are some key takeaways from the report:

- Digital trust has been shaken by a proliferation of malicious content and data breaches, which has significant consequences for brands that use these platforms.
- The top platform won by a huge margin on most attributes. Content on this platform is more likely to be viewed as forthright and honest, which increases the persuasiveness of ads and marketing messages that appear alongside it. This also creates ideal conditions for thought leadership and branded and sponsored content to flourish.
- The second-ranked platform was bolstered by users’ confidence in sharing the content they find there: users were most apt to share that content, which, together with the platform’s massive audience and high engagement, makes it the right place to maximize reach.
- The social platform that finished dead last did so because of its abusive comments section and extremely annoying ads. Still, this hasn’t dissuaded people from visiting, as evidenced by the time users spend on it each month and its massive user base. This platform also resonates more with older generations.
The Digital Trust Report is only available with a subscription to BI Intelligence, Business Insider's premium research service. To access this report, plus hundreds of other reports on the future of digital, click here.