Facebook said that a human rights report it commissioned found that the company hasn’t been successful enough in preventing hate speech used to fuel violence in Myanmar.
The report, conducted by San Francisco-based nonprofit Business for Social Responsibility (BSR), advised Facebook to implement stricter content policies. It also recommended that Facebook increase engagement with both Myanmar officials and civil society groups.
“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” the company stated in a blog post.
Earlier in 2018, Facebook established a dedicated team across product, engineering, and policy to work on issues specific to Myanmar. The company plans to grow its team of native Myanmar language speakers reviewing content to at least 100 by the end of the year.
Further measures to address the issue include an update to Facebook’s credible violence policy, allowing moderators to remove misinformation that has the potential to contribute to imminent violence or physical harm. More aggressive action against dubious account networks is also being taken.
The report comes in response to an ongoing crisis in which Myanmar’s minorities, particularly the Muslim Rohingya, have faced violence in the country’s Rakhine state. Facebook removed several Myanmar military officials from the platform in August after a crackdown on Rohingya insurgents.
According to Facebook, it now has 99 Myanmar language specialists reviewing potentially questionable content. The platform has also expanded its use of automated tools to reduce the distribution of violent and dehumanising posts while they undergo review.
(Source: CNET)