Can Facebook Weaponize Information and Incite Violence?
by Yash Saboo | April 13, 2018

While the recent Cambridge Analytica data privacy scandal is the main focus for American lawmakers questioning Facebook's Mark Zuckerberg, the company's record beyond the U.S. raises even more alarms.
U.N. investigators have accused Facebook of playing a "determining role" in facilitating violence against the Rohingya, a Muslim minority ethnic group, by allowing anti-Muslim hate speech and false news to spread on its platform. The military-led campaign of rape, arson and murder has driven nearly 700,000 Rohingya out of the country and killed at least 6,700 people in its first month alone. How exactly, and to what extent, the social media giant contributed to that campaign remains impossible to quantify, given the absence of available data.
Earlier this week, Zuckerberg told Vox that Facebook’s systems had detected a pair of chain letters spreading around Myanmar on Facebook Messenger last year. One warned of an imminent attack by Muslims on 11 September. “That’s the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm,” Zuckerberg said. “Now, in that case, our systems detect that that’s going on. We stop those messages from going through.”
Source: The Washington Post
However, civil society groups in Myanmar, which have worked with Facebook to flag dangerous content, revealed that it took more than four days for the company to respond after the messages started circulating during the Rohingya crisis.
During the Senate hearing, Vermont Senator Patrick Leahy brought up the company's role in the ongoing ethnic violence in Myanmar, citing an incident in which death threats against a Muslim journalist were found not to violate the platform's rules. In Myanmar, journalists are regularly arrested and even killed for reporting on the government's activities.
“Six months ago I asked your general counsel about Facebook’s role as a breeding ground for hate speech against Rohingya refugees,” Leahy said. “Recently, U.N. investigators blamed Facebook for playing a role in inciting the possible genocide in Myanmar, and there has been genocide there.”
Using screenshots mounted on a poster, the Senator cited a specific threat calling for the death of Muslim journalists in the country. That threat, he said, went straight through Facebook's detection systems, spread quickly, and was removed only after repeated attempts and the involvement of civil society groups. "Why couldn't it be removed within 24 hours?" Leahy asked.
Source: Polygon
“Hate speech is very language specific. It’s hard to do it without people who speak the local language and we need to ramp up our effort there dramatically,” Zuckerberg said.
He described the company's plan to hire "dozens" of Burmese-language content reviewers as the first part of a three-pronged approach in Myanmar. The second is a partnership with civil society groups to identify hate figures in the country, rather than focusing on removing individual pieces of content.
Third, Zuckerberg stated that Facebook is “standing up a product team to do specific product changes in Myanmar” and other countries with similar situations, though he did not delve into the specifics of those changes.
He also apologized in a personal message to the civil society groups after they published an open letter accusing him of misrepresenting Facebook's success in combating hate speech in Myanmar. A spokeswoman for the groups later told BuzzFeed News that his response was "grossly insufficient."