(Washington Insider Magazine) - In recent weeks, Facebook has been under fire after a whistleblower, Frances Haugen, released documents demonstrating that Facebook was aware of and allowed misinformation on the platform that interfered with the election in the United States, as well as the political violence that culminated in the Capitol riots on January 6th. In the developing world, the ramifications of this misinformation have been even more severe.
Reports from the whistleblower show that militias in Ethiopia, a country in the midst of a civil war, used Facebook to spread misinformation and incite violence. The Fano, an ethnic Amhara group, has used the platform to “seed calls for violence”, recruit, and fundraise. The group is embroiled in conflict with the Tigray ethnic group and is associated with multiple human rights violations against the Tigray people.
These posts were eventually flagged by Facebook and removed, but critics argue the material was taken down too late. Facebook’s content algorithm is designed to drive engagement, and as a result, divisive topics often get lifted to the top of users’ newsfeeds. While Facebook has protocols for removing problematic content, its policy on potentially harmful material has generally been a “break the glass” approach: removing violent content only after unrest erupts.
Another factor that exacerbates the issue, according to testimony given to the UK Parliament, is that Facebook does not moderate non-English languages proficiently. Currently, Facebook supports approximately 50 languages, but the whistleblower believes these languages are not moderated as robustly as American English. Beyond that, smaller languages and dialects likely go without any scrutiny at all.
In February of 2021, Myanmar underwent a military coup. For many Burmese citizens, Facebook was the primary source of information about the unfolding events, and it is thought that Facebook content amplified the dissent and violence in the Southeast Asian country. The UN has determined that hate speech on Facebook played a significant role in fueling violence in Myanmar, and Facebook has admitted that it failed to stifle violence-inducing content in the country.
Governments across the world, led by the European Union, have sought stricter regulations on Silicon Valley tech giants such as Facebook. While the EU has so far not imposed these regulations, Haugen’s testimony to Congress is expected to prompt additional inquiry. According to Haugen, “regulation could actually be good for Facebook’s long-term success”, as making Facebook a less divisive place would boost users’ enjoyment of it.
Meanwhile, US policymakers such as Senator Klobuchar are calling for increased regulation in the social media space. While US policymakers have been relatively weak on Big Tech compared with their European counterparts, Frances Haugen’s testimony is expected to reignite the debate.
Facebook has responded to the testimony by saying that the “Facebook Papers” represent only a subset of the documents covering the work Facebook has done to moderate the platform. However, the company has yet to share the remaining documents. Amidst the negative press, Facebook is reportedly exploring a rebrand.
Ultimately, Facebook and other platforms have amplified misinformation across the world, from the United States to Myanmar. Regulators and the public at large will be keen to identify solutions that dull Facebook’s role in sowing violence and dissent across society.
