Meta in Myanmar, Part II: The Crisis

previous: Part I: The Setup
next: Part III: The Inside View

Notes

“2015 was a great year for Facebook,” Mark Zuckerberg announces.

Meta has responded by getting local Burmese groups to help it translate its rules and reporting flow, but there is no one to deal with the reports. For years, Meta employs a total of one Burmese-speaking moderator for this country of 50M+ people, a number it increases to four by the end of 2015.

a brief detour through economics, explaining that internet access == no more zero-sum resources == global prosperity and happiness

Digital Sublime

What his new “Next Billion” initiative to “connect the world” will do is build and reinforce monopolistic structures that give underdeveloped countries not real “internet access” but…mostly just Facebook, stripped down and zero-rated so that using it doesn’t rack up data charges.

no one at Meta was responsible for assessing cultural and political dynamics as new communities came online, or even for tracking whether the company had linguistically and culturally competent moderators to support each new country.

for a ton of people across Myanmar, getting even a barebones internet was life-changingly great.

in the app I can see what the prices are in the big towns, so I don’t get cheated

people who document incidents of communal and state violence for organizations like Médecins Sans Frontières and the UN Human Rights Council use precise, economical language. Spend enough time with their meticulous tables and figures and the precision itself begins to feel like rage.

My honest advice is that you don’t read it.

The overwhelming volume and velocity of this hate campaign would not have been possible without Meta.

Meta bought and maneuvered its way into the center of Myanmar’s online life, then inhabited that position with a recklessness that was impervious to warnings from western technologists, journalists, and people at every level of Burmese society.

After the 2012 violence, Meta mounted a content moderation response so inadequate that it would be laughable if it hadn’t been deadly.

With its recommendation algorithms and financial incentive programs, Meta devastated Myanmar’s new and fragile online information sphere and turned thousands of carefully laid sparks into flamethrowers.

Meta allowed an enormous and highly influential covert influence operation to thrive on Burmese-language Facebook throughout this period.

increasingly intense frustration—bleeding into desperation—among the people who tried, over and over, to get individual pieces of dehumanizing propaganda, graphic disinformation, and calls to violence removed from Facebook by reporting them to Facebook.

They report posts and never hear anything. They report posts that clearly call for violence and eventually hear back that the posts don’t go against Facebook’s Community Standards.

even the United Nations’ own Mission, acting in an official capacity, can’t get Facebook to remove posts explicitly calling for the murder of a human rights defender.

The UN Mission team investigating the attacks on the Rohingya knows Michael. They get involved, reporting the post containing the photo of Michael’s passport to Facebook four times. Each time, they get the same response: the post has been reviewed and “doesn’t go against one of [Facebook’s] specific Community Standards.”

There were, and are, ways for Meta to change its inner machinery to reduce or eliminate the harms it does. But in 2016, the company actually does something that makes the situation much worse. In addition to continuing to algorithmically accelerate extremist messages, Meta introduces a new program that takes a wrecking ball to Myanmar’s online media landscape: Instant Articles.

Instant Articles was kind of a bust for actual media organizations, but in many places, including in Myanmar, it became a way for clickfarms to make a lot of money—ten times the average Burmese salary—by producing and propagating super-sensationalist fake news.

The fact that the comments with the most reactions got priority in terms of what you saw first was big: if someone posted something hate-filled or inflammatory, it would be promoted the most, and people saw the vilest content the most. I remember the angry reactions seemed to get the highest engagement. Nobody who was promoting peace or calm was getting seen in the news feed at all.