Flare Up

On Tuesday morning, Mark Zuckerberg held a meeting with employees over video chat to address concerns related to the company’s decision not to take action on some recent posts by President Trump. As I reported on Thursday, the decision sparked an unusual amount of internal dissent among employees, and on Monday it spilled over into public with a virtual walkout of a few hundred employees.

Employees sent me a recording of the Tuesday meeting, and it offers a valuable window into Mark Zuckerberg’s decision-making process, his planned next steps, and the ongoing dissatisfaction of some employees. My colleague Adi Robertson listened in with me. She wrote:

While Zuckerberg said he should have offered more transparency to employees, he stood by what he called a “pretty thorough” evaluation of Trump’s posts, saying the choice to avoid labeling or removing them was difficult but correct.

According to a recording obtained by The Verge, Zuckerberg described being upset by Trump’s recent posts, one of which warned protesters that “when the looting starts, the shooting starts.” But “I knew that I needed to separate out my personal opinion … from what our policy is and the principles of the platform we’re running are — knowing that the decision that we made was going to lead to a lot of people being very upset inside the company and a lot of the media criticism we’re going to get,” said Zuckerberg.

On some level, the dispute between top executives and the walkout participants is intractable — either you think Trump’s post should have come down, or you don’t. But if you assume there will be many more decisions like this in the run-up to the US presidential election, as I do, it’s valuable to understand how Facebook justifies and communicates its policy decisions. Last week, Trump put the company in a bad position — and if events continue to unfold as they have for the past few days, he’ll likely soon put it in a worse one. Today’s meeting can help us understand how Facebook might react when that happens.

Here are my top takeaways from the roughly 85-minute recording.

  1. Facebook might adopt temporary speech restrictions for state actors in the United States if civil unrest escalates. “If we were entering a period where there may be a prolonged period of civil unrest, then that might suggest that we need different policies, even if just temporarily in the United States for some period, compared to where we were before,” Zuckerberg said. “And we have some precedents for what that might look like.” He said that the company had moved to remove COVID-19 misinformation because it represented a public health emergency, and that excessive policing of the use of state force could similarly be considered an emergency that would warrant policy changes.
  2. Zuckerberg acknowledged talking to Trump about his posts, but said the phone call happened only after he had decided to leave them up. Zuckerberg had been criticized for personally talking to the subject of a high-profile decision on the day he made it — a privilege granted to exceedingly few subjects of content moderation on Facebook, and a decision that disappointed some employees I know.
  3. Only one black employee was involved in the final Trump decision. That was Maxine Williams, Facebook’s global diversity officer. Zuckerberg added that there are additional black employees on the policy and integrity teams, but employees pressed him on why there weren’t more black voices included in the discussion.
  4. Facebook has a seven-point plan to address employee concerns over the decision about Trump posts. Zuckerberg said Facebook will re-examine policies about states threatening the use of force; re-examine policies that could let people use the threat of contracting COVID-19 to suppress the vote; consider adding new labels for content that doesn’t violate Facebook’s community standards but is objectionable in some way; communicate better internally about how policy decisions are made; include more diverse viewpoints on the policy team; solicit new initiatives to advance racial justice on Facebook from employees; and create a new product modeled on the COVID-19 information hub that helps people vote.
  5. Zuckerberg thinks that Facebook keeping the Trump posts up likely damaged public perception of the company. “Likely this decision has incurred a massive practical cost for the company to do what we think is the right step,” he said.
  6. Zuckerberg encouraged employees to see defending free speech as a noble cause. He noted that the original video of George Floyd’s murder by a police officer had been posted to Facebook, sparking global protests. Facebook employees should take some measure of pride in that, he said: “I would urge people not to look at the moral impact of what we do just through the lens of harm and mitigation. That’s clearly a huge part of what we have to do — I’m not downplaying that. … But it’s also about the upside, and the good, and giving people a voice who wouldn’t have previously been able to get into the news and talk about stuff. And having painful things be visible. I think that matters, too.”
  7. There is a red line Trump can’t cross, and Facebook already enforced it. Yesterday I wrote here that much employee frustration can be traced to concerns that there’s nothing Trump could do that would prompt Facebook to remove one of his posts. Zuckerberg noted for employees today that Facebook actually did remove Trump ads in March that misled users into thinking that a campaign survey was actually the US Census. That said, it’s generally much less controversial to remove an ad than a regular post — so far as I can tell, Trump never even commented about the ad situation.
  8. Zuckerberg is worried that free speech will only ever ratchet down, and that we’ll regret it someday. “Over time, in general we tend to add more policies to restrict things more and more,” he said. “If every time there’s something that’s controversial your instinct is, okay let’s restrict a lot, then you do end up restricting a lot of things that I think will be eventually good for everyone.”
  9. Employees I spoke with did not seem particularly moved by these answers. “Everyone’s grateful we have a chance to address these things directly with him,” one told me. “At the same time, no one thinks he gave a single real answer.” Another said Zuckerberg appeared “really scared” on the call. “I think he fears his employees turning on him,” the employee said. “At least that’s what I got from facial expressions and tone.”

At the same time, another employee told me that Zuckerberg’s decision was supported by the majority of the company, but that people who agreed with it were afraid to speak out for fear of appearing insensitive. (An employee who spoke on the call echoed this point.)

How you view the controversy over the Trump posts ultimately depends, I think, on how you think the next year is going to play out. Some are predicting a return to equilibrium; others are expecting an ongoing escalation of violent rhetoric. Another way of saying this is that there are optimists and pessimists.

The optimists argue that social media is neither good nor bad, but simply a powerful new tool for society to use. Its misuses get more attention than its positive uses, but those positive uses — such as drawing attention to the murder of an innocent man by police — ultimately outweigh the negative ones. This year might be a particularly fraught moment, the optimists say, but eventually (and maybe even soon!), things will start getting back to normal and we’ll be glad we preserved our free speech traditions so that movements like Black Lives Matter can continue to use these tools for positive ends. Zuckerberg, in this framing, is an optimist.

The pessimists argue that social networks are getting played by bad actors who have gamed them to reinforce and worsen existing power imbalances. The president, for example, has used these platforms to call on police and the military to “dominate” peaceful protesters and crack down on constitutionally protected gatherings — using his own free speech, coupled with the algorithmic promotion that social platforms provide, to suppress the free speech of others. The pessimist knows that you can never truly de-platform the president of the United States, but she would rather not help a wealthy and powerful corporation build the megaphone. Especially given that she knows worse posts are to come, and that the collective damage is likely only to increase.

The optimist believes that the long arc of history bends toward justice. The pessimist turns on the TV and sees fascist boots on the ground already, tear-gassing their fellow Americans. The optimist may have the time and privilege to wait out any civil unrest. The pessimist may not.