Sophie Zhang worked as a Facebook data scientist for nearly three years before she was fired in the fall of 2020. On her final day, she posted a 6,600-word memo to the company's internal forum; such farewell notes, though rarely that long, are common practice for departing employees. In the memo, first published by BuzzFeed News, she outlined evidence that governments in countries like Azerbaijan and Honduras were using fake accounts to influence the public. Elsewhere, such as in India and Ecuador, Zhang found coordinated activity intended to manipulate public opinion, although it wasn't clear who was behind it. Facebook, she said, didn't take her findings seriously.
Zhang's experience led her to a stark conclusion: "I have blood on my hands."
Facebook has not disputed the facts of Zhang's story but has sought to diminish the importance of her findings.
"We fundamentally disagree with Ms. Zhang's characterization of our priorities and efforts to root out abuse on our platform," Facebook said in a statement. "As part of our crackdown against this kind of abuse, we have specialized teams focused on this work and have already taken down more than 150 networks of coordinated inauthentic behavior. Around half of them were domestic networks that operated in Latin America, the Middle East, North Africa, and in the Asia Pacific region."
This interview has been edited for length and clarity.
Q: Why were you fired from Facebook?
Zhang: I've made the news for much of the work I have done protecting elections. This might sound very important to the average person, but at Facebook I was a very low-level employee. In addition, this work was not my official job. I was conducting it entirely in my spare time, with the knowledge and acquiescence of leadership, of course. At first, the company was supportive of this. But gradually they lost patience with me; in their eyes, I was underperforming at my official job.
Q: In your memo, you wrote that you have blood on your hands. Why did you say that?
Zhang: Whether something was acted on was, as far as I could tell, entirely a function of how much I yelled, how much I made noise.
I know that many of the decisions they have made have had an impact on the countries involved. The U.S. is still deeply affected by what happened in 2016 with Russian manipulation on Facebook. For many countries like Honduras or Azerbaijan, this is their own Russia. But the manipulation is done not by a foreign power but by their own government, which doesn't even bother to hide it.
I tried my best to make decisions based on the information I had at the time. But of course I am just one person. Sometimes I waited on something longer than I should have. At this level of responsibility, your best is often not enough.
Q: How did you get into the work you did?
Zhang: When I joined the company I was, like many people, deeply affected by Russia 2016. And I decided to start looking for overlap between inauthentic activity and political targets. And I started finding many results in many places, particularly what we call the global South, in Honduras, Brazil, India.
Honduras got my attention because it had a very large amount (of inauthentic behavior) compared to the others. This was very unsophisticated activity we are talking about. Literal bots. And then I realized that this was essentially a troll farm being run quite openly by an employee of the president of Honduras. And that seemed extraordinarily awful.
Q: Then what did you do?
Zhang: I talked about it internally. Essentially everyone agreed that it was bad. No one wants to be defending this sort of activity, but people couldn't agree on whose job it was to deal with it.
I was trying desperately to find anyone who cared. I talked with my manager and their manager. I talked to the threat intelligence team. I talked with many integrity teams. It took almost a year for anything to happen.
Q: You've said there is a priority list of countries. What happens to countries that aren't on that list?
Zhang: It's not a hard and fast rule. Facebook does takedowns in small countries, too. But most of these takedowns are reactive, by which I mean they come from outside groups — tips from opposition groups, tips from NGOs, reporter investigations, reports from the CIA, etc. What happened in this case was that no one outside the company was complaining.
Q: Given the resources Facebook has, why can't it prioritize every country?
Zhang: The answer I saw when I was at Facebook, whenever these questions were asked, was that even though Facebook has a ton of money, human resources are different. Even if you have infinite money, you can't grow a team tenfold overnight. It takes time to train people. It takes time to grow.
And I was willing to believe that for a while when I was there. But in retrospect, I think that if they genuinely believed it was important, they would be taking steps that they aren't. They would be focusing heavily on retaining talent in the integrity teams. And they would certainly never have fired me.
Q: How do people still at Facebook try to change this?
Zhang: Like most employees, they're just average people who want to work their 9-to-6, go home at the end of the day, and sleep.
There's also a self-selection bias. If you think that Facebook is evil, you aren't likely to join Facebook.
But there are many people also who joined Facebook because they wanted to make it better. I was very upfront with them when I joined. I don't think Facebook is making the world a better place. And I told them I wanted to fix it.
Q: Is there a concern among employees about the company's image?
Zhang: I think employees have gotten more pessimistic over time. But there's also a very strong insularity and perhaps paranoia toward the mainstream press. People are skeptical of what the press says about the company.
I don't want to diminish the fact that Facebook has historically been very open. We had regular access to the CEO. As a very low-level employee, I was able to be involved in our discussions with a company vice president. But that has been changing over time because of fear and worry about employee leaks.
Q: Who is doing the work you did now?
Zhang: I don't know. I was the only person who was going out on my own to look for this behavior rather than waiting for people to tell us that something was going on. The reason I found so many things so easily was because there was so much low-hanging fruit.
Q: Facebook says it's taking down many inauthentic accounts and has sought to dismiss your story.
Zhang: So this is a very typical Facebook response, by which I mean that they are not actually answering the question. Suppose your spouse asks you, "Did you clean up the dishes yesterday?" And you respond by saying, "I always prioritize cleaning the dishes. I make sure to clean the dishes. I do not want there to be dirty dishes." It's an answer that may make sense, but it does not actually answer the question.