Photo/Illustration: Frances Haugen in Austin, Texas, on March 13 (Daisuke Igarashi)

AUSTIN, Texas--A former Facebook (now Meta) employee-turned-whistleblower shocked the world when she disclosed tens of thousands of internal documents she took from the company.

Frances Haugen exposed more than 20,000 pages of internal documents from the social networking giant.

The documents shed light on a wide range of problems, including Facebook’s algorithms for selecting and ranking the items that appear in users’ feeds, which drew heavy criticism for accelerating social divisions and causing adverse psychological effects on young people.

In a recent interview with The Asahi Shimbun, Haugen spoke about how people should deal with giant social media services. She emphasized that Facebook should fulfill its responsibility and suggested measures for more transparency.

Excerpts of the interview follow:

Question: You pointed out that after Facebook changed its algorithm in 2018, anger and false information spread more easily. What surprised me when I read the Facebook documents was that employees had been discussing this for years but could not make meaningful improvements.

Haugen: There is a document from November 2020 where Facebook says, we’re going to start valuing anger less. If you look at the comment thread on that document, it’s huge. People were like, “Oh, finally, we’ve only been saying this for 18 months.” And about six weeks later, they released the final weights, where they’re like, “OK, this is what we’re actually going to do for the recalibration,” and that change had gone away. If you look at the comments on that, everyone is like, “What happened? You told us six weeks ago it was going to happen, and now you’re not going to do it. What’s up?”

Q: In a congressional hearing last September, you said that Mark Zuckerberg, Facebook’s CEO, decided that they would not change the algorithm.

A: I think the issue is this question on responsibility. What I bring up often is the idea that it’s hard to admit you have power. Because in the process of admitting you have power, you also have to acknowledge that you have responsibility. It’s much easier to say our hands are tied, we’re the defenders of free speech, there’s nothing we can do, than to come in and say we actually have all these different levers that don’t involve content. They involve things like how easy should it be to reshare? Twitter makes you click on a link before you reshare. Should Facebook do that? What about waiting 30 seconds?

Because those have direct trade-offs with little slivers of profit. I think it’s a thing where Facebook doesn’t want us realizing that that’s the conversation we have to have, because the way the system is today is much more profitable. And not even that much more: we could probably reduce misinformation by 75 percent for 1 or 2 percent of profits. The price is not insane. It’s not a huge thing.

Q: One of the major issues the Facebook documents pointed out is the lack of measures to moderate content in developing countries. Could you elaborate?

A: For example, Ethiopia has 120 million people. They have six major language families and 95 dialects. It’s incredibly linguistically diverse. Facebook’s current strategy does not work there. Because censorship doesn’t scale. With censorship, you have to rebuild those safety systems in every individual language in the most fragile places in the world.

In places like Ethiopia, Libya and Syria, every single time, the 10 most popular pieces of content were horrible. It would be accusations that the opposition was mutilating children, it would be severed heads, it would be incredibly sensationalistic.

I think people who live in places that have a rich internet forget that. For the majority of languages in the world, I’d say 80 to 90 percent of the content available on the internet in that language is only available on Facebook. And we just forget this when we speak English or Japanese or whatever.

Q: Why did you decide to come out as a whistleblower?

A: I didn’t become convinced I needed to get information to the public until after they dissolved civic integrity. I have an MBA, I went to Harvard. And I literally took a class on change management. Like the study of how you have organizations change, it’s very well established. You have to have institutional support within your company, where you say, here’s a center for change. And when Facebook dissolved the civic integrity group, it was a demonstration that they didn’t want to have critical mass for change inside the company.

I felt that they had missed the lessons of the 2020 U.S. presidential election. The 2020 election happened, and they’re like, “Oh, there wasn’t blood in the streets. Therefore, we succeeded.” I felt like that was ignoring how big the risks were. And if you’ll notice, Jan. 6 happened right after that. So clearly, we were not out of the woods yet.

Q: You have been working in the tech industry for a long time. Why did you join Facebook?

A: I’d worked at three social networks before Facebook. I founded the search team at Google Plus, I worked at Yelp, where I helped build their machine learning team. And I was the lead product manager for ranking at Pinterest. I’m not trying to brag here. But there are not a lot of people in the industry who are both algorithmic specialists and also look at the human impacts of these systems. I would guess there’s something on the order of maybe 200 or 300 people in the whole industry who have the depth of experience with these recommender systems that I do, across this kind of development.

The question I had to answer was: what happens if my fears come true? Because I genuinely believe there are tens of millions of lives on the line in the next 20 years.

Q: We heard that one of the reasons you joined Facebook was your friend’s experience where he believed in conspiracy theories. Could you tell us about it?

Frances Haugen addresses the South by Southwest conference in Austin, Texas, on March 14. (Daisuke Igarashi)

A: Part of why I took the job at Facebook was that I think people who have not personally experienced someone getting radicalized don’t understand it. You can really trivialize it. You can say, “Well, smart people don’t have that happen to them, or educated people don’t have that happen to them.” Like that happens to other people.

My friend was a friend of my little brother, and that’s how I met him. He’s as much younger than my brother as my brother is younger than me. My mother always wanted a third kid, and so I always thought of this guy as my little brother. I can walk today, I’m healthy, because he saved me. Watching him fall into this darkness made me feel so powerless.

Because I’m a child of scientists. They’re academics. I believe that truth is the thing that we can see. If you interact with someone who is falling victim to the echo chamber of misinformation, you realize that humans are so susceptible to the social context of facts. If you are in an echo chamber where the same messages get reinforced, and the actual consensus reality is eroded, you lose the chance to even connect.

These systems are not neutral. As someone falls down the rabbit hole, it’s not like it’s pleasant for them either. If you believe your government is trying to poison you, if you believe teachers are trying to hurt kids, all these things, it’s not like it makes your day-to-day existence more pleasant. You watch someone fall away from our community, from our shared reality. If you feel like you can’t pull them back, it’s a real horror. I think there are lots of families that have experienced this, where you’ve had a relative go into the dark corners of the internet. I think we should treat the people who have had those experiences with more respect, and understand that the platforms are far more responsible for these experiences than the people who fall down these rabbit holes, because the products are designed to be addictive.

Q: What is the best way for society to treat the algorithms that have such strong power to decide what we see on the internet?

A: Right now, there’s no route for academics or governments to say, “Hey, you need to do the following ongoing reporting.” Think of banks: we have other systems that are really opaque and complicated, that drive our lives, that we treat differently than we currently treat social media companies. For example, medicine is very complicated and intimately impacts our lives. We have codes of ethics for doctors because of that. Because we know that they have more information than we do, they have a duty to take care of us. With lawyers, the legal system is complicated. There’s a duty of care there, where they have to look out for us, because we know these systems are complicated and very few people understand them.

Facebook is opaque. We each only get a tiny little peephole that is our own individual experience. We don’t get to see the aggregate tableau. Up until now, Facebook has been taking advantage of this, because no one else could do the research. No one else could see anything more than their own little peephole.

Right now, Facebook doesn’t have a most-viewed-content report for every country in the world. They only have it for the United States, and they only show you 20 pieces of content, which is crazy. I think the reason is that if they showed the top 5,000 pieces of content, we would be outraged. The reason they don’t show us is they don’t want us to see that. The fact that Facebook won’t disclose, even when asked, shows you how messed up the system is.

Basically, the framing is that the only question to discuss here is censorship: we are all attached to free speech, so the conversation goes no further. In reality, the entire time they had solutions that could keep us safe that didn’t involve censorship. They chose not to do them. They chose to not even let us talk about them.

We have to figure out a different feedback loop, where maybe Facebook has to disclose what they know. Maybe it’s a thing where there’s aggregate data that could be released. That would give them a different incentive to actually accept these changes, if they had to publish their misinformation numbers all the time or show their most viewed content.

Q: When you spoke at congressional hearings in the United States and Europe, you called for stronger regulations. What do you think is the most important point?

A: We need to write laws that require Facebook to disclose more information, because right now, we don’t have the public muscle of accountability. Let’s say we’re talking about an oil company. Every year, we graduate tens of thousands of environmental science majors worldwide. These are people who have been trained in, “Here are the ways we monitor companies to make sure they’re healthy.”

Right now, we don’t graduate anyone who has a depth of knowledge in the systems that we have, because you can’t take a college class on it. You can’t get a graduate degree on it. You have to go to these companies and work there to learn about it. That is profoundly dangerous. We need to invest in how we build that public muscle of accountability.

Q: There has been a huge knowledge gap between tech companies and the general public. How can we fill the gap?

A: When you were in high school, you probably took a chemistry class. The thing that’s cool about taking a chemistry class is it lets you simulate what it’s like to be a chemist. You blow up stuff, you breathe in stuff you shouldn’t breathe in. You kind of get a sense of what it’s like to be a chemist. But if you want to be a data scientist, or you want to be an algorithmic engineer, there’s no equivalent lab to begin to learn those meta skills and allow you to do those things.

I really want to build that lab bench for social media, where you have a simulated world, where we can come in and have a conversation about things like: when we reshare content, once it gets beyond friends of friends, what happens? If we had to copy and paste instead, what would happen? Who would win? Who would lose? Because on a lot of these things, the question isn’t just whether a number goes up or down. It’s that there are some winners and losers. How do we weigh all those trade-offs? Right now, Facebook is resolving those trade-offs in a really simple way. Did profits go up? Did profits go down? Did growth go up? Did growth go down?

If we had 19-year-olds arguing about some of these things, if we taught that class at a bunch of different levels, we could compute the numbers for people who aren’t very mathematical; they could just look at graphs. We could allow someone who wants to be a data scientist to actually try to do it themselves, so they can build those skills. We need to start thinking about what that method of education looks like. Because we’re entering a world where we have to think beyond censorship.

Q: In Japan, it’s been said that it’s really hard for whistleblowers, especially women, to come out to the public. Do you have any message for them?

A: I don’t know enough about gender issues in Japan to feel responsible for giving commentary. That’s all I can say. I feel grateful for how much support I’ve gotten. I think the advice I give to all whistleblowers is that you need at least one person you trust that you can talk to. That’s true for all whistleblowers. Because fundamentally, the art of being a whistleblower is first you have to believe what you’re seeing is real. And that’s hard, because usually those systems are aligned to tell you it’s not real. Because they can only continue to exist if their employees continue to go along with whatever the frame of reference is.

Q: I know you are optimistic about the future of social media. How can you be so optimistic?

A: I think why I’m optimistic is that we know how to do things in safer ways. We also can look to the past. We love social media; social media used to be about our family and friends. When you ask people what Facebook is about, people still say that, even though you have very little content from your family and friends today. So I think that’s interesting.

I’m optimistic because I see things like decentralized platforms. I see other companies coming along and saying, “Let’s do this, but do it the right way.” I believe that we just need to change the incentives on Facebook. Facebook already knows 20 things they could do that would make these platforms safer overnight. So I believe that we can hold them accountable. Once the incentives change, the behavior will change.

(Marie Louise Leone contributed to this article.)