Guests
- Zeynep Tufekci, associate professor of information and library science at the University of North Carolina at Chapel Hill. She is also a faculty associate at the Harvard Berkman Klein Center for Internet & Society. Her book is titled Twitter and Tear Gas: The Power and Fragility of Networked Protest. Her recent piece for The New York Times is headlined “We Already Know How to Protect Ourselves from Facebook.”
In Burma, seven soldiers have been sentenced to 10 years in prison for participating in the massacre of Rohingya Muslims in the village of Inn Din in western Rakhine State. The bodies of 10 Rohingya men were discovered in a mass grave there last September. The victims are among thousands of Rohingya who have been killed by the Burmese military’s ethnic cleansing campaign against the minority Muslim group. For years, activists have demanded Facebook regulate hate speech against Rohingya on its platform, saying this speech has contributed to the rise in violence against the persecuted community. For more, we speak with Zeynep Tufekci, associate professor of information and library science at the University of North Carolina at Chapel Hill. She is also a faculty associate at the Harvard Berkman Klein Center for Internet & Society. Her book is titled “Twitter and Tear Gas: The Power and Fragility of Networked Protest.”
Transcript
AMY GOODMAN: Facebook has been accused by U.N. investigators and human rights groups of facilitating violence against the minority Rohingya Muslims in Burma by allowing anti-Muslim hate speech and false news to spread on its platform. Vermont Democratic Senator Patrick Leahy questioned Facebook’s Zuckerberg about this at the hearing yesterday.
MARK ZUCKERBERG: Senator, what’s happening in Myanmar is a terrible tragedy. And we need to do more.
SEN. PATRICK LEAHY: And we all agree with that.
MARK ZUCKERBERG: OK.
SEN. PATRICK LEAHY: But U.N. investigators have blamed you, blamed Facebook, for playing a role in the genocide. We all agree it’s terrible. How can you dedicate, and will you dedicate, the resources to make sure such hate speech is taken down within 24 hours?
MARK ZUCKERBERG: Yes, we’re working on this. And there are three specific things that we’re doing. One is we’re hiring dozens of more Burmese-language content reviewers, because hate speech is very language-specific. It’s hard to do it without people who speak the local language. And we need to ramp up our effort there dramatically. Second is, we’re working with civil society in Myanmar to identify specific hate figures, so we can take down their accounts, rather than specific pieces of content. And third is, we’re standing up a product team to do specific product changes in Myanmar, and other countries that may have similar issues in the future, to prevent this from happening.
AMY GOODMAN: So that’s Mark Zuckerberg. Zeynep Tufekci, if you can respond?
ZEYNEP TUFEKCI: So, I think there are some things that are genuinely hard and complex about this transition to the digital economy, and we all need to figure it out, including Facebook. So, there are some things I have sympathy for.
The situation in Myanmar and what Facebook allowed is inexcusable, because I know that—I personally know that, at least since 2013, civil society groups have been literally begging Facebook to step up. What happened is, as the country transitioned from this military junta to a more democratic, more open situation, Facebook came in, along with the rapid spread of digital SIM cards and phones, but without the proper oversight. Like imagine—like consider all the problems that this kind of new public sphere is causing in Europe and U.S., where we have a lot of countervailing institutions.
So, in Myanmar, there is an extremist Buddhist group that is very anti-Muslim and is promoting ethnic cleansing. This is not a joke. It’s the second-biggest refugee outflow in the world. And what they started doing is they started using Facebook to spread their, literally, blood libel, you know, all these false accusations. There was an interview with a Burmese person about the Muslim minority. And he was like, “They’re horrible. They’re doing all these horrible things.” And they asked him, “How do you know?” He said, “I know it, thanks to Facebook.” Enormous amount of hate speech went viral, without Facebook putting in the kind of things that Zuckerberg is finally saying they’re going to put in.
This is a company with half a trillion market capitalization. This is a company where people have been asking, begging, crying, pleading with, you know, privately, publicly, for years to hire an enormous amount of people and try to get a handle on the situation. It’s not that they’re causing the whole problem by themselves, of course. But they are instrumental. They were the Radio Rwanda. They were the, you know, Triumph of the Will. They were the—that’s the kind of role they played in this country.
And it was only recently, and only in the wake of the scandal, I’m finally hearing the CEO address this. And I think, when we look back on it, you know, some of the stuff is—again, really, the digital transition is hard, and we’re all in over our heads. I think there’s no excuse for their ignorance, their negligence, and the fact that dozens of people, for a country, you know, of—where ethnic cleansing is going on, after so many years, it’s such a minimal step. They should have had as large a team as necessary—they have the money—as early as possible. It’s the kind of thing, like if you wake up as the CEO of this company, it should be the first question you ask everyone, and say, “I’m going to spend as much time and as much energy on this,” because this is the kind of thing that’s going to go down in history as causing enormous human misery. It’s destabilizing the whole region with refugee outflows, people stuck in Bangladesh, people stuck elsewhere.
So, I think it’s an example—it’s, for me, the most egregious example of what Facebook’s lack of internal drive for proper oversight and what our society’s lack of attention to intervene and say, “Look, Burma, Myanmar, might be another country, but it’s our company that’s implicated here.” And, you know, U.S. and Europe could have done more, too. And it’s the saddest situation, because now that it’s so out of hand, there are a couple dozen moderators, new products finally happening, will probably have a marginal effect, whereas maybe earlier, with enough civil society and global attention, plus Facebook doing what it needed to do, maybe we wouldn’t be here. And this is millions of people who are displaced. There are thousands of people who were killed. So, I think it’s just the worst example, and it’s just shameful. I don’t know what else to say.
AMY GOODMAN: Professor Tufekci, you write in Twitter and Tear Gas about Facebook censorship in Turkey, in your country. You write about a Facebook page of the mayor of the biggest majority-Kurdish city in the region banned, even though almost 400,000 people had liked his page before it was taken down. This issue of Turkey was not raised in yesterday’s hearing. We’ll see what happens in the House hearing today.
ZEYNEP TUFEKCI: So, this is an interesting thing. I also write—like I want to sort of present the complexity. Around the world, there are a lot of countries in which Facebook is also an anti-censorship tool. Right? So this is why it’s so crucial that it happens better, because there are a lot of countries where TV is censored, where radio is censored, where newspapers are censored. And social media is one way in which—we see this in the United States: Black Lives Matter, Women’s March. A lot of movements that have difficulty first getting either respect or traction from traditional media or mainstream media can use social media to circumvent the censorship.
But what we usually see, again, as the problem, is that they, Facebook, the company, just doesn’t have the kind of staffing you need to understand which page is correct, which page is not. And what happens is, political opponents report one another’s pages, and then it gets taken down. And, you know, Facebook is just this giant company. They’re in Menlo Park. They’re in Dublin. And you’ve got people, say, in Egypt, just begging them, “Can you do this?” And it’s in a queue somewhere, and maybe somebody will get to it. They are just not equipped to be the moderators of speech for the whole world, as things stand.
But again, the complexity is, social media and digital tools have also allowed enormous amount of anti-censorship around the world. So you don’t want a situation in which Facebook is just shutting everything down as it wants. What you want is sensible rules, where a situation like, you know, something promoting ethnic cleansing is actually shut down, whereas political speech is not. And that’s not an easy thing. And that is not something Facebook has done well.
And, you know, all these hearings, people are asking Mark Zuckerberg questions. And I’ve been saying, “Look, these are political questions. I don’t want to hear what Mark Zuckerberg has to say.” I want the political discussion on this. I want legislation. And I want us to tell Mark Zuckerberg and all of Silicon Valley what to do. I mean, who died and made them king of the world in the political sphere? Right? We need to sort of have this conversation, decide how we’re going to regulate this, what kind of oversight is necessary, and tell these companies how to—what rules they have to follow. And until now, it’s just been the Wild West.
AMY GOODMAN: Well, that’s extremely interesting, because so many of the senators were saying, “Do you want to be regulated? Will you support regulation?” I mean, regulation of his own company. Now, I have a question about an issue that has been discussed a bit, and that is—I mean, we have highways of the United States. It’s the infrastructure of our country, how we get around.
ZEYNEP TUFEKCI: Absolutely.
AMY GOODMAN: For the most part, they are not owned by private corporations. Do you think Facebook, which is the information highway, in a sense, all over the world, should be nationalized?
ZEYNEP TUFEKCI: Well, the threat with nationalization is that it’s such an attractive tool for governments to do their own propaganda and their own surveillance, right? So, if a government owned something like Facebook, can you imagine the kind of surveillance that it would allow and the kind of manipulation it would allow? So what you want is a method and rules that make it so that it’s not a tool of surveillance, that it’s not a tool of authoritarianism.
So, there’s a lot of things that can be done. Anti-trust is certainly on the table, because one of the questions Mark Zuckerberg was struggling to answer was “Who’s your competitor?” Well, the answer is: not really anyone. Right? So, that just brings up a lot of monopoly power and anti-trust questions. But maybe we also need to say, “What about limiting data retention? What about limiting the kind of surveillance they can do? What about mandating some kind of oversight and appeals process? What about breaking certain things up?” There’s all these things that can be done without taking Facebook, as it is, and just appending it to the U.S. government, because I think that would actually be quite dangerous, to give any government this kind of potent tool.
I think what we should do is make sure this tool is less dangerous, that it serves people, not just profit, that the political decisions that we face are made as political decisions, and we have this discussion. And we bring, like when you go on the highway—right?—we have safety rails, we have seat belts, we have emission controls, we have all sorts of things that companies fought tooth and nail. It’s not perfect, but at least it got us to a better place. We need that version for the digital public sphere, just the way we needed it with traditional media, just the way we need it with any other information distribution thing.
AMY GOODMAN: Do you see Europe as a model? How is Europe dealing with this, versus the United States?
ZEYNEP TUFEKCI: It’s gotten started. So, at least in Europe, on May 25th, there is going to be a legislation that’s going to go into effect. It’s called GDPR, the General Data Protection Regulation. It’s got a lot of good things. So, it’s a big start. And a lot of people are like, “Oh, this is so onerous.” Actually, GDPR is just a start. It’s bringing some better privacy rights and some more control to users, and it will probably benefit the United States and other countries, too, because the companies are going to comply with it, and it’s easier to comply globally.
I think we’ve just begun. As I can tell from the hearings, even Mark Zuckerberg, I think he’s in over his head, and I don’t think any one person can deal with this. The senators are trying to grapple with it. These are very political questions. Even if you’re not on Facebook, it doesn’t matter. This is going to affect people even if you don’t care about social media. This is how politics happens. So I think we’re beginning. I don’t—
AMY GOODMAN: We’ve just lost that satellite link to the University of North Carolina at Chapel Hill, but I want to thank Professor Zeynep Tufekci, associate professor of information and library science at the University of North Carolina at Chapel Hill, also faculty associate at Harvard Berkman Center for Internet & Society. Her book, Twitter and Tear Gas: The Power and Fragility of Networked Protest. And we’ll link to her piece in The New York Times, “We Already Know How to Protect Ourselves from Facebook.”
This is Democracy Now! When we come back, we’re going to really home in on this issue of surveillance and the threat to privacy. Stay with us.