Far right is using Twitter's new policy against extremism researchers and activists

ARI SHAPIRO, HOST:

A new policy at Twitter aims to limit harassment. The so-called private media rules make it more difficult for users to post photos and videos of people without their consent. Almost as soon as the policy took effect last week, far-right trolls began a transparent campaign to exploit it for their gain. Their targets include antifa activists, independent journalists and extremism researchers who say this is already having a chilling effect on public interest reporting.

Here to explain are NPR's Shannon Bond, who covers tech, and Odette Yousef, who covers domestic extremism. Odette, I believe you've been talking to some of the people who were affected by this. What have you heard?

ODETTE YOUSEF, BYLINE: That's right. So I spoke with Gwen Snyder. She's an organizer and extremism researcher who documents activities of the far right in her hometown of Philadelphia. Last week, she got a notification that a Twitter thread she'd put out back in May of 2019 had been flagged under the new policy. It was a thread regarding a Proud Boys event held in Philadelphia.

GWEN SNYDER: This was a thread that was documenting this public march by right-wing extremists and people very publicly affiliated with the Republican Party in Philadelphia. And I walked through and identified both the extremists and the politicians. And I got a notification at about 2:30 a.m. on Friday, I believe, saying that the thread had been reported under the new Twitter privacy policy and that I had a choice of either deleting, appealing or permanently having my account suspended.

YOUSEF: And, Ari, ultimately, she opted to delete it. But she says that Twitter later acknowledged in other reporting that this never should have been flagged. You know, this was a thread where she documented a public event in a public space.

SHAPIRO: Shannon, I understand Twitter acknowledged that the policy has been misapplied in some cases. What have they said?

SHANNON BOND, BYLINE: Well, what they said is - well, maybe let's back up, actually, and talk about what this policy is.

SHAPIRO: Sure.

BOND: The idea is if someone tweets a photo or a video of you in a way that you feel is threatening or harassing, you can ask the company to take it down. And what Twitter says this is about is to stop abuse, right? It's seen these kinds of photos and videos used to go after people, especially women, activists, members of minority communities.

An example Twitter gave me is, you know, there was an incident where a victim of sexual assault was publicly identified against their will but in a post that didn't break any Twitter rules. So this policy about these photos gives the company a basis to take something like that down.

SHAPIRO: OK.

BOND: But Twitter says what had happened last week is that it saw the policy being exploited by extremists who were, sort of in a coordinated way, using it to target activists. And then when Twitter's own trust and safety team reviewed the reports, it actually messed up, the company says, and suspended these accounts.

SHAPIRO: So Shannon, what does Twitter say is going wrong here?

BOND: Well, it says, you know, that this was a mistake in enforcement - that when it was overwhelmed with these coordinated reports, it made mistakes, and content like what Gwen posted should not have been taken down. And Twitter, you know, has had this policy in place in other countries since at least 2014, in countries that have legal rights to privacy. But as far as we can tell, it has not been abused there the way it's been abused here in just the past week.

SHAPIRO: Odette, aside from how this is affecting individual accounts, what are people saying about the wider implications of these restrictions?

YOUSEF: So there's been concern in some channels that have been working to identify participants of the January 6 riot at the Capitol that the work they've been doing for nearly a year might go away, that they might be flagged for violations and have to delete research and findings that they've published on Twitter. You know, I would note that Twitter has been a really key platform for these so-called sedition hunters to organize themselves, solicit leads and ultimately share information pertaining to important federal investigations.

One person I spoke with about this is Chad Loder. They're an antifa activist and citizen journalist based in LA. And they say probably the big accounts that are doing that work won't be affected, but other people who are kind of doing the hyperlocal, in-the-weeds investigations may be. Here's what Chad said.

CHAD LODER: I think that ongoing work from the individual researchers is at risk more so than, let's say, the bigger accounts that have 60,000 followers that are recorded in multiple news articles, which essentially function as aggregators of research that's been done so far.

YOUSEF: You know, Ari, I think some people listening may be thinking, you know, so what? It's just a Twitter account. But for researchers and activists like Gwen who have their eyes on extremist activity in hyperlocal settings, Twitter's really important. It lets them, you know, publish a report in real time and reach press and politicians, and ultimately help paint a picture of trends that we're seeing in this country.

And they're saying that the uncertainty around this rule is having a chilling effect on them. You know, they don't want to lose access to their accounts, but they also aren't sure what they're allowed to publish anymore.

SHAPIRO: Shannon, what does Twitter say to that?

BOND: Well, you know, Twitter is - I think it wants - what's important to note here is that there are a lot of exceptions to this policy. And so Twitter's saying, you know, this shouldn't have happened in the first place. This policy is not supposed to apply, for example, to public figures, and it's not supposed to apply in cases of public interest or newsworthiness. And Twitter specifically says things like public events, like protests, don't violate these rules.

But what it comes down to here, Ari, is really a question of - you know, these are judgment calls, right? Deciding if a photo is in the public interest, you know, even if I identify as a private person, you know, is a judgment call that this company has to make, that its team that's reviewing these reports has to make. And we've seen already how this can be gamed, how this can be exploited. And so now, you know, Twitter says it is doing a top-to-bottom internal review to make sure the policy is actually being applied the way it's intended - to curb harassment and not actually be used to harass people.

But ultimately, you know, this is a real tension here that tech companies face to balance safety against free expression and, you know, to make these kinds of decisions. It's a problem none of these companies, including Twitter, has really solved. And so I think we're going to have to look to them to see how transparent they are about making these decisions and whether they're more effective at enforcing this policy in the future.

SHAPIRO: Does this have anything to do with Twitter getting a new CEO?

BOND: No, the company tells me that this policy change was well in the works before the new CEO was announced last week.

SHAPIRO: NPR's Shannon Bond and Odette Yousef, thank you both.

YOUSEF: Thank you.

BOND: Thanks.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.
Odette Yousef is a National Security correspondent focusing on extremism.