Twitter is developing a feature aimed at making the site less toxic for users. NPR’s Michel Martin speaks with Amnesty International’s Rasha Abdul-Rahim, who has studied harassment on Twitter.
MICHEL MARTIN, HOST:
We’re going to return now to our Troll Watch series.
(SOUNDBITE OF MUSIC)
MARTIN: This is where we bring you stories of cybersecurity attacks, bots and, of course, internet trolls. This week, Twitter confirmed that users will eventually be able to press a button that says "hide tweet," which would, as you might imagine, allow users to hide certain responses to their tweets. And that means if you tweet something and you get nasty or abusive replies back, you could make those replies invisible to others.
Now, one reason this is of interest, of course, is the abuse directed at women on Twitter, something Amnesty International researched extensively in a recent report. We spoke with Amnesty about that report, so we wanted to follow up to ask them what they make of this new planned feature. Joining us now is Rasha Abdul-Rahim, deputy director of Amnesty Tech. She's with us via Skype from London. Rasha, thanks so much for talking to us.
RASHA ABDUL-RAHIM: No problem. Thank you for having me.
MARTIN: So tell us about this new Twitter feature. What’s your understanding of how it would work?
ABDUL-RAHIM: Yeah, so my understanding is that Twitter's developed this new feature as a way to give people – women – who receive tweets that may not reach the threshold of being abusive or hateful another way to hide problematic tweets they may be receiving, so that they're not as visible to them and to others. But my understanding is that people will still be able to view those tweets if they click the tab that shows the hidden tweets.
MARTIN: So what are the pros and the cons of this?
ABDUL-RAHIM: I see four different issues with this. I think the first one is – I think there's a danger here of brushing the issue of problematic tweets under the carpet and not holding the people who are sending those tweets accountable. So these kinds of tweets, as I said before, may not necessarily reach the threshold of abuse or hateful conduct, but they still contain hurtful or hostile content, especially if they're directed at an individual on multiple occasions.
And these are the kinds of things that can reinforce negative or harmful stereotypes against a group of individuals, such as women, such as women of color. And they may still have a silencing effect on them. So I think the key question here is, you know, will the effect of this be that those kinds of repeat offenders will not face any kind of accountability for sending, you know, a barrage of these kinds of problematic tweets?
MARTIN: Two criticisms I’ve seen come from different angles – one is that this still puts the onus on women to solve the problem…
ABDUL-RAHIM: Absolutely. Yeah, that was the next point I was going to make.
MARTIN: …As opposed to putting the onus on Twitter. But the other side of the equation is some are arguing that this allows for censorship. I mean, it could allow for say, public officials to, you know, decide that they don’t want other people to see legitimate criticism directed at them just cause they don’t like it.
ABDUL-RAHIM: Yeah, totally. These are two issues that we've spotted as well. So the burden is still, as you say, on the individuals experiencing the abuse to label or to hide every single tweet. And this not only takes time, but it also takes an emotional toll on those individuals who are receiving that abuse. It's almost as if Twitter is outsourcing that responsibility to the people who are experiencing this abuse. And, as you say, it could also have a silencing effect on free speech if powerful public figures such as politicians can hide dissent or prevent users from holding public figures to account.
MARTIN: For those who did not hear our prior conversation, could you just remind people why you feel that abusive tweets – this kind of communication – rise to the level of a human rights concern, such that Amnesty would take as much time and effort as it has to investigate it? Why do you think this is a problem?
ABDUL-RAHIM: It’s a problem because if women are disproportionately experiencing abuse or harassment or are targets of problematic tweets, then this means that this could have a silencing effect on them. And this is something that we found in our research that women tend to change the way in which they interact on these platforms. They tend to, you know, withdraw from Twitter or change the way in which they use their language on Twitter in order to not subject themselves or open themselves up to abuse.
And, you know, our research has shown that 7.1 percent of tweets that were sent to women in this study were problematic or abusive. That amounted to 1.1 million tweets mentioning 778 women across the year – one every 30 seconds. We also found that women of color were 34 percent more likely to be mentioned in abusive or problematic tweets than white women. Black women were disproportionately targeted, being 84 percent more likely than white women to be mentioned in abusive or problematic tweets.
So this is clearly a problem. And if Twitter is not responding to or addressing this problem effectively, then this obviously will have an impact on women's ability to freely express themselves on the platform.
MARTIN: That's Rasha Abdul-Rahim, deputy director of Amnesty Tech at Amnesty International. We reached her via Skype in London. Rasha, thank you so much for talking with us.
ABDUL-RAHIM: Thank you, too. Goodbye.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.