7th December 2021, 14:54:03 UTC

Twitter is still not doing enough to protect women and non-binary users from online violence and abuse, new analysis from Amnesty International has found.

The Twitter Scorecard grades the social media company’s record on implementing a series of recommendations to tackle abuse against women and non-binary persons on the platform.

Despite some welcome progress stemming from recommendations put forth in Amnesty’s 2020 Scorecard, Twitter needs to do much more to address the online abuse of women and/or marginalized groups. The company has fully implemented just one of the report’s ten recommendations, making only limited progress in improving transparency around its content moderation and appeals processes.

“Despite our repeated calls to improve their platform, Twitter is still falling short on its promises to protect users at heightened risk of online abuse,” said Michael Kleinman, Director of Technology and Human Rights at Amnesty International USA.

“For a company whose mission is to ‘give everyone the power to create and share ideas instantly without barriers,’ it’s become abundantly clear that women and/or marginalized groups disproportionately face threats to their online safety.”

A survey commissioned by Amnesty International also shows that women who are more active on the platform are more likely to report experiencing online abuse than those who are less active – 40 per cent of women who use the platform more than once a day report experiencing abuse, compared to 13 per cent of those who use it less than once a week.

Amnesty International also asked women who chose not to report abuse why they did not do so. Notably, 100 per cent of the women who use the platform numerous times a week and who didn’t report abuse responded that it was “not worth the effort”.

Though Twitter has made some progress, it is far from enough. The company has increased the amount of information available through its Help Center and Transparency Reports, launched new public awareness campaigns, expanded the scope of its hateful conduct policy, and improved its reporting mechanisms and privacy and security features. These are important steps, but the problem remains.

In response to this report, Twitter shared with Amnesty International: “We’re committed to experimenting in public with product solutions that help address the fundamental problems our users are facing, and empowering them with controls to set their own experience. While many of these changes are not directly captured in your report scorecard, we believe these improvements will ultimately enable our most vulnerable communities to better engage in free expression without fear, a goal we share with Amnesty.”

Yet Twitter must do more in order for women and non-binary persons – as well as all users, in all languages – to be able to use the platform without fear of abuse. As a company, Twitter has a corporate responsibility and moral obligation to take concrete steps to avoid causing or contributing to human rights abuses, including by providing effective remedy for any actual impact it has inflicted on its users.

“We have seen time and time again that Twitter has continuously failed to provide effective remedies for the real harm and impact its platform has caused women and/or marginalized groups,” added Michael Kleinman, Director of Technology and Human Rights at Amnesty International USA.

“As our world has become increasingly dependent on digital spaces during the COVID-19 pandemic, it’s critical that Twitter meet this moment with demonstrated commitment to improving the online experiences of all users, regardless of their identity.”

Methodology

This Scorecard synthesizes all of the recommendations Amnesty International has made to Twitter since 2018 and distils them into ten key recommendations against which to evaluate the company. These ten recommendations fall into four high-level categories: Transparency, Reporting Mechanisms, Abuse Report Review Process, and Privacy & Security Features. The analysis focuses on these four categories of change because of the positive impact each can have on the experiences of women on Twitter.

Each recommendation is made up of one to four separate sub-indicators. Amnesty International then determined whether Twitter has made progress against each sub-indicator, grading it as Not Implemented, Work in Progress, or Implemented. In the case of ongoing public awareness campaigns, Amnesty International looked at whether these campaigns had addressed all the issues we raised, as well as whether the campaigns and related materials were available in languages other than English.
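To make the grading structure concrete, the sketch below shows in Python how such a scorecard could be represented and summarised. The four category names follow the report; the recommendation titles, sub-indicator counts, and grades shown are hypothetical examples for illustration only, not Amnesty International’s actual findings or tooling.

```python
# Purely illustrative sketch of the Scorecard grading structure described above.
# Recommendation names and the grades assigned here are hypothetical examples.

from enum import Enum


class Grade(Enum):
    NOT_IMPLEMENTED = "Not Implemented"
    WORK_IN_PROGRESS = "Work in Progress"
    IMPLEMENTED = "Implemented"


# Each recommendation sits under one of the four high-level categories and
# consists of one to four sub-indicators, each graded individually.
scorecard = {
    "Transparency": {
        "Publish disaggregated abuse-report data": [Grade.WORK_IN_PROGRESS, Grade.NOT_IMPLEMENTED],
    },
    "Reporting Mechanisms": {
        "Simplify the abuse-reporting flow": [Grade.IMPLEMENTED],
    },
    "Abuse Report Review Process": {
        "Increase moderator capacity across languages": [Grade.NOT_IMPLEMENTED, Grade.WORK_IN_PROGRESS],
    },
    "Privacy & Security Features": {
        "Expand user-controlled filtering tools": [Grade.IMPLEMENTED, Grade.WORK_IN_PROGRESS],
    },
}


def summarise(card):
    """Count how many sub-indicators fall into each grade, per category."""
    for category, recommendations in card.items():
        counts = {grade: 0 for grade in Grade}
        for sub_indicators in recommendations.values():
            for grade in sub_indicators:
                counts[grade] += 1
        print(category, {g.value: n for g, n in counts.items()})


if __name__ == "__main__":
    summarise(scorecard)
```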

Ahead of publishing the Scorecard, Amnesty International wrote to Twitter to seek an update on the progress of implementing our recommendations and the company’s response has been reflected throughout the report.