Twitter’s recent claim to “stand with women around the world” rings hollow in light of the multi-billion-dollar social media platform’s longstanding failure to protect women users from violence and abuse, said Amnesty International today as it published new research into women’s experiences on the platform.
The new report, #ToxicTwitter: Violence and abuse against women online, shows that the company is failing to respect the human rights of women because of its inadequate and ineffective response to violence and abuse. It includes a series of concrete recommendations for how Twitter can become a safer place for women.
“Women have the right to live free from discrimination and violence, both online and offline. But by letting abuse against women flourish, Twitter is undermining these rights. Despite repeated promises to clean up the platform, many women are logging onto Twitter to find death threats, rape threats and racist or homophobic slurs littering their feeds,” said Azmina Dhrodia, Technology and Human Rights Researcher at Amnesty International.
“Our investigation shows that Twitter is failing to provide adequate remedies for those who experience violence and abuse on their platform. As a company it needs to do much more to respect the human rights of women.”
CEO Jack Dorsey issued a plea for help this month, pledging to make the company publicly accountable on efforts to improve the “health” of conversation on its platform. However, despite several requests from Amnesty International, Twitter refused to provide meaningful data publicly on how the company responds to reports of violence and abuse.
“It is great that Jack Dorsey is asking for help and feedback on this issue, but Twitter’s refusal to disclose meaningful information about how they are dealing with online violence against women makes it hard to know how to address the problem. Twitter should take concrete steps proactively, such as – at a minimum – committing to respond to users who report abuse,” said Azmina Dhrodia.
Twitter said it disagreed with Amnesty International’s findings. In a statement, the company said it “cannot delete hatred and prejudice from society”, and explained it had made more than 30 changes to its platform in the past 16 months to improve safety and had increased the instances of action taken on abusive tweets. The company repeated its refusal to share data on how it addresses reports of abuse. It said such data “is not informative” because “reporting tools are often used inappropriately”.
Amnesty International acknowledges that context is important when sharing any raw data, but nothing prevents Twitter from providing context alongside that data, and the company’s human rights responsibilities mean it has a duty to be transparent about how it deals with reports of violence and abuse.
“Twitter has repeatedly tried to shift attention away from its own responsibilities by focusing on the wider issue of hatred and prejudice in society. We are not asking them to solve the world’s problems. We are asking them to make concrete changes that truly demonstrate that abuse against women is not welcome on Twitter,” said Azmina Dhrodia.
The report draws on quantitative and qualitative research conducted over the past 16 months, including interviews with 86 women and non-binary individuals – among them politicians, journalists, and ordinary users across the UK and USA – about their experiences of Twitter’s failure to take reports of abuse seriously.
Twitter’s own policies on hateful conduct prohibit violence and abuse against women, and Twitter has a reporting system in place for users to flag accounts or Tweets that are in violation of this policy.
However, the report says Twitter fails to let users know how it interprets and enforces these policies or how it trains content moderators to respond to reports of violence and abuse. The report concludes that the policies are inconsistently enforced, and that reports of abuse sometimes receive no response at all, meaning abusive content remains on the platform despite violating the rules.
Miski Noor, a gender non-conforming communications specialist for Black Lives Matter Global Network, said: “Twitter is going to have to say whether they’re for the people or they’re not. Twitter has the power to change the way women and femmes are experiencing abuse, or even if we experience abuse, on their platform. After all, they are the convenors of the space and they have the power to change our experiences.”
The impact of abuse
Like all businesses, Twitter has a responsibility to respect human rights, including the rights to live free from discrimination and violence and to freedom of expression and opinion. However, Amnesty International’s research shows that Twitter’s failure to adequately tackle violence and abuse by its users contributes to silencing women on the platform.
In 2017 Amnesty polled 4,000 women in eight countries and found that over three quarters (76%) of women who had experienced abuse or harassment on a social media platform made changes to how they use the platform. This included restricting what they post: 32% of women said they’d stopped posting content that expressed their opinion on certain issues.
An Amnesty poll carried out in Australia in 2018 found that three in 10 Australian women had experienced online abuse, with effects including stress, anxiety, panic attacks and disturbed sleep.
Amnesty International has documented how women of colour, women from ethnic or religious minorities, LGBTI women, non-binary individuals and women with disabilities are targeted with additional and particular types of abuse. This can have the effect of driving already marginalised voices further out of public conversations.
US journalist Imani Gandy told Amnesty International, “I get harassment as a woman and I get the extra harassment because of race and being a black woman. They will call white women a ‘c*nt’ and they’ll call me a ‘n*gger c*nt’. Whatever identity they can pick they will pick it and use it against you. Whatever slur they can come up with for a marginalised group – they use.”
Curating a less toxic experience
The report outlines concrete recommendations for how Twitter can become a safer and less toxic place for women. These include:
- Sharing specific examples of violence and abuse that will not be tolerated;
- Sharing data on response times to reports of abuse, setting targets and reporting regularly; and
- Ensuring that decisions to restrict content are consistent with international human rights law and standards.
Twitter should also focus on enabling and empowering users to curate a safer and less toxic Twitter experience. This should include creating awareness campaigns about the different safety and privacy features available.
“The past few months have seen a surge of solidarity and activism from women around the world, and there’s no doubt that Twitter has an important role to play in movements like #MeToo,” said Azmina Dhrodia.
“Twitter’s recent initiatives show that it wants to be a part of this change, but women who’ve experienced abuse on the platform simply aren’t buying it. Without taking further concrete steps to effectively identify and account for violence and abuse against women on its platform, Twitter cannot credibly claim to be on women’s side.”