Amnesty International has released a report revealing that X, formerly known as Twitter, has enabled technology-facilitated gender-based violence (TfGBV) against members of the LGBTQ+ community in Poland. The organization asserts that since Elon Musk's acquisition of the platform, the company's community guidelines have been relaxed, leading to an escalation of harmful content. The report calls for immediate changes to address what it describes as a platform overwhelmed with TfGBV material.
Alia Al Ghussain, a researcher and adviser on technology and human rights at Amnesty International, stated that inadequate content moderation and insufficient human rights due diligence have enabled abuses against Poland's LGBTQ+ individuals. The research highlights a significant prevalence of homophobic and transphobic content, particularly among accounts connected to politicians who oppose LGBTQ+ rights.
The report identifies a critical issue with X’s algorithm, particularly the “For You” feed, which is designed to enhance user engagement by prioritizing content that generates interaction. This algorithmic approach has resulted in the amplification of hateful messages, according to the findings. Amnesty International characterizes X’s business model as “surveillance-based,” heavily reliant on data collection for targeted advertising.
Moreover, the lack of adequate funding for Polish-language content moderation is alarming. Currently, only two Polish-speaking content moderators oversee roughly 5.33 million X users in a country of 37.45 million people. This disparity contributes significantly to the platform's inability to manage TfGBV content effectively, creating a hostile environment for LGBTQ+ individuals.
Compliance with European Regulations Under Scrutiny
Amnesty International's report also critiques X for failing to adhere to the European Union's Digital Services Act (DSA). Article 34(1) of the DSA requires providers of Very Large Online Platforms (VLOPs) to proactively identify and assess systemic risks associated with their services, particularly those impacting human rights. This includes evaluating risks to fundamental rights as laid out in the Charter of Fundamental Rights of the European Union (CFREU).
Additionally, Article 34(1)(b) specifically requires platforms to assess risks to fundamental rights, including the rights to human dignity, respect for private and family life, freedom of expression, and non-discrimination. In light of these requirements, Amnesty International asserts that X's operations have compromised the ability of LGBTQ+ individuals in Poland to express themselves freely, live without discrimination, and feel secure within their society.
In December 2023, the European Commission initiated action against X under the DSA, demanding that the platform improve its recommender system by January 2025. Amnesty International urges that any ongoing investigations by the European Commission thoroughly evaluate X's effectiveness in mitigating the risks associated with TfGBV.
The findings presented by Amnesty International reflect a broader concern about the safety and rights of marginalized communities online, emphasizing the urgent need for reform in how social media platforms handle content moderation and design their recommendation algorithms. Without significant changes, the LGBTQ+ community in Poland may continue to face heightened levels of online abuse, posing serious challenges to their rights and safety in the digital space.