Women in public life are facing growing and increasingly sophisticated forms of online violence, the UN has said, warning that “AI-assisted ‘virtual rape’ is now at the fingertips of perpetrators”.
Women's rights campaigners, journalists and other public communicators face a deepening threat owing to a combination of artificial intelligence, anonymity and the absence of effective laws and accountability, a report published by UN Women found.
Of more than 600 women in public life, 6% said they had been victims of deepfakes, while nearly a third said they had received unsolicited sexual advances online. About 12% said they had had images of themselves shared without their consent, including intimate or sexual content.
“Artificial intelligence is making abuse easier and more damaging,” said Kalliopi Mingeirou, who leads UN Women’s efforts to end violence against women. “While anonymity, as well as the speed of how this information and narratives circulate in mainstream media, make this content more dangerous.”
Released on Thursday, the report adds to mounting evidence suggesting that, for millions of women and girls across the globe, the digital sphere has become a space synonymous with abuse.
Many women face a choice, it found: remain online and accept the threat of violence, or self-censor, perhaps even stay offline altogether, and potentially pay a professional and personal cost for doing so. "When women in general, or journalists and human rights defenders, are driven out from digital spaces, we all lose," said Mingeirou.
“We know that female journalists are essential to how truth is told and whose stories get told,” she said. “And when we have women human rights defenders, or more broadly, women in public life being pushed out of the digital spaces, we see an erosion of hard-won rights.”
This erosion was particularly concerning given that it was taking place against a broader backdrop of rising authoritarianism, democratic backsliding and networked misogyny such as the manosphere, the report noted.
“Gender rights rollback is both enabled and exacerbated by technologies which – by design – amplify misogynistic hate speech for profit,” it said, pointing to generative AI apps that are capable of stripping clothes from photos of women without their consent or simulating them being sexually assaulted. “AI-assisted ‘virtual rape’ is now at the fingertips of perpetrators,” it added.
The result was a "supercharging" of threats against women in public life, aimed at pushing them out of visible public roles and ushering in a broader rollback of gender rights, said Julie Posetti, the lead researcher and author of the report.
“So if you call anybody a liar and a criminal frequently enough, broadly enough, and with as much fake evidence as you can, then you are likely to convince followers that this is in fact the reality,” said Posetti, who is a professor of journalism and chair of the Centre for Journalism and Democracy at City St George’s University of London.
The pattern was similar to that of movements aligned with authoritarianism that sought to undercut progress on women's rights, such as reproductive rights, she said. "And we see these sorts of messages around, you know, women not being fit for leadership echoed not just online, but in the partisan press and within certain political circles. So it's all about reinforcing an environment which is creating permission for the rollback of gender rights."
She said women who identified as LGBTQ+ were more likely to experience online violence, as were those who were racialised or from religious backgrounds, such as Muslim or Jewish women. “So being a woman makes you a target, but being a woman with intersectional identities makes you a bigger target,” she said. “Add to that a woman who is perceived to be out of place, speaking her mind, speaking truth to power – particularly truth to power to men – these are all aspects of behaviour that tend to trigger the most malign misogynistic attacks.”
The view was echoed by Mingeirou, of UN Women, who described the attacks as “coordinated and deliberate”, and aimed at silencing women’s voices while undermining their professional credibility and reputations. “This is part of a phenomenon of a broader pushback against gender equality,” she said. “We see misogynistic networks that are very well coordinated attacking the narratives on gender equality and women in public life more broadly.”
Her view was backed by a study released earlier this year, in which researchers argued the silencing of women online was not an incidental byproduct of the digital sphere, but rather a “coordinated, systemic practice”.
The research had laid bare how “online hostility is made frictionless and rewarding for misogynists”, citing how the platforms’ algorithmic amplification and formation of ad hoc hostile groups had served to “convert misogynistic hostility into a coordinated apparatus for suppressing women’s participation in public discourse”.
The extent of this threat burst into public view earlier this year after it emerged that hundreds of thousands of requests had poured into Elon Musk's AI tool, Grok, asking it to strip clothes from photographs of women. Months later, the German TV star Collien Fernandes alleged that her ex-husband had spread AI-generated pornographic images of her online.
Last year the UN Women’s executive director, Sima Bahous, said: “What begins online doesn’t stay online. Digital abuse spills into real life, spreading fear, silencing voices and – in the worst cases – leading to physical violence and femicide.”
Thursday’s report found that a quarter of female journalists and media workers surveyed said they had been diagnosed with anxiety or depression owing to online violence, while nearly 13% said they had been diagnosed with post-traumatic stress disorder.
For many, the fear of facing online violence had meant they avoided sensitive topics or had gone silent, with 45% of female journalists saying they self-censored on social media, while nearly 22% said they were self-censoring in their professional work.
Mingeirou called for technology companies to build in safeguards to prevent abuse and to create reporting tools. Governments also needed to act, she said, noting that fewer than 40% of countries had laws in place to protect women from cyber-harassment or cyberstalking.
Failure to do so could have long-lasting ripple effects and roll back decades of progress. “This is creating a vicious circle,” she said. “When we have such serious mental health indications, then we have women who do not want to get involved and engaged in digital spaces.”