Late one night in April 2020, towards the start of the Covid lockdowns, Shanley Clémot McLaren was scrolling on her phone when she noticed a Snapchat post by her 16-year-old sister. “She’s basically filming herself from her bed, and she’s like: ‘Guys you shouldn’t be doing this. These fisha accounts are really not OK. Girls, please protect yourselves.’ And I’m like: ‘What is fisha?’ I was 21, but I felt old,” she says.
She went into her sister’s bedroom, where her sibling showed her a Snapchat account named “fisha” plus the code of their Paris suburb. Fisha is French slang for publicly shaming someone – from the verb “afficher”, meaning to display or make public. The account contained intimate images of girls from her sister’s school and dozens of others, “along with the personal data of the victims – their names, phone numbers, addresses, everything to find them, everything to put them in danger”.
McLaren, her sister and their friends reported the account to Snapchat dozens of times, but received no response. Then they discovered there were fisha accounts for different suburbs, towns and cities across France and beyond. Faced with the impunity of the social media platforms, and their lack of moderation, they launched the hashtag #StopFisha.
It went viral, online and in the media. #StopFisha became a rallying cry, a safe space to share information and advice, a protest movement. Now it was the social media companies being shamed. “The wave became a counter-wave,” says McLaren, who is now 26. The French government got involved, and launched an online campaign on the dangers and legal consequences of fisha accounts. The social media companies began to moderate at last, and #StopFisha is now a “trusted flagger” with Snapchat and TikTok, so when they report fisha content, it is taken down within hours. “I realised that if you want change in your societies, if you come with your idea alone, it won’t work. You need support behind you.”
Four years later, this strategy is playing out on an even larger scale. McLaren and other young activists across Europe are banding together against social media and its ruinous effects on their generation. Individually, young people are powerless to sway big tech, but they are also a substantial part of its business model – so, collectively, they are powerful.
This is the first generation to have grown up with social media: they were the earliest adopters of it, and therefore the first to suffer its harms. The array of problems is ever-expanding: misogynistic, hateful and disturbing content; addictive and skewed algorithms; invasion of privacy; online forums encouraging harmful behaviours; sextortion; screen addiction; deepfake pornography; misinformation and disinformation; radicalisation; surveillance; biased AI – the list goes on. As the use of social media has risen, there has been a corresponding increase in youth mental health problems, anxiety, depression, self-harm and even suicide.
“Across Europe, a generation is suffering through a silent crisis,” says a new report from People vs Big Tech – a coalition of more than 140 digital rights NGOs from around Europe – and Ctrl+Alt+Reclaim, their youth-led spin-off. A big factor is “the design and dominance of social media platforms”.
Ctrl+Alt+Reclaim, for people aged 15 to 29, came about in September last year when People vs Big Tech put out a call – on social media, paradoxically. About 20 young people who were already active on these issues came together at a “boot camp” in London. “We were really given the tools to create the movement that we wanted to build,” says McLaren, who attended with her partner. “They booked a big room, they brought the food, pencils, paper, everything we needed. And they were like: ‘This is your space, and we’re here to help.’”
The group is Europe’s first digital justice movement by and for young people. Their demands are very simple, or at least they ought to be: inclusion of young people in decision-making; a safer, healthier, more equitable social media environment; control and transparency over personal data and how it is used; and an end to the stranglehold a handful of US-based corporations have over social media and online spaces. The overarching principle is: “Nothing for us, without us.”
“This is not just us being angry; it’s us having the right to speak,” says McLaren, who is now a youth mobilisation lead for Ctrl+Alt+Reclaim. Debates over digital rights are already going on, of course, but, she says: “We find it really unfair that we’re not at the table. Young people have so much to say, and they’re real experts, because they have lived experience … So why aren’t they given the proper space?”
McLaren’s work with #StopFisha took her on a journey into a wider, murkier world of gender-based digital abuse: misogynist trolling and sexism, cyberstalking, deepfake pornography – but she realised this was just one facet of the problem. What women were experiencing online, other groups were experiencing in their own ways.
A fellow activist, Yassine, 23, is well aware of this. Originally from north Africa and now living in Germany, Yassine identifies as non-binary. They fled to Europe to escape intolerance in their own country, but the reality of life, even in a supposedly liberal country such as Germany, hit them like a “slap”, they say. “You’re here for your safety, but then you’re trying to fight not only the system that is punishing the queerness of you, but you also have another layer of being a migrant. So you have two battles instead of one.”
As a migrant they are seen as a threat, Yassine says. “Our bodies and movements must be tracked, fingerprinted and surveilled through intrusive digital systems designed to protect the EU.” For queer people, there are similar challenges. These include “shadow-banning”, by which tech platforms “silence conversations about queer rights, racism or anything that is challenging the dominant system”, either wilfully or algorithmically, through built-in biases.
Measures such as identity verification “are also putting a lot of people at risk of being erased from these spaces”, says Yassine. There can be good reasons for them, but they can also end up discriminating against non-binary or transgender people – who are often presented with binary gender options: male or female – as well as against refugees and undocumented people, who may be afraid or unable to submit their details online. Given their often tenuous residency status, and sometimes limited digital literacy and access, migrants tend not to speak out, Yassine says. “It definitely feels like you are in a position of: ‘You need to be grateful that you are here, and you should not question the laws.’ But the laws are harming my data.”
On a more day-to-day level, Yassine says, they must “walk through online spaces knowing they could do harm to me”. If they click on the comments under a social media post, for example, they know they are likely to find racist, homophobic or hateful attacks. Like McLaren, Yassine says that complaining is futile. “I know that they will come back with, ‘This is not a community guidelines breach’, and all of that.”
These are not mere glitches in the system, says Yassine, who now leads on digital rights at IGLYO, a long-running LGBTQ+ youth rights organisation, founded in Brussels, with a network of groups across Europe. “The systems we design inherit the very structures they arise from, so they inevitably become systems that are patriarchal and racist by design.”
Adele Zeynep Walton’s participation in Ctrl+Alt+Reclaim came through personal experience of online harm. In 2022, Walton’s 21-year-old sister, Aimee, took her own life. She had been struggling with her mental health, but had also been spending time on online suicide and self-harm forums, which Walton believes contributed to her death. After that, Walton began to question the digital realm she had grown up in, and her own screen addiction.
Walton’s parents made her first Facebook account when she was 10, she says. She has been on Instagram since she was 12. Her own feelings of body dysmorphia began when she was 13, sparked by pro-anorexia content her friends were sharing. “I became a consumer of that, then I got immersed in this world,” she says. “Generations like mine thought it was totally normal, having this everyday battle with this addictive thing, having this constant need for external validation. I thought those were things that were just wrong with me.”
In researching her book Logging Off: The Human Cost of our Digital World, Walton, 26, also became aware of how little control young people have over the content that is algorithmically served up to them. “We don’t really have any choice over what our feeds look like. Despite the fact there are things where you can say, ‘I don’t want to see this type of content’, within a week, you’re still seeing it again.”
Alycia Colijn, 29, another member of Ctrl+Alt+Reclaim, knows something about this. She studied data science and marketing analytics at university in Rotterdam, researching AI-driven algorithms – how they can be used to manipulate behaviour, and in whose interests. During her studies she began to think: “It’s weird that I’m trained to gather as much data as I can, and to build a model that can respond to or predict what people want to buy, but I’ve never had a conversation around ethics.” Now she is researching these issues as co-founder of Encode Europe, which advocates for human-centric AI. “I realised how much power these algorithms have over us; over our society, but also over our democracies,” she says. “Can we still speak of free will if the best psychologists in the world are building algorithms that make us addicted?”
The more she learned, the more concerned Colijn became. “We made social media into a social experiment,” she says. “It turned out to be the place where you could best gather personal data from individuals. Data turned into the new gold, and then tech bros became some of the most powerful people in the world, even though they aren’t necessarily known for caring about society.”
Social media companies have had ample opportunities to respond to these myriad harms, but invariably they have chosen not to. Just as McLaren found with Snapchat and the fisha accounts, hateful and racist content is still minimally moderated on platforms such as X, Instagram, Snapchat and YouTube. After Donald Trump’s re-election, Mark Zuckerberg announced at the start of this year that Meta would be scaling back factchecking across Facebook and Instagram, just as X has under Elon Musk, allowing misinformation to flow more freely. Meta, Amazon and Google were also among the companies that announced they were rolling back their diversity, equity and inclusion initiatives after the election. The political shift to the right, in the US and Europe, has inevitably affected these platforms’ tolerance of hateful and racist content, says Yassine. “People feel like now they have more rights to be harmful than rights to be protected.”
All the while, the tech CEOs have become more powerful, economically, politically and in terms of information control. “We don’t believe that power should be in those hands,” says Colijn. “That’s not a true democracy.”
Europe’s politicians aren’t doing much better. Having brought in the Digital Services Act, which came into force in 2023 and threatened social media companies with fines or bans if they failed to regulate harmful content, the European Commission announced last month it would be rolling back some of its data privacy laws, to allow big tech companies to use people’s personal data for training AI systems.
“Big tech, combined with the AI innovators, say they are the growth of tomorrow’s economy, and that we have to trust them. I don’t think that’s true,” says Colijn. She also disagrees with their argument that regulation harms innovation. “The only thing deregulation fosters is harmful innovation. If we want responsible innovation, we need regulation in place.”
Walton agrees. “Governments and MPs are shooting themselves in the foot by pandering to tech giants, because that just tells young people that they don’t care about our future,” she says. “There’s this massive knowledge gap between the people who are making the decisions, and the tech justice movement and everyday people who are experiencing the harms.”
Ctrl+Alt+Reclaim is not calling for the wholesale destruction of social media. All these activists say they have found community, solidarity and joy in online spaces: “We’re fighting for these spaces to accommodate us,” says Yassine. “We’re not protesting to cancel them. We know how harmful they are, but they are still spaces where we have hope.”
Colijn echoes this. “Social media used to be a fun place with the promise of connecting the world,” she says. “That’s where we started.” And that’s what they want it to be again.
Will big tech pay attention? They might not have a choice, as countries and legislators begin to take action. This week Australia will become the first country to ban social media accounts for under-16s on major platforms including Snapchat, Instagram, TikTok and X. Last week, after a two-year deliberation, X was fined €120m (£105m) by the EU for breaching data laws. But these companies continue to platform content that is hateful, racist, harmful, misleading or inflammatory, with impunity.
Meanwhile, Ctrl+Alt+Reclaim is just getting started. Ideas on the table include campaigning for an EU-funded social media platform, an alternative to the big tech oligopoly, created by and for the public. Another option is direct action, whether protest or consumer activism such as coordinated boycotts. “I think it’s lazy for us to be like: we don’t have any power,” says Walton. “Because we could literally say that about anything: fast fashion, fossil fuels … OK, but how do we change things?”
The other alternative is simply to log off. “The other side of the coin to this movement of tech justice, and a sort of liberation from the harms that we’ve experienced over the past 20 years, is reducing our screen time,” says Walton. “It is spending more time in community. It is connecting with people who maybe you would have never spoken to on social media, because you’d be in different echo chambers.”
Almost all the activists in Ctrl+Alt+Reclaim attest to having had some form of screen addiction. As much as social media has brought them together, it has also led to much less face-to-face socialising. “I’ve had to sort of rewire my brain to get used to the awkwardness and get comfortable with being in a social setting and not knowing anyone,” says Walton. “Actually, it would be really nice to return to proper connection.”
• In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, you can call or text the 988 Suicide & Crisis Lifeline at 988 or chat at 988lifeline.org. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
• In the UK, the charity Mind is available on 0300 123 3393 and Childline on 0800 1111. In Australia, support is also available at Beyond Blue on 1300 22 4636 and MensLine on 1300 789 978