On the veranda of her family’s home, with her laptop balanced on a mud slab built into the wall, Monsumi Murmu works from one of the few places where the mobile signal holds. The familiar sounds of domestic life come from inside the house: clinking utensils, footsteps, voices.
On her screen a very different scene plays: a woman is pinned down by a group of men, the camera shakes, there is shouting and the sound of breathing. The video is so disturbing Murmu speeds it up, but her job requires her to watch to the end.
Murmu, 26, is a content moderator for a global technology company, logging on from her village in India’s Jharkhand state. Her job is to classify images, videos and text that have been flagged by automated systems as possible violations of the platform’s rules.
On an average day, she views up to 800 videos and images, making judgments that train algorithms to recognise violence, abuse and harm.
This work sits at the core of machine learning’s recent breakthroughs, which rest on the fact that AI is only as good as the data it is trained on. In India, this labour is increasingly performed by women, who are part of a workforce often described as “ghost workers”.
“The first few months, I couldn’t sleep,” she says. “I would close my eyes and still see the screen loading.” Images followed her into her dreams: of fatal accidents, of losing family members, of sexual violence she could not stop or escape. On those nights, she says, her mother would wake and sit with her.
Now, she says, the images no longer shock her the way they once did. “By the end, you don’t feel disturbed – you feel blank.” There are still some nights, she says, when the dreams return. “That’s when you know the job has done something to you.”
Researchers say this emotional numbing – followed by delayed psychological fallout – is a defining feature of content moderation work. “There may be moderators who escape psychological harm, but I’ve yet to see evidence of that,” says Milagros Miceli, a sociologist leading the Data Workers’ Inquiry, a project investigating the roles of workers in AI.
“In terms of risk,” she says, “content moderation belongs in the category of dangerous work, comparable to any lethal industry.”
Studies indicate content moderation triggers lasting cognitive and emotional strain, often resulting in behavioural changes such as heightened vigilance. Workers report intrusive thoughts, anxiety and sleep disturbances.
A study of content moderators published last December, which included workers in India, identified traumatic stress as the most pronounced psychological risk. The study found that even where workplace interventions and support mechanisms existed, significant levels of secondary trauma persisted.
By 2021, an estimated 70,000 people in India were already working in data annotation, an industry then worth about $250m (£180m), according to the country’s IT industry body Nasscom. About 60% of revenues came from the US, while only 10% came from India.
About 80% of data-annotation and content-moderation workers are drawn from rural, semi-rural or marginalised backgrounds. Firms deliberately operate from smaller cities and towns, where rents and labour costs are lower and a growing pool of first-generation graduates is seeking jobs.
Improvements in internet connectivity have made it possible to plug these locations directly into global AI supply chains, without relocating workers to cities.
Women form half or more of this workforce. Companies regard them as reliable, detail-oriented and more likely to accept home-based or contract work viewed as “safe” or “respectable”. These jobs offer rare access to income without migration.
A sizeable number of workers in these hubs come from Dalit and Adivasi (tribal) communities. For many of them, digital work of any kind represents an upward shift: cleaner, more regular and better paid than agricultural labour or mining.
But working from or close to home can also reinforce women’s marginal position, according to Priyam Vadaliya, a researcher on AI and data labour, formerly with the Aapti Institute in Bengaluru.
“The work’s respectability, and the fact that it arrives at the doorstep as a rare source of paid employment, often creates an expectation of gratitude,” she says. “That expectation can discourage workers from questioning the psychological harm it causes.”
Raina Singh was 24 when she took up data-annotation work. A recent graduate, she had planned to teach, but the certainty of a monthly income felt necessary before she could afford to pursue it.
She returned to her home town of Bareilly in Uttar Pradesh and each morning logged on from her bedroom, working through a third-party firm contracted for global technology platforms. The pay – about £330 a month – seemed reasonable. The job description was vague, but the work felt manageable.
Her initial assignments involved text-based tasks: screening short messages, flagging spam, identifying scam-like language. “It didn’t feel alarming,” she says. “Just dull. But there was something exciting too. I felt like I was working behind the AI. For my friends, AI was just ChatGPT. I was seeing what makes it work.”
But about six months in, the assignments changed. Without notice, Singh was moved to a new project tied to an adult entertainment platform. Her task was to flag and remove content involving child sexual abuse.
“I had never imagined this would be part of the job,” she says. The material was graphic and relentless. When she raised concerns with her manager, she recalls being told: “This is God’s work – you’re keeping children safe.”
Soon after, the task shifted again. Singh and six others on her team were instructed to categorise pornographic content. “I can’t even count how much porn I was exposed to,” she says. “It was constant, hour after hour.”
The work affected her personal life. “The idea of sex started to disgust me,” she says. She withdrew from intimacy and felt increasingly disconnected from her partner.
When Singh complained, the response was blunt: “Your contract says data annotation – this is data annotation.” She left the job, but a year on, she says the thought of sex can trigger a sense of nausea or dissociation. “Sometimes, when I’m with my partner, I feel like a stranger in my own body. I want closeness, but my mind keeps pulling away.”
Vadaliya says job listings rarely explain what the work actually involves. “People are hired under ambiguous labels, but only after contracts are signed and training begins do they realise what the actual work is.”
Remote and part-time roles are promoted aggressively online as “easy money” or “zero-investment” opportunities, and circulated through YouTube videos, LinkedIn posts, Telegram channels and influencer-led tutorials that frame the work as flexible, low-skilled and safe.
The Guardian spoke to eight data-annotation and content-moderation companies in India. Only two said they provided psychological support to workers; the rest argued that the work was not demanding enough to require mental healthcare.
Vadaliya says that where there is support, the individual has to seek it out, shifting the burden of care on to workers. “It ignores the reality that many data workers, especially those coming from remote or marginalised backgrounds, may not even have the language to articulate what they are experiencing,” she says.
The absence of legal recognition of psychological harm in India’s labour laws, she adds, also leaves workers without meaningful protections.
The psychological toll is intensified by isolation. Content moderators and data workers are bound by strict non-disclosure agreements (NDAs) that bar them from speaking about their work, even with family and friends. Violating NDAs can lead to termination or legal action.
Murmu feared that if her family understood her job, then she, like many other girls in her village, would be forced out of paid employment and into marriage.
With just four months left on her contract, which pays about £260 a month, the spectre of unemployment keeps her from flagging concerns about her mental health. “Finding another job worries me more than the work itself,” she says.
In the meantime, she has found ways to live with the distress. “I go for long walks into the forest. I sit under the open sky and try to notice the quiet around me.”
Some days, she collects mineral stones from the land near her home or paints traditional geometric patterns on the walls of the house. “I don’t know if it really fixes anything,” says Murmu. “But I feel a little better.”