Domestic abusers are increasingly using AI, smartwatches and other technology to attack and control their victims, a domestic abuse charity says.
Record numbers of women abused and controlled through technology were referred to Refuge’s specialist services in the last three months of 2025, including a 62% increase in the most complex cases, which totalled 829 women. Referrals of women under 30 also rose by 24%.
Recent cases included perpetrators using wearable tech such as smartwatches, Oura rings and Fitbits to track and stalk women, disrupting their lives through smart home devices that control lights and heating, and using AI spoofing apps to impersonate people.
Emma Pickering, head of the tech-facilitated abuse team at Refuge, said: “Time and again, we see what happens when devices go to market without proper consideration of how they might be used to harm women and girls. It is currently far too easy for perpetrators to access and weaponise smart accessories, and our frontline teams are seeing the devastating consequences of this abuse.
“It is unacceptable for the safety and wellbeing of women and girls to be treated as an afterthought once a technology has been developed and distributed. Their safety must be a foundational principle shaping both the design of wearable technology and the regulatory frameworks that surround it.”
One survivor Refuge worked with, Mina, left her smartwatch behind in her rush to flee her abuser, who then used the linked cloud accounts to track the device and locate her emergency accommodation.
“[It] was deeply shocking and frightening. I felt suddenly exposed and unsafe, knowing that my location was being tracked without my consent. It created a constant sense of paranoia; I couldn’t relax, sleep properly, or feel settled anywhere because I knew my movements weren’t private,” she said.
Even after police returned the device to Mina, a private investigator hired by her abuser located her at her next refuge, again suspected to have involved technological tracking. When she reported the breaches to police, she was told no crime had been committed because she had “not come to any harm”.
“I was repeatedly asked to move for my safety, rather than the technology being dealt with directly or the smart watch being confiscated from him. Each move made me feel more unstable and displaced,” she said.
“Overall, the experience left me feeling unsafe, unheard, and responsible for managing a situation that was completely out of my control. It showed me how tech abuse can quietly and powerfully extend coercive control, and how easily survivors can be left to carry the emotional and practical burden when systems don’t fully understand or respond to it.”
Abusers were also increasingly using AI tools to manipulate survivors, Pickering said. For example, they might alter a video of the survivor so that she appeared drunk, enabling them to tell social services that “she’s acting erratic again, slurring speech, she’s got a drink problem” and that she was therefore an unfit mother or a risk to herself and others. “We’ll see more and more of that as these videos and applications advance,” Pickering said.
Pickering said she had also heard of AI tools being used to create authentic-looking fraudulent documents, such as fake job offers or legal summonses, which are sent to survivors to convince them they are in debt or to lure them to the same location as their abuser.
Pickering feared that in the coming years medical technology would increasingly be misused, for example by manipulating insulin levels through connected diabetes devices, which could be fatal.
She urged the government to act on technology-enabled and online crimes, including by providing more funding to develop and train digital investigation teams. “They want short-term wins, they don’t want to think about longer-term investment in this area, but if we don’t do that we’ll never get ahead,” she said.
She also wants to see the technology industry held to account for failing to ensure devices and platforms are designed and function in ways that are safe for vulnerable people.
“Ofcom and the Online Safety Act don’t go far enough,” she said.
A government spokesperson said: “Tackling violence against women and girls in all its forms, including when it takes place online or is facilitated by technology, is a top priority for this government.
“Our new VAWG strategy sets out how the full power of the state will be deployed online and offline. We are working with Ofcom to set out how online platforms tackle the disproportionate abuse women and girls face online.”