Video-sharing platform TikTok has removed an Australian account purporting to lure alleged paedophiles to meetings and film the confrontations, as NSW police warn people not to take the law into their own hands.
In what appears to be a new form of the trend of online accounts hunting child sex abusers, the account, which had thousands of views and likes before being taken down, claimed to confront alleged paedophiles who, it said, had come to “meet an underage kid” after seemingly being lured from a dating app. The men appearing in the six short videos were often attempting to flee, or were fighting back against the person filming them.
None of the men in the videos had their faces censored, though the account’s profile carried the disclaimer “innocent until proven guilty in a court of law”. Locations visible in the videos suggested the account user was based in New South Wales.
The account had more than 3,000 followers and 4,500 likes, and one of the videos had close to 50,000 views before it was removed.
Guardian Australia asked TikTok about the account, and within a day the account was removed from the platform for violating community guidelines.
“As we make clear in our community guidelines, we do not allow content that encourages, promotes, or glorifies risky behaviour,” a spokeswoman for TikTok said.
“We also do not permit users to encourage others to take part in dangerous activities, and we remove reported content or behaviour that violates our guidelines.”
Guardian Australia sent the account questions but did not receive a response before the account was removed from the platform.
New South Wales police said people should report predatory behaviour to the police and not attempt to take the law into their own hands.
“Members of the community are encouraged not to attempt to take the law into their own hands as this can jeopardise any subsequent investigations.”
Online paedophile hunting accounts on social media are not a new phenomenon. Some of the larger Facebook pages have hundreds of thousands of followers. Many of those groups have strict rules about what they post, and attempt to cooperate with police to ensure the proper legal process happens to prosecute the people they find.
But Facebook can remove rogue pages that don’t follow this process.
Facebook’s violence and incitement policy allows the company to remove content that puts people at risk by revealing their identity and outing them as the target of a sting operation targeting sexual predators, but it relies on law enforcement and non-government organisations to report such content to Facebook.
“Our community standards are in place to help keep people that use Facebook safe. When we identify content or behaviour that violates these standards, we will take action,” a Facebook spokeswoman said.
Joe Purshouse, a lecturer in criminal law at the University of East Anglia, has researched similar groups in the United Kingdom. He said that while some groups do work with police, others risk harm not only to themselves but to any potential prosecution of those they target.
“What these groups are essentially doing is like covert surveillance – it’s luring people into a relationship of sorts, and if the police were to do this, in the UK at least, they would have to get authorisation for that.
“They’d have to operate within the corners of that authorisation that is signed off by a very senior police officer or judge [whereas paedophile hunters] are able to undercut it all and circumvent all that regulation designed to protect people from police surveillance.”
TikTok has blocked a number of the hashtags that such accounts use under its community guidelines, and Purshouse said that was sensible given TikTok’s demographic skews younger. “It’s not appropriate for young children to be viewing or having access to,” he said.
Social network sites can often be a hive of predators stalking and sending explicit messages to children and women. The federal government is currently pushing tech companies such as Facebook to delay making their messaging services encrypted by default, arguing that default encryption would make it much harder for law enforcement to investigate child sexual abuse online.
Purshouse said it wasn’t surprising that these groups had begun to emerge online in response to online child abusers.
“The internet created the problem of people not knowing what their kids are doing and getting access to children and then it offers the solution as well. Fifteen or 20 years ago, covert surveillance operations and stings could only really be done by law enforcement,” he said.
“The internet is putting that power into regular citizens’ hands, and they’ve currently got almost a free hand to wield it as they see fit, and it poses all kinds of risks.”
He warned trials could collapse, people could be misidentified, and the hunters themselves could risk physical harm in confronting people.
“There was one case where one guy’s finger got bitten off while confronting somebody, so it can really be quite a dangerous game.”