Sarah Basford Canales and Josh Taylor 

More than 1,000 ‘distressing’ social media posts removed at Australian government’s request during Gaza war

Figures show rise in incidents of online extremism and violent content, partly fuelled by Israel-Gaza conflict
  
  

Internet hosting and content service providers are required to quickly stem terrorist or extremist content from going viral. Photograph: PA

More than 1,000 violent and extremist posts have been taken down from social media at the federal government’s request since 7 October following an increase in referrals brought on in part by the conflict between Israel and Hamas.

The takedowns follow a warning from the communications minister, Michelle Rowland, in the days after Hamas’s 7 October attacks in southern Israel, that it was the job of social media companies to “prevent the spread of distressing violent and terrorist content”.

The figures, obtained by Guardian Australia, show the department of home affairs flagged 1,375 posts to digital platforms between 7 October 2023 and 14 December 2023. Of those, 1,094 were then removed. A spokesperson said the referrals included posts related to the conflict in Gaza but the figures could not be broken down by topic.

The department is responsible for referring posts containing terrorist and violent extremist content to social media sites for their removal. Posts containing mis- and disinformation about the conflict are not covered under this arrangement.

Between 1 July 2023 and 21 December 2023, a total of 3,052 violent and extremist posts were referred to social media sites for deletion, with 71.9% of those sent to Twitter/X.

The home affairs department said Twitter/X had deleted 2,152 – or 98% – of the posts referred to it over that period, while other platforms removed 43.8% of the posts referred to them.

In October alone, 745 posts were referred for removal, with 586 – or 78.7% – of them deleted.

Referrals made to platforms other than Twitter/X were described by the spokesperson as “primarily non-mainstream social media platforms and filesharing sites”.

Separately, the department said it sent five extremist or violent posts to the eSafety commissioner between 7 October 2023 and 20 December 2023 to be considered for the declaration of an online crisis event under the Online Content Incident Arrangement.

Under the arrangement, which was developed in the wake of the Christchurch mosque attacks, internet hosting and content service providers are required to work with the government to quickly stem terrorist or extremist content, such as a livestream of an attack, from going viral.

In each instance, the eSafety commissioner determined the content did not meet the threshold for declaring an online crisis event.

Concerns about violent content spreading through social media amid the conflict in Gaza were first raised in a letter to Twitter/X by Rowland on 11 October, obtained by Guardian Australia in a freedom of information request.

“The Australian government is closely monitoring the situation following attacks on Israel by Hamas, which have included the targeting of civilians and taking of hostages,” Rowland wrote.

“I am aware that photographs and videos of these horrific attacks are circulating online, and am writing to thank you in advance for your work to prevent the spread of distressing violent and terrorist content.”

The communications minister reminded Twitter/X of its obligation to monitor for and prevent the spread of violent extremist content or face potential criminal charges and fines of up to $15m or 10% of annual turnover.

It’s understood Rowland also sent letters to other digital platforms, internet service providers and industry associations in October reminding them of their obligations.

An unnamed Twitter/X official responded to Rowland, the home affairs minister, Clare O’Neil, and the eSafety commissioner, Julie Inman Grant, in a letter on 2 November 2023, detailing its commitment to taking down violent content.

“Shortly after news broke about the Hamas attack, X assembled a leadership group to assess the situation and we continue to work to address the operational needs of this fast-moving, evolving conflict,” the letter read.

“There is no place on X for terrorist organisations or violent extremist groups and we continue to remove such accounts in real time, including through proactive efforts. X is also proportionately and effectively assessing and addressing identified fake and manipulated content during this constantly shifting crisis.”

 
