Australian regulator demands information on terrorism content efforts from major social media platforms.

Australia’s eSafety Commissioner has issued legal notices to several prominent social media platforms, including YouTube, X, Facebook, Telegram, and Reddit, demanding detailed information about their efforts to combat terrorism-related content.

The regulator is concerned that the platforms are doing too little to curb the spread of extremist material, particularly through features such as live-streaming, algorithms, and recommendation systems, and wants stronger safeguards against extremist groups using the services to recruit users.

Since 2022, the regulator has been empowered to compel large tech firms to divulge information regarding the prevalence of illegal content and the efficacy of their strategies to mitigate its dissemination. Non-compliance can lead to substantial fines.

Commissioner Julie Inman Grant identified Telegram as the platform most used by violent extremist groups for radicalization and recruitment. Telegram, which a 2022 OECD report ranked first for the frequency of terrorist and violent extremist content, did not respond to Reuters’ requests for comment.

In an interview, Inman Grant stressed the urgency of obtaining satisfactory responses from the platforms, while acknowledging questions about their capacity to address the notices. She said the commission is prepared to pursue enforcement measures, including fines, to ensure compliance.

YouTube, ranked second for hosting violent extremist content, was singled out because its recommendation algorithms can amplify propaganda and draw users, overtly or subtly, into extremist ideologies.

According to Inman Grant, the content treated as terrorist material spans responses to global conflicts such as Ukraine and Gaza, violent conspiracy theories, and misogynistic narratives that incite real-world violence against women.

While the regulator’s previous notices targeted issues such as child abuse material and hate speech, the current focus on anti-terrorism efforts poses a more intricate challenge because of the diverse range of content and the ways it is amplified, she noted.

Elon Musk’s X received the regulator’s first such fine in 2023, a $386,000 penalty over its handling of child abuse content, which the company is contesting in court.

In its latest round of legal notices, sent on Monday, the commission extended its scrutiny to Telegram and Reddit for the first time. The commissioner cited the case of a white supremacist who carried out a fatal attack on Black individuals in Buffalo, New York, in 2022 and attributed his radicalization partly to Reddit.

Neither X nor YouTube owner Alphabet immediately responded to requests for comment, while a spokesperson for Facebook owner Meta said the company is reviewing the commission’s notices and reiterated its commitment to combating terrorism and extremism on its platforms.

Similarly, a Reddit spokesperson said the platform has zero tolerance for terrorism content and is willing to work with the eSafety Commissioner to improve the detection, removal, and prevention of harmful content.