The Australian Government’s eSafety office has formally issued transparency notices to Roblox, Microsoft, Epic, and Valve. The agency is demanding that these gaming giants provide specific details on the measures they are taking to prevent grooming and radicalisation within their digital ecosystems.
The eSafety office, an independent agency established in 2015, was originally created to combat youth cyberbullying and the distribution of child sexual abuse material. However, its mandate has since expanded to protect all Australians from a wide spectrum of online risks.
The Growing Threat to Australian Children
The issuance of these legally enforceable transparency notices follows ongoing concerns that platforms like Roblox, Minecraft, Fortnite, and Steam are being exploited by bad actors. Specifically, the agency is concerned about sexual predators using these spaces for grooming and extremist groups using them to spread violent propaganda.
“What we often see [is that] after these offenders make contact with children in online game environments, they then move children to private messaging services,” stated eSafety Commissioner Julie Inman Grant in a published statement.
The scale of the issue is significant due to the massive footprint of gaming in Australia:
- Approximately 9 in 10 children aged 8 to 17 in Australia play online games.
- Gaming platforms function as vital social hubs for communication and socialisation.
- Predatory adults are actively targeting these spaces to embed terrorist or violent extremist narratives into gameplay.
Major Platforms Under Scrutiny
Inman Grant highlighted several media reports detailing how specific platforms have been used to host harmful content. The agency is looking for concrete actions from developers to prevent grooming and radicalisation across the following titles:
- Roblox: Reports of Islamic State-inspired games and recreations of mass shootings.
- Minecraft: Use of far-right imagery and fascist-themed gameplay by extremist groups.
- Fortnite: Content featuring WWII concentration camps and recreations of the January 6, 2021, US Capitol Building riot.
- Steam: Identified as a potential hub for various extreme-right communities.
While Valve has previously faced scrutiny regarding "tens of thousands of groups" amplifying Nazi and hate-based content, no specific new examples were noted in this latest notice.
Compliance, Penalties, and the Response from Roblox
The eSafety office has made it clear that compliance with these transparency reporting notices is mandatory. Companies that fail to respond face penalties of up to AUD$825,000 per day.
In a response provided to IGN, Roblox outlined the specific safety measures they currently employ to protect their users. A company spokesperson stated: "Roblox has policies that strictly prohibit content or behaviour that incites, condones, supports, glorifies, or promotes any terrorist or extremist organisation or individual."
To further secure their platform, Roblox is implementing several technological and structural changes:
- Advanced AI technology is used to review all images, text, and avatar items before they are published.
- The company works regularly with law enforcement and civil society groups to counter violent extremism.
- New age-based accounts for children under the age of 16 will be introduced soon to align communication settings and parental controls with a user’s age.
Roblox emphasised that while no system is perfect, their commitment to safety is ongoing, and they will continue to collaborate with eSafety to keep Australian children safe.