Update: Roblox and Epic have responded to RPS' requests for comment about the Australian government's transparency notices.
"We welcome engagement with eSafety on this important topic," a Roblox spokesperson said. "Roblox has policies that strictly prohibit content or behaviour that incites, condones, supports, glorifies, or promotes any terrorist or extremist organisation or individual, which we work tirelessly to enforce. We swiftly remove such content and take immediate account level action when we find it. We also use advanced AI technology to review all images, text, and avatar items prior to publishing, in order to prevent known extremist iconography from being published. We encourage anyone who sees anything concerning on Roblox to report it to us. Our team works regularly with law enforcement, civil society groups, and other organisations with specific subject matter expertise in countering those who would seek to promote violent extremism.
"Last week, we announced that Roblox will soon introduce new age-based accounts for children under the age of 16. These accounts will more closely align content access, communication settings, and parental controls with a user’s age. While no system is perfect, our commitment to safety never ends, and we will continue to collaborate closely with eSafety on our shared goal of keeping Australian children safe."
Meanwhile, Cat McCormack - senior communications manager at Epic Games - stated that the publisher's rules "prohibit extremism, child endangerment, dangerous or illegal activities and threats of real world violence", noting that the specific Fortnite islands mentioned in the Australian government's press release had action taken against them in 2024.
"Epic’s text chat filters remove mature language including hate speech, and our systems automatically report potentially high-harm interactions in text chat with players under 18 so we can take action," they added. "Fortnite has built-in protections for younger players including high-privacy default settings for players under 18 and voice and text chat are off for players under 16 until a parent consents. Using Epic’s Parental Controls, parents can customise their family’s experience including choosing who their child can communicate with."
Original story follows:
Valve, Epic Games, Microsoft, and the Roblox Corporation have all been issued transparency notices by the Australian government's eSafety commissioner, with the body seeking to learn what steps are being taken to keep kids safe on Steam, Fortnite, Minecraft and Roblox. The Australian government say this step has been taken because, without action, all four platforms risk "becoming onramps to abuse, extremist violence, radicalisation or lifelong harm".
In a press release, Australia's eSafety commissioner Julie Inman Grant - an ex-global director of privacy and internet safety at Microsoft - wrote that predatory adults "target children through grooming or embedding terrorist and violent extremist narratives in gameplay."
"We’ve seen numerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay," she continued. "This includes Islamic State-inspired games and recreations of mass shootings on Roblox, as well as far right groups recreating fascist imagery in Minecraft.
"Media reports have also pointed to games in Fortnite gamifying the horrific events of the WWII Jasenovac concentration camp and the January 6th US Capitol Building riots, while Steam is reportedly a hub for a number of extreme-right communities."
So, the Australian government want to make sure the four companies "take meaningful steps to prevent their services becoming onramps to abuse, extremist violence, radicalisation or lifelong harm".
I've asked Valve, Epic Games, Microsoft, and the Roblox Corporation for comment. It's worth noting that some steps have already been taken by these companies in the face of criticism over these issues. For example, Roblox Corp have moved to limit access to social hangouts and unrated games for children aged under 13 and brought in selfie-based "facial age estimation technology" in recent years, with the goal of keeping young players safer.
We'll see if any new measures come out of this. But judging by the Australian government seeking transparency above all else, the most likely outcome is that the companies simply send them lengthy rundowns of all the stuff they've already put in place (or plan to going forwards) to tackle grooming and extremism.
