FILE PHOTO: A boy poses for a photo while holding a smartphone in front of a screen displaying a character of the U.S. children's video game Roblox, in this illustration taken December 8, 2025. REUTERS/Ramil Sitdikov/Illustration/File Photo

Australia asks Roblox, Minecraft to detail child safety measures

By Renju Jose

SYDNEY, April 22 (Reuters) – Australia’s internet regulator on Wednesday asked online gaming platforms including Roblox and Microsoft’s Minecraft to spell out how they protect children from grooming by sexual predators and shield young users from radicalisation.
The eSafety Commissioner said it had issued legally enforceable transparency notices to Roblox, Minecraft, Epic Games’ Fortnite and Valve’s Steam, seeking details on their safety systems, staffing and measures aligned with cybersecurity protocols.

Companies must respond to the notices, with failure to comply exposing them to penalties of up to A$825,000 ($590,783) a day. They usually have 30 days to respond to compliance notices from Australian regulators.

eSafety Commissioner Julie Inman Grant said gaming-related services, including encrypted messaging, can become the first point of contact between children and offenders involved in grooming, sexual extortion and radicalisation.

“What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services,” Inman Grant said in a statement.

She said gaming platforms also function as major social spaces for children, noting nine in 10 Australians aged 8 to 17 have played online games.

“Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms,” she said.

Microsoft said it was reviewing the regulator’s notice and took children’s online safety seriously.

“We continue to evolve our approach to meet the evolving threat and regulatory landscape,” a spokesperson said by email.

Roblox said it uses artificial intelligence to review and block content that incites or glorifies extremist groups or individuals, and swiftly removes violent content.

“While no system is perfect, our commitment to safety never ends, and we will continue to collaborate closely with eSafety on our shared goal of keeping Australian children safe,” a Roblox spokesperson said.

The move comes amid rising scrutiny of how gaming platforms detect and prevent online threats to minors, particularly as real-time chats with unknown users on some platforms can be harder for automated systems to police than traditional social media.

On Tuesday, Roblox reached settlements with the U.S. states of Alabama and West Virginia over allegations it failed to protect young users, agreeing to pay more than $23 million and make changes to how children access its chat and gaming features.

Roblox is facing more than 140 lawsuits in U.S. federal courts accusing the company of knowingly facilitating child sexual exploitation.

As it grapples with the legal issues, Roblox last week said it would introduce tailored accounts for younger users from June, assigning children aged 5 to 8 to “Roblox Kids” and users aged 9 to 15 to “Roblox Select.”  

($1 = 1.3965 Australian dollars)

(Reporting by Renju Jose in Sydney; Editing by Chris Reese and Kate Mayberry)