Roblox tightens safety measures: is it enough for children?
Weeks after new research labelled the gaming platform “a pedophile hellscape for kids”, Roblox has announced new safeguards to protect its youngest users from the riskiest games. While this is a positive step, more needs to be done to keep children safe on the platform and to ensure compliance with the law.
Under the new system, Roblox says accounts self-reported as belonging to users under the age of 13 will be unable to search for certain user-uploaded games. These include games without age ratings and those classified as featuring higher-risk or harder-to-moderate elements, such as on-screen drawing, as well as “social hangouts” devoted primarily to text and voice chat without additional role-playing elements. These changes are likely intended to curb the sexual conduct and contact taking place in “social hangout” games, and the abuse of on-screen drawing to create inappropriate messages, both of which were documented in the research reports discussed below. We note that these experiences will still be available to players over 13, and that ratings will not restrict what users do within games. This latter point remains an ongoing concern: during our own testing we found, among other things, users wearing player-uploaded cosmetic items bearing racial slurs, and instances of inappropriately sexualised interactions with other players.
Additionally, Roblox has announced that by December 3rd creators will have to complete a content-rating questionnaire for any game intended for users under 13. Games without a rating will become unsearchable and unplayable for younger users, the company says. The ratings take into account factors such as realistic violence, references to alcohol, “crude” humour and romantic themes, with published guidance helping creators determine where on the scale their content lies. According to this guidance, creators who label their games inaccurately may have their ratings removed, but it is unclear how often or how rigorously this will be checked.
These changes arrive on the back of two exposés by Bloomberg and Hindenburg Research, which uncovered significant issues around inappropriate content and predators using Roblox to groom child users, among other things. As early as 2022, the BBC had reported that some children felt unsafe or unhappy on Roblox, and that Childline was receiving an increasing number of calls related to the platform.
“These changes are good, if long overdue. It is astonishing that a platform targeted at children did not, until now, require even self-assessed age ratings. This is a positive first step. Let’s hope it does not require another media scandal for the next one.”
Leanda Barrington-Leach, Executive Director, 5Rights Foundation.
5Rights welcomes these changes, which ensure some basic consideration of the age-appropriateness of games on the platform, improve transparency for children and parents, and limit how age-inappropriate games are promoted. We strongly encourage Roblox to go further to adequately respect children’s rights and to comply with legislation including the UK Age Appropriate Design Code and the EU Digital Services Act, by implementing stronger measures to ensure that its services, and the games hosted on the platform, are safe and appropriate for its child users. This includes addressing the deceptive nudging that pressures children to buy and spend in-game currency, improving the efficacy of text filters, and integrating safety-by-design principles throughout the user experience.