Roblox Updates Child-Safety Policies
Roblox Corporation recently announced changes to its child-safety policies, aiming to better protect its younger user base. The move follows ongoing accusations that the platform has failed to safeguard children, as well as a report describing Roblox as a dangerous environment for minors.
Earlier this year, Bloomberg uncovered more than 13,000 incidents of child exploitation on Roblox, which led to multiple arrests related to in-game abuse of minors. Then, in August, Hindenburg Research released a report alleging that Roblox had been inflating its player numbers and presenting evidence of inadequate safety measures for children.
Despite denying these allegations, Roblox has now implemented several changes to prioritize child safety. According to The Verge, children under 13 will no longer have access to chat features by default; parents must grant permission to enable them. Additionally, children under nine will require parental consent to join experiences with a “moderate” content rating, which may include mild violence or humor. These restrictions lift automatically as users reach specific age milestones, unless a parent or the user adjusts them manually.
Roblox also plans to introduce a new account type for parents, allowing them to monitor and control their child’s online activity. These updates build on the platform’s stated commitment to creating a safer online space for its young users; guidelines introduced in July already required creators to classify in-game experiences by content rather than age.
Roblox’s presence in the gaming industry has been contentious, with accusations of exploiting young developers and questionable hiring practices. Statements by Roblox Studio head Stefano Corazza about teenagers working on the platform raised concerns, despite the company’s clarification that it does not hire minors. The platform continues to navigate complex issues surrounding child safety and ethical practices.