Parents Sue Roblox Over Alleged Failure to Protect Children from Explicit Content

Following in the footsteps of civic groups taking legal stands against social media and tech giants, a group of families has filed a class-action lawsuit against gaming platform Roblox.

The lawsuit accuses Roblox Corporation of “negligently misrepresenting and falsely advertising” the platform, marketing it to children while exposing minor users to inappropriate or explicit content and allowing them to engage in inappropriate encounters. The families also accuse the company of misleading parents into spending thousands of dollars on the site through unclear pricing and by allowing children to make in-game purchases with the platform’s virtual currency, “Robux.”

“Parents who would never let their kids use TikTok don’t think twice about allowing them to use Roblox, even though what they encounter on Roblox could be much more harmful. The platform’s popularity soared during the pandemic as parents desperately sought social interaction for their children. But Roblox has lingered in spaces designed for children,” wrote Alexandra Walsh, founder of one of the law firms behind the filing. The lawsuit points to the site’s allegedly inadequate filtering and moderation policies as enabling this behavior.


This is the second lawsuit Roblox has faced in recent months, after a complaint filed in Northern California accused the company of illegally facilitating children’s gambling through in-game purchases.

On November 16, Roblox released a statement regarding the latest lawsuit: “We dispute the allegations and will respond in court. Roblox is committed to providing a positive and safe experience for people of all ages. We have a team of thousands of experts dedicated to moderation and safety on Roblox 24 hours a day, 7 days a week, and we promptly block inappropriate content or behavior when detected, including sexually explicit content that violates our community standards.”

The families involved in the lawsuit claim their children were shown nude avatars, avatars having sex and using sex toys, and virtual strip clubs, all within virtual spaces designed for children.

In 2022, Meta’s virtual reality platform Horizon Worlds came under criticism for potentially exposing young users to hate speech and harassment, prompting the company to introduce protective features for younger users’ accounts.

What you need to know about Roblox and children’s online safety

Although Roblox has evolved over time into a destination for adult gamers as well, courting a wider audience through brand and celebrity collaborations and even hinting at virtual experiences aimed at adults, it is still widely regarded as a children’s gaming platform. The demographic breakdown skews heavily toward Gen Z, with gamers 13 and older making up more than half of the platform’s 70 million users.

Part of the free site’s popularity is due to its customizability, with millions of games and virtual worlds (or “experiences”) that users can play and use as places to meet friends. But the platform has long been criticized for failing to adequately moderate those offerings. Common Sense Media, a nonprofit children’s media watchdog, issued a warning to parents in 2022, noting the prevalence of violence, strong language, and even sexual and racist content. Common Sense Media also notes that young users may encounter “ODers,” or “online daters,” seeking romantic and sexual encounters.

To address the growing concerns, Roblox has introduced several security features such as monitoring groups, chat filtering, password-locked pages, and experience and spending limits, as well as additional resources for parents and guardians. The platform also requires children under 18 to obtain parental consent before creating an account, and messages and chats from users 12 and under are automatically filtered for inappropriate content and personal information.

Common Sense Media has published its own parent’s guide to Roblox, and other safety groups have released similar guides to Roblox parental controls and general online safety. Setup recommendations include:

  1. Enable “Account Restrictions” for children’s accounts.

  2. Limit or disable chat settings.

  3. Enable the Allowed Features setting to limit young users to only features that are appropriate for their age.

While parental controls may reassure concerned guardians in the short term, many advocates (and the plaintiffs now taking tech giants to court) see maintaining safe online spaces for children as the companies’ own core responsibility.

And user-side safety controls can only go so far, especially as Roblox expands its engagement model with a new 1:1 chat feature, Roblox Connect, launched just this week.

“Roblox is an amazingly popular platform, on par with Barbie and Lego, as a one-word brand that inspires trust among parents and educators. But this trust is misplaced and completely undeserved,” wrote Anne Andrews, a partner at Andrews & Thornton, one of the law firms involved in the case. “Roblox says it does everything it can to keep children safe, but its systems for monitoring inappropriate behavior often fail, and the platform makes it nearly impossible for parents to monitor, track, and quantify where and how children spend their money.”

