Metaverse Safety a Big Challenge as Apps Allow Kids Into Virtual Strip Clubs

The UK’s child protection charity NSPCC has warned some metaverse apps are dangerous by design after investigations found apps that allowed kids into virtual strip clubs.

Quick take:

  • A researcher posing as a 13-year-old girl gained entry into virtual strip clubs in the metaverse.
  • The BBC ran an investigation into what goes on inside the 3D virtual world, uncovering dangerous truths.
  • The UK’s child protection charity NSPCC has warned that some apps in the metaverse are dangerous by design.

Metaverse safety is becoming a huge bottleneck for industry adoption amid increasing cases of sexual harassment and applications that fail to take responsibility for child protection.

In a recent exposé, a researcher posing as a 13-year-old girl found strip clubs littered with sexual paraphernalia inside a metaverse app with a minimum age rating of 13.

The researcher also said that a group of men began approaching her while she was visiting one of the rooms in the app.

The revelation comes after the BBC ran an investigation to find out what goes on in the 3D virtual world. The UK’s National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity, has warned that some metaverse apps are dangerous by design.

The report also comes after the UK recently introduced the Online Safety Bill (OSB), which is meant to protect its residents even in the metaverse.

The NSPCC said it was “shocked and angry” at the findings, with its head of online child safety policy, Andy Burrows, adding that the investigation had uncovered “a toxic combination of risks”.

“It’s children being exposed to entirely inappropriate, really incredibly harmful experiences,” he said, adding that the product is dangerous by design because of oversight and neglect.

“We are seeing products rolled out without any suggestion that safety has been considered.”

Meta recently announced that it had introduced a personal boundary feature to its metaverse platform to prevent sexual harassment. The feature allows users to create a boundary that feels like a four-foot distance between their avatar and others in the virtual world.

The company also said it has added tools that allow players to block other users from invading their space.

However, one of the reasons the metaverse has gained popularity recently is its capacity to promote interaction between users, whether in gaming or social spaces.

Some of the content in those spaces is created by the community, which means users can create almost anything given the freedom they are afforded. Minimal or absent moderation therefore adds to the challenge.

Still, some apps are looking to set stronger boundaries to keep out all forms of harassment.

VRChat told the BBC in an interview following the expose that it was working hard to make its platform safe, adding “predatory and toxic behaviour has no place on the platform”.
