A BBC News investigation has found that the Metaverse exposes children to “entirely inappropriate” and “incredibly harmful” sexual content.
The Metaverse, the much-hyped 3D virtual network dubbed the future of the internet, is a platform where people can exist and interact in brand-new worlds simply by donning a headset.
It even features a replica of the Kaaba and virtual embassies, and users can explore the virtual world with 3D avatars.
However, as it turns out, the virtual network is also a platform for sexual content.
Jess Sherwood, a BBC News researcher, made a disturbing discovery when she entered the Metaverse app VRChat with a Meta Quest headset, going undercover as a 13-year-old girl.
She had created a fake account, and neither her real identity nor her age was verified; all she needed for the fake profile was a Facebook account.
Inside the app, the researcher visited rooms where users encounter the avatars of other Metaverse users, with no separation by age.
There were all kinds of rooms, duplicating real-life places such as a McDonald's restaurant. Among them, however, were strip clubs and “rooms where avatars were simulating sex.”
The avatars could remove their clothes and engage in erotic role-play, and the virtual world featured sex toys as well.
Sherwood “witnessed grooming, sexual material, racist insults and a rape threat,” according to the BBC’s account.
An ‘unnerving’ experience
“Everything about the rooms feels unnerving. There are characters simulating sex acts on the floor in big groups, speaking to one another like children play-acting at being adult couples,” Sherwood said of her experience, as quoted by the BBC.
The rooms resembled Amsterdam’s red-light district, and the music, controlled by the players, completed the picture, Sherwood told the British broadcaster.
“VRChat definitely felt more like an adult's playground than a child's,” she said.
“It's very uncomfortable, and your options are to stay and watch, move on to another room where you might see something similar, or join in - which, on many occasions, I was instructed to do.”
Sherwood also reported that the disturbing experience psychologically felt as though it were happening to her in real life, with a potentially traumatic effect even on adults, let alone children.
'Very little moderation'
Following the BBC investigation, the UK children’s charity National Society for the Prevention of Cruelty to Children (NSPCC) urged improved safety measures on the virtual platform.
"It's children being exposed to entirely inappropriate, really incredibly harmful experiences," Andy Burrows, head of online child safety policy at the NSPCC, told the BBC.
He added that tech companies had learned little from their previous mistakes, continuing to roll out products that are “dangerous by design because of oversight and neglect” by the companies.
VRChat also responded to the BBC, saying it is working to make the app a “safe and welcoming place for everyone" where "predatory and toxic behaviour has no place.”
Bill Stillwell, the product manager for VR integrity at Meta, also said in a statement that they would “continue to make improvements as we learn more about how people interact in these spaces."
Meta also noted that there were tools available that allowed Metaverse users to block and report others.
Nevertheless, children will continue to face these dangers until Meta and apps like VRChat take the necessary measures.
Meanwhile, charities have urged parents to keep an eye on their children’s activities in the Metaverse: checking the apps installed on their VR headsets and even trying those apps themselves to see whether they contain any inappropriate or disturbing content.
The BBC also noted that many Metaverse apps allow users to display the visuals from a VR headset on other devices such as phones and laptops, which makes it possible for parents to supervise their children’s activity as they explore the Metaverse.