Following our earlier look at the fate of the Metaverse, my attention was caught by a new white paper from Interpol, looking at the potential threats posed there. The report argues the Metaverse is still in its infancy and there are many technical, social, and ethical challenges that need to be overcome before it can unlock its full potential:
The Metaverse is a complex system of various technologies creating synergies to provide an interactive, immersive, persistent, and enhanced user experience. As these technologies continue to advance, the Metaverse will expand and mature, meeting the needs of platform providers, users and other stakeholders.
Interpol goes on to warn of a whole raft of potential new criminal issues, identifying 41 different types of crime, including rape, sexual exploitation of children, deepfakes, financial scams, and identity theft:
Terrorists could misuse the Metaverse to receive financial support, which could lead to the commission of terror attacks. Adding to these industry perspectives, Interpol confirmed that some law enforcement agencies in the member countries have already received reports of crimes featuring the Metaverse, particularly related to financial crime. With its growing popularity, the list of crimes will only expand and challenge the police services to address these emerging criminal activities.
Indeed, the Metaverse has opened up opportunities for criminals to commit new types of crime, which can be referred to as “Metacrime”. Metacrime is a growing concern and could become a major issue as the immersive world becomes part of our daily life. In this context, it is essential for law enforcement to anticipate the challenges that may arise by listing various potential threats and identifying the gap areas including those in the legal frameworks to criminalize them.
Interpol wants to see police forces establish a base in the Metaverse:
Police departments can establish a virtual presence in the metaverse, offering services such as reporting crimes, filing complaints or even hosting virtual community meetings. This approach can make police services more accessible, especially for those with mobility issues or in remote areas… To address complex governance issues, assessing the application of existing national and international laws, conducting regular policy reviews to reduce the gaps, and devising future-proof policies are recommended. Recognizing that the Metaverse spans multiple jurisdictions, dimensions, and organizations, a holistic approach involving multi-stakeholder engagements and cross-border collaboration is pivotal for an effective law enforcement response to Metacrime.
Is it urgent?
How urgent is this issue, given the near-moribund state of the Metaverse right now? Actually, it’s pretty serious. In the UK, police are investigating a 'virtual rape' in the Metaverse after a teenage girl reported that her online avatar had been gang raped by strangers in a virtual reality game.
According to Julie Inman Grant, Australia’s eSafety Commissioner:
If you're in a hyper-realistic, high-sensory environment where interactions are happening in real time, then the harms that we see today, whether it's online bullying or misogynistic harassment, when you're wearing haptic suits or using tele-dynamics, there will be sexual uses to them. It's happening in real time. It feels like a sexual assault. There'll be a lot of the same harms, but it'll be more visceral, potentially extreme. And that's why safety-by-design, getting ahead of what these risks and harms are, and building in the potential safety protections at the front end before we all arrive at the Metaverse is so important.
Grant’s office has conducted research in Australia into the Metaverse. It found that only four percent of Australian adults are using the Metaverse today, three-quarters of whom are men under 40. Some 71% of them have experienced something negative in the Metaverse. This leads to a grim conclusion:
People will find infinitely creative ways to misuse technology and companies can build and try and engineer out misuse and we won't always be successful in doing that… I just worry about being able to remediate the harm that's happening in real time. It feels real, particularly if kids are wearing haptics… Technology can be great, but it needs to be balanced with interpersonal communication, with exercise, with sleep and all these things. Will it be harder for people to be able to distinguish between their virtual worlds and what the reality looks like? I think there are a lot of unanswered questions, frankly. I don't want to prognosticate and be thinking about all of the horrors of the Metaverse. I think there's gonna be a lot of benefits, but we need to go in with our eyes wide open.
The regulatory practicalities are different with the Metaverse, counsels Grant:
Putting the regulator hat on and looking into the future so we're not playing a big game of ‘Whack-a-Mole’, this is different. This isn't like regulating a social media service. It's not like the FDA regulating consumer devices. It's almost like thinking of urban design as an analogy to what we're thinking about [in terms of] safety-by-design. You don't build a city in the middle of a desert without thinking about where the streets are, where you put the boulevards, and where the sanitation and the sewage are going to be, because we want to design pleasant places. We plan for what the parks are. We want to make it liveable. We want to make it positive.
I think the creators of the Metaverse need to be thinking this way about these worlds they're creating. How do we as regulators really focus on making sure that they are doing this at the front end and not retrofitting safety protections after the damage has been done? The opportunity is now for countries to establish more online safety regulators, and that's happened a lot more frequently in 2023. Generative AI represented a real tipping point.
There is, she argues, an opportunity for the companies building the Metaverse to get this right:
I spent 22 years in the technology industry before I became a regulator and [with] the safety-by-design initiative we came up with six years ago, I knew that we needed to bring industry along on that journey and develop the principles together. What is service provider responsibility, and what does that look like? [In terms of] user empowerment and autonomy, the burden should not fall solely on the user. And then transparency and accountability - I think transparency and accountability obviously go hand in hand. We will be using some of our transparency powers to make sure that we're asking these companies today, ‘What are you doing to mitigate harms in the future in the Metaverse?’
We need to lift the lid or lift the hood to see what's happening underneath, and that's what I think is more challenging. Today we take for granted that we've got embedded seatbelts, airbags that will deploy, and anti-lock brakes. You're talking [with the Metaverse] about regulating math and algorithms and things that are very opaque, that you can't see. So we need to be engaging with the companies and understanding how they're approaching these things and how they're thinking about anticipating and engineering out the misuse.
There’s a need to learn from the mistakes of the past:
Having worked in the technology industry since 1995, we saw digital sovereignty and cloud computing sovereignty. We’re starting to see a degree of AI nationalism in terms of even how things like the EU AI Act [are] playing out, and we're starting to see a patchwork of regulation around the globe in different domains. I am not an expert on supply chains, but what we're envisioning here are global borderless and boundary-less worlds. By nature, laws and regulation are national, so how do we prevent that fragmentation?…I don’t think politically we'll ever achieve total synergistic regulations across the globe, but we can try and establish a greater degree of regulatory coherence so that we can allow these technologies to thrive, we can solve real problems and prevent harms from happening in the future.
For the industry, Nicola Mendelsohn, Head of the Global Business Unit at Meta, insists her firm is playing its part in this ongoing debate:
Safety and security have been fundamental to Meta as a company and it's a responsibility that we know we have, a responsibility that we've invested billions of dollars in, over $20 billion over recent years. We have over 40,000 people working in this area at the company, so it matters to us and we're investing significantly. We're not waiting for regulation. From the earliest days of building out, we've been making sure that we've been putting products in place so that we can really get ahead.
This is an industry-wide issue, but from our side we can get ahead of some of the things that we can look around the corners and see… We've created something called Personal Boundary, so if you don't want people to come near you, it gives you an imagined six feet of space between you and the other person. We've also created blocking features, as well as reporting features.
We very much want to work hand-in-hand with regulators around the world to make sure that we're developing thoughtful advice, guidance, guidelines and ultimately regulation. We're very happy to work towards that because it is important and we want to ensure that people coming into our worlds, or people that are building worlds, are doing so in a safe and secure way that really matters.
A crucially important topic if the Metaverse is ever to become a reality. The point was made last week by Mendelsohn that the Meta vision is a long-term project, at least a decade away from realization. That’s a deceptively comforting length of time. Work on a regulatory regime needs to get underway today, and it needs to involve all kinds of stakeholders. This can’t be allowed to go the way of social media, with order retro-fitted far too late in the day.