The metaverse is composed of virtual worlds. However, those virtual worlds are built and occupied by real people. While they offer real utility, they also pose real risks, particularly given the capacity of XR hardware to collect data and the inability of regulators to keep up.
The XR Safety Initiative (XRSI) held a roundtable event, the XR Data Classification Roundtable, on International Human Rights Day last year, and recently published a detailed report of its findings and positions.
Where Does XR Data Go?
“People relinquish their data without realizing the risks or consequences. While this is not new, the difference today is we are moving towards an era of constant reality capture, especially with increased adoption of immersive technologies and a strong push to build the next iteration of the internet, also known as the Metaverse,” reads the “Virtual Worlds, Real Risks and Challenges” report.
These strong words from the report’s executive summary set the theme for the rest of the document: people endanger themselves when they hand over their data in exchange for experiences and tools offered by technologies that few of us really understand.
The report goes on to note that the “few of us” who really understand these technologies seldom include the legislators and regulators we rely on for protection. A stated goal of XRSI, the Roundtable, and this report was to explain the potential impacts of this data collection, including how the data is gathered and the contexts in which it can be used and shared.
“Most privacy laws and data protection principles of our times are going to be inadequate because they do not fully address the risks related to the processing of XR data, giving way to undermining human rights,” reads the report. In other words, even the privacy laws and data protection principles best equipped to deal with XR data are still insufficient.
Beyond the collection of data from the person using an experience, the report also raised the concern that immersive experiences can themselves be used to influence individuals.
Diving Into the Report
The bulk of the report discusses the unique promises and potential issues of XR technology in three key areas: medicine and healthcare, learning and education, and employment and work. As the goal of the paper is to warn of data collection through XR rather than of XR itself, each subsection opens with a statement to that effect.
Medical XR and Healthcare
“Medical XR offers the opportunity to collect an extensive range of data types. Some of these are not well understood and were traditionally limited to highly controlled laboratory contexts, but now will be widely available on all consumer devices in the hands of developers with skills ranging from novice to expert and the ethical considerations left to the personal experience of individuals.”
The ability of consumer headsets to conduct medical evaluations and therapies is promising in terms of accessibility of care. However, that data has to be captured through the headset itself and then shared, at minimum, between the individual and the practitioner.
It is imperative to an individual’s privacy, and potentially their safety, that headset manufacturers, app developers, network providers, and any other parties have restricted access, or no access at all, to that individual’s information. The report also advocates for security measures like encryption to prevent the information from being intercepted by others.
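The report does not specify an encryption scheme; as a purely illustrative sketch of the principle (a toy one-time-pad XOR cipher, not production cryptography, with all names and the sample reading invented here), the idea is that data is encrypted on the headset so intermediaries along the network see only ciphertext:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR: encrypts and decrypts (XOR is its own inverse).
    Only secure if the key is random, as long as the data, and never reused."""
    return bytes(d ^ k for d, k in zip(data, key))

# Hypothetical biometric reading captured on the headset.
reading = b"heart_rate=72;pupil_dilation=4.1mm"

# Key shared out-of-band between the individual and the practitioner only.
key = secrets.token_bytes(len(reading))

ciphertext = xor_cipher(reading, key)   # what manufacturers/networks could see
recovered = xor_cipher(ciphertext, key) # what the practitioner decrypts
```

A real deployment would use an established authenticated scheme rather than a pad like this; the sketch only shows why parties without the key learn nothing from the data in transit.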
The medical section of the report also highlighted the dangers of experiences for assessing or treating mental and emotional conditions through means that are potentially manipulative or traumatizing. For example, without proper guidance, an experience to cure a phobia could do significantly more harm than good.
Learning and Education
“Education is often seen as the model or forerunner to adopt, integrate, and explain new paradigms such as XR. Yet, it is clear that educators, students, administrators, and learning communities could face significant obstacles if they fully embrace XR consistent with privacy and security expectations while maintaining agency and autonomy in the learning process.”
While the discussion of learning and education was not limited to K-12 environments, it did necessarily include concerns regarding legal minors. This group is legally recognized as being more sensitive to certain experiences as well as less able to make informed decisions – again, in a field where very few adults have the information necessary to make safe decisions.
This section also expressed concerns that XR technology could be a driver of inequality in the event that some individuals or school systems can access XR technology and the advances that it brings while others are priced out – even as the hardware becomes increasingly affordable.
Employment and Work
“While companies benefit from improved worker communication and collaboration, streamlined training environment, reduced training costs, and higher engagement and knowledge retention, more personal data points may present risks and introduce ethical dilemmas.”
The professional landscape, like the medical landscape, is potentially more fraught with risk because companies can afford more expensive headsets capable of harvesting more sensitive data. While not specifically called out in the report, the HP Reverb G2 is marketed as being able to gauge a wearer’s “cognitive load” using biometrically inferred data.
Because XR in the enterprise space can be a hazard unto itself, and because it can have such an impact on a worker’s earning potential, the report also highlighted heightened diversity and inclusion concerns in this particular field.
“XR is likely to widen the gap between persons with or without disabilities and perpetuate historical bias and lack of inclusivity. As XR is developed and adopted into the workplace, entities and individuals must account for these issues since it is likely impossible or complicated to adapt XR retroactively,” states the report.
What Can We Do?
Just as XRSI and this report do not condemn XR technology wholesale, they also acknowledge that data collection is necessary both for the experiences themselves to work and for building larger metaverse-scale experiences. Their solution is to classify different kinds of information with different degrees of protection.
This is how most enterprises and institutions handle data, but the current models are already outdated, according to the report.
“The traditional classification does not transfer into XR since it does not accompany sufficient context and does not provide adequate guidance and protection for the data collection and processing needed for the metaverse. For this reason, each data point and each data set needs to accompany a specific context, and appropriate guidelines need to be established globally.”
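The report stops short of prescribing a concrete scheme, but the idea of each data point carrying its own context and sensitivity might be sketched as follows (all class names, labels, and the sample data point are hypothetical, not taken from the report):

```python
from dataclasses import dataclass
from enum import Enum

class Sensitivity(Enum):
    PUBLIC = 1     # e.g. an avatar's display name
    PERSONAL = 2   # e.g. usage habits
    BIOMETRIC = 3  # e.g. gaze, gait, heart rate

@dataclass(frozen=True)
class XRDataPoint:
    name: str
    sensitivity: Sensitivity
    context: str              # why and where it was collected
    shareable_with: tuple     # parties explicitly allowed to receive it

def may_share(point: XRDataPoint, party: str) -> bool:
    """Deny by default: only explicitly listed parties may receive the data."""
    return party in point.shareable_with

# A gaze sample collected during a (hypothetical) therapeutic session.
gaze = XRDataPoint(
    name="eye_tracking_sample",
    sensitivity=Sensitivity.BIOMETRIC,
    context="phobia-treatment session",
    shareable_with=("patient", "practitioner"),
)
```

The point of the design is that the context travels with the data, so a sharing decision like `may_share(gaze, "advertiser")` can be refused even if the same data type would be shareable in another context.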
Here the classic temptation creeps in for individuals to avoid involvement on the grounds that they are only individuals: if these things need to be established globally, we might as well sit tight until global entities figure it out – right? This is no excuse, as global entities consist of individuals. To at least some degree, that should include you.
“As a global community and as individuals, we have a role and a responsibility to ensure human rights are embedded and protected in the metaverse,” reads the report’s conclusion. “Everyone has a role to play in preventing the potentially grave violation of human rights and ensuring that everyone has the opportunity to partake in this new digital age safely.”
This article was originally published on arpost