The Oasis Consortium wants the internet to be safe.

Oasis Consortium unveils user-safety standards for an ‘ethical internet’

The Oasis Consortium, a nonprofit devoted to making the internet safer, launched its principles for online user safety today.

Tiffany Xingyu Wang, president of Oasis, created the organization to gather thought leaders across social media, gaming, and dating to accelerate the development of an “ethical internet.” The group has unveiled its operating principles, dubbed User Safety Standards for our Digital Future. (Wang will be a speaker at our GamesBeat Summit: Into the Metaverse 2 online event on January 26-27).

On the one-year anniversary of the Capitol insurrection, the group is focusing on existential threats and wants to put safety guardrails at the core of online communication. This will only become more important as industries shift toward Web 3 (the decentralized internet) and the metaverse, the group said.

The Standards are the first output of the think tank, launched in August to establish and popularize a new digital sustainability model for business in a Web 3 world. The consortium members include Riot Games, Pandora, The Meet Group, and others.

The members worked in concert to create the best practices, drawing on hundreds of conversations over several months with professionals across gaming, dating, and social apps.

The Oasis Consortium’s highlights on user safety.

The group suggests that companies conduct an internal user safety assessment to measure performance and identify ways to improve. It also said companies will be able to earn a certification for Oasis Digital Sustainability in User Safety.

Companies should recognize user safety as a company-wide initiative with an executive-level champion who is accountable for both vision and execution. It should be reflected in product design and company workflows, and it should have a budget.

“User safety is a challenge that continues to evolve, along with user behaviors, world events, and technical capabilities,” the group said.

Oasis also said companies should develop a living roadmap for continued iteration and improvement, and they should plan for user safety proactively rather than reactively.

Policies should be based on representation, learning, and wellness. The team that develops policies and enforcement must be diverse in every dimension, especially in social background, the group said.

It noted that moderators and other employees are exposed to the worst of humanity under strict productivity goals. Companies should provide resources and design programs to protect and improve the wellness of their teams.

Companies should also be aware of both local and international regulations that require them to understand, record, and report what they're doing on user safety. To help, companies will often need to seek outside opinions or partners on trust and safety. That helps with accountability.

Origins

Tiffany Xingyu Wang is president of the Oasis Consortium.

Wang said in an interview with GamesBeat that preparation for the consortium began around August 2020. The consortium formally got off the ground in August 2021 and began work on the standards.

The intention was always to go for something larger than just gaming, which is already served by organizations like the Fair Play Alliance, Wang said. She noted that if tech platforms do not address fundamental issues of user safety, then it falls to others, like brands or those who actually implement the guardrails, to help. User safety is also a bigger topic than the anti-toxicity issues that games most often need to address.

“Our intention is to make this available to as many companies as possible,” Wang said.

Wang said the organization’s pillars are safety by design, privacy by design, and inclusion by design. That’s because problems like harassment and hate speech are rooted in a lack of consideration for inclusion.

And there are challenges in balancing these different pillars. I can think of one example. Activision recently launched its Ricochet anti-cheat technology for Call of Duty: Vanguard. It helps reduce rampant cheating by identifying the computers used to cheat and banning them, not just banning individual accounts that can be quickly replaced with new ones.

Activision does this by using a high level of access to the computer’s operating system to verify whether a machine has been used to cheat before. But that requires a level of trust and disclosure, as such deep access can also be a privacy concern. Wang agreed that if you want to battle toxicity, you have to be careful to do it in a privacy-preserving way.
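As a loose illustration of that tension, a privacy-preserving machine ban could store only keyed hashes of hardware identifiers rather than the identifiers themselves, so the ban list can't be reversed into device data. This is a hypothetical sketch, not Activision's actual implementation; the key, identifier, and ban list below are all invented for the example.

```python
import hashlib
import hmac

# Hypothetical sketch: a privacy-preserving hardware-ban check.
# The service keeps only keyed hashes of machine identifiers, never the
# raw identifiers, so the stored ban list reveals nothing about devices.
# SERVER_KEY, the example IDs, and BANNED_HASHES are assumptions for
# illustration; none of this reflects Ricochet's real design.

SERVER_KEY = b"server-side secret key"  # would live only on the server


def fingerprint(machine_id: str) -> str:
    """Return a keyed hash of a machine identifier."""
    return hmac.new(SERVER_KEY, machine_id.encode(), hashlib.sha256).hexdigest()


# The ban list stores hashes only.
BANNED_HASHES = {
    fingerprint("example-banned-machine-id"),
}


def is_machine_banned(machine_id: str) -> bool:
    """Check a machine against the ban list without persisting raw IDs."""
    return fingerprint(machine_id) in BANNED_HASHES


if __name__ == "__main__":
    print(is_machine_banned("example-banned-machine-id"))  # True
    print(is_machine_banned("some-other-machine-id"))      # False
```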

“We are actively standing up a privacy advisory board,” she said.

Oasis is also starting to work with partners and other like-minded organizations. And it will determine how to proceed with its certification and auditing process.

Dean Takahashi
