Presented by Modulate
Open discussion of harassment and toxicity in gaming is starting to break through. In this recent VB Spotlight, experts from The Gaming Safety Coalition share critical lessons on overcoming cross-platform moderation hurdles and creating a robust trust and safety plan that keeps players in the game — and explain why this work matters.
The Gaming Safety Coalition is a collaborative initiative between Keywords Studios, Modulate, ActiveFence and Take This to establish new benchmarks in community moderation, technological innovation and mental health advocacy in online gaming communities. Keywords Studios offers expertise on player-focused moderation teams, while Modulate offers prosocial voice intelligence technology, ActiveFence contributes its AI-powered Trust and Safety suite, and the non-profit Take This brings its research, training and community support capabilities in addressing mental health issues for both players and moderators.
In this VB Spotlight, experts from the members of the coalition came together to discuss the aims of the coalition, from the impetus behind the collaboration to the team’s goals and ways gaming companies can tackle the challenges in establishing safe and welcoming communities.
Game studios have recently struggled to talk honestly about toxicity and the challenges of crafting a safe space in the ecosystem, said Mike Pappas, CEO and co-founder of Modulate.
“There are a lot of reasons for that, but broadly speaking, if one single studio comes out and says, here’s the toxicity that we’re facing on the platform, more likely than not, they’re making themselves kind of a target for negative press and negative consumer sentiment,” he explained. “So it’s been very difficult for any single studio to really break that silence and get this conversation started. And to the degree that many studios have invested heavily in having that conversation with other studios and other members of the industry behind the scenes, there has been a lot of great discussion. It just hasn’t been out in the open as much as we’d all like it to be.”
Sharon Fisher, global head of trust and safety at Keywords Studios, reinforced the need for open discussion. “The Gaming Safety Coalition comes from a need for the whole gaming industry to talk more openly about toxicity and the challenges that we are all facing, drawing on the firsthand experience that we have in the industry with our clients,” she said. “What we have found is that we are in a very special position to bring these issues upfront, but also to have the community join and better understand how we can actually avoid getting to this level of challenges that we continue to face.”
The members of the coalition don’t represent any one studio — but as organizations that each work with an array of them, they see a cross-section of the industry, and a good sampling of the issues that game companies of every size and genre are facing. They’re also positioned to help open up a discussion between industry creators and those on the regulatory side of the table, especially as some overarching trends pick up speed, including a growing social element in gaming, an increase in diversity along a number of axes, a big boost in user-generated content and more.
“We operate in trust and safety across many different industries, doing everything from detection to tooling, all the way to intelligence, and gaming is entering a phase that social media was in a few years ago, when nobody was talking about these things,” said Tomer Poran, VP solution strategy & community at ActiveFence. “I think that we’re seeing it’s way beyond toxicity, right? Things are even more severe, like extremism, and grooming.”
As researchers and advocates in this space, they recognize that this undercurrent of extremism and targeted online harassment is expanding at a dangerous rate, said Eve Crevoshay, executive director at Take This.
“We know that games are better than that,” she said. “And so this is a moment when as we talk to the community, we see an opportunity to really advance best practices and to bring some of this stuff into the light. This is a real opportunity for us to come together holistically, to identify best practice and to share a set of standards over time that make gaming safer, that make gaming the place that we all want it to be.”
The elements of a successful safety strategy
First is understanding that when putting together any kind of safety strategy, you need a coherent vision of the kind of community you’re trying to build. To use a real-world analogy, a library is a quiet space; a children’s playground is not the kind of place to be using foul language; a sports bar is going to get a little rowdy. There’s an implicit code of conduct for each of these physical spaces — for online spaces, it’s very different, Pappas said.
“Games might have a code of conduct, but more often than not, the player never saw it or they scrolled down just to click through, or it was kind of legalese and they didn’t understand that,” he explained. “The way players learn today what’s expected of them in a certain gaming community is they just play and they see who they happen to have been matched with and what they learn from that. And some people learn good habits from that. Unfortunately, some people get matched with bad actors.”
So even before a studio talks about any of the critical tools necessary to keep a community safe, it needs to understand the kind of space they’re trying to build, and from there, the ways to remind players day after day what kind of space they’ve chosen to join — and reinforce those norms.
From top to bottom, the design of a community has to be safety-first, whether that’s writing up rules of conduct, deciding on ways to reinforce community standards or implementing human moderators and sophisticated technology tools, Poran added. Another area that ActiveFence deals with a lot is adversarial planning.
“Not all actors that do bad are bad inherently, but some actors arrive at your game in order to recruit, propagate, harass, groom, whatever it is,” Poran said. “How to differentiate, how to plan for all of those topics is among the things that we want to deal with.”
Fisher said the first priority for online community safety has been the health and happiness of the players, but the health of the humans on the front line is critical for successful safety programs as well.
“Part of bringing moderation into the 2024 era is also making sure that everybody involved in this process is taken care of,” she explained. “And specifically for us, it’s very important that our moderators slash superheroes have their wellbeing taken care of too, because at the end, they’re the ones that sometimes see the worst of the worst of the internet, and they save lives by doing this. Hence the term superhero.”
Take This sees substantial mental health challenges among content and community managers and moderators, Crevoshay added.
“We’re very concerned about how to bring tech to the table, especially AI, to mitigate some of that and to really support folks who are experts in developing safe online spaces,” she said. “And we also are trying to build a case using data and research about the ROI on trust and safety because we actually know that people stay in games because they’re safer. They spend more money in games because they feel more welcome there. And so we want to build that case so that everyone has the tools and has the data behind them to implement some really important tools and processes and to build that infrastructure in an effective way because players want a safer, less toxic space, and we want to build that for them.”
For more on the benefits of combatting toxicity, navigating the balance point between safety and privacy, and how the coalition is pooling resources, research and best practices, don’t miss this VB Spotlight event.
Agenda:
- The state of trust & safety in gaming today
- Content moderation best practices
- How to build and scale your content moderation team and policies
- How to protect your moderator team
- How the Gaming Safety Coalition can help studios navigate player protection and boost player engagement
Speakers:
- Sharon Fisher, Global Head of Trust and Safety, Keywords Studios
- Mike Pappas, CEO and Co-Founder, Modulate
- Tomer Poran, VP Solution Strategy & Community, ActiveFence
- Eve Crevoshay, Executive Director, Take This Inc.
- Rachel Kaser, Moderator, GamesBeat