Trust and safety: Leaders from Roblox and EA say Web3 has learned from past lessons
With the metaverse and the influx of UGC, it's time for the game industry to prove that it's learned its lessons on running safe communities.
The gaming industry has a justly earned reputation for ugly behavior from users, from hate groups to grooming and illegal goods. With the metaverse and the influx of user-generated content, there's a whole new avenue for harmful content and for people intent on causing harm.
But along with this new stage of gaming and technology comes an opportunity to do things differently, and do them better, particularly when it comes to trust and safety for minors. During GamesBeat Summit Next, leaders in the trust, safety and community space came together in a panel sponsored by trust and safety solution company ActiveFence to discuss where the responsibility for a safer metaverse lies: with creators and platforms, developers and guardians.
Safety has to be a three-legged stool, said Tami Bhaumik, VP of civility and partnerships at Roblox. There has to be responsibility from a platform standpoint, as with Roblox, which provides safety tools. And because democratization and UGC are the future, platforms also have a vested interest in empowering creators and developers to create the cultural nuance for the experiences they're developing. The third leg of the stool is government regulation.
"But I also believe that regulation has to be evidence-based," she said. "It has to be based on facts and a collaboration with industry, versus a lot of the sensationalized headlines you read out there that make a lot of these regulators and legislators write legislation that's far off, and is quite frankly a detriment to everyone."
Those headlines and that legislation tend to spring from instances where something slips through the cracks despite moderation, which happens often enough that some guardians are frustrated and don't feel listened to. It's a balancing act in the trenches, said Chris Norris, senior director of positive play at Electronic Arts.
"We obviously want to make policy clear. We want to make codes of conduct clear," he said. "At the same time, we also want to empower the community to be able to self-regulate. There need to be strong moderation layers as well. At the same time, I want to make sure that we're not being overly prescriptive about what happens in the space, especially in a world in which we want people to be able to express themselves."
Moderating enormous communities must come with the understanding that the size of the audience means that there are undoubtedly bad actors among the bunch, said Tomer Poran, VP of solution strategy at ActiveFence.
"Platforms can't stop all the bad guys, all the bad actors, all the bad activities," he said. "It's this situation where a best effort is what's demanded. The duty of care. Platforms are putting in the right programs, the right teams, the right functions inside their organization, the right capabilities, whether outsourced or in-house. If they have those in place, that's really what we as the public, the creator layer, the developer and creator layer, can expect from the platform."
One of the issues has been that too many parents and teachers don't even know that account restrictions and parental controls exist, and across platforms, uptake of parental controls is very low, Bhaumik said.
"That's a problem, because the technology companies in and of themselves have great intent," she said. "They have some of the smartest engineers working on innovation and technology in safety. But if they're not being used and there's not a basic education level, then there's always going to be a problem."
But whatever the community is, it's the platform's responsibility to manage it in accordance with that audience's preferences. Generally speaking, expecting G-rated behavior in an M-rated game doesn't fly very far, Norris said.
"And back to developers: how are you thoughtfully designing for the community you want, and how does that show up, whether it's in policy and code of conduct, or in game features or platform features?" he said. "Thinking about what this allows people to do, what the affordances are, and how those might impact the guardrails you're trying to set up as a function of policy and code of conduct."
In the end, safety shouldn't be a competitive advantage across the industry or across platforms, Norris added; these things should be table stakes.
"Generally in the video game industry, we've been an industry of 'don't.' Here are the five pages of things we don't want you to do," he said. "We haven't articulated: what do we want you to do? What sort of community do we want? How are we thinking about all the ways in which this medium can be social and connective and emotive for a lot of people?"