Trust & Safety Leaders Share Their Hopes and Fears for a Changing Industry

I've spent a fascinating decade in the video games industry as a Trust & Safety professional. It's been an incredible journey filled with excitement, learning, and occasional stress. Trust & Safety practices — especially in this industry — are always evolving, so there has never been a dull moment.  

That will remain the same for the foreseeable future as the current advancements in AI and GenAI are truly ground-breaking.  

These advancements promise to transform our industry in ways we could not have envisioned just a short while ago, putting us on the brink of significant change.  

Below, you'll find insights gathered from industry colleagues, as well as my own thoughts, about what keeps us up at night and what excites us in online safety.

The opinions expressed in this post represent my views and those of other industry experts, some of whom wish to remain anonymous. They do not reflect the positions or policies of our respective companies.


Kaila Jarvis, Trust & Safety Global Advisor @ Keywords Studios

What excites you most about the future of Trust & Safety? 

I’m excited about the transformative potential of AI in content moderation.

This technology not only shields superhero moderators from exposure to harmful content online, but also efficiently categorizes and triages vast amounts of data. With that kind of efficiency, platforms can protect more players and moderators from harm, and save time, resources, and money. 

What keeps you up at night?  

The growing perception that AI can do it all, particularly important tasks like making key content moderation decisions that have real-world consequences. I'm concerned that humans will be removed from the moderation process, when it's crucial to have people validate the decisions made by AI and examine the models for any bias. Without human oversight, we face the danger of allowing AI to self-train on flawed decisions, overlooking the subtle nuances and context of human conversations.

My excitement about the potential of AI is matched by my fears!


Brent Wilkinson, Director of Customer Support and Trust & Safety @ Together Labs

What excites you most about the future of Trust & Safety? 

We lean pretty heavily into UGC on our platform, so the leaps and bounds that ML and AI are making in correctly detecting problematic content are astounding. Being more confident in these tools will help, especially when it comes to agent burnout. If we can trust the tools to find the right things 99.99% of the time, that means we expose our agents to less harmful content.

What keeps you up at night? 

Hand in hand with what excites me, it also worries me. As AI grows, people will find ways of "outsmarting" our systems. Distinguishing real CSAM from AI-generated material, or identifying AI-driven grooming incidents, is likely to become an increasingly thorny issue. Are we reporting a real person to NCMEC, or are we now dealing with a "grooming bot"?


Support Leader and industry vet at a new video game developer and publisher  

I think the thing I'm most excited for also keeps me up at night.  

That thing is the massive advances in AI and machine learning. I feel these advances will be both a boon and a bane for Trust & Safety. To me, it feels similar to the anti-cheat space: while these advances are going to help those trying to protect people, they're also going to be used as tools by those who want to harm. The two sides are always going to try to stay ahead of each other.


Sharon Fisher, Global Head of Trust & Safety @ Keywords Studios  

What excites you most about the future of Trust & Safety? 

I am thrilled to see that online safety is finally receiving the attention and action it deserves. For 10 years now, we’ve been advocating for the crucial role of a healthy community at the heart of gaming. Today, it’s gratifying to see the industry come together on initiatives like the Gaming Safety Coalition, AI+HI approaches to responsible moderation, content moderation legislation, and more. This industry collaboration gives me hope for the future of the internet, where no matter what side you play on — whether that’s technology, human moderation, law enforcement, or non-profits — we all win, even in the "real world".

What keeps you up at night?   

As a mother and Trust & Safety professional who understands the challenges of online safety better than most parents, I fear the lack of understanding and resources available to parents who don’t work in my field. In conversations where I am asked what I do for work, parents often express how little they know about the reality of online harms and, more concerning, how ill-equipped we are to educate our children and teens about online safety.  

I’m also worried about the use of AI for direct and soul-wrenching harassment, where AI is trained to understand and target users based on their online behaviour.  


Director at a AAA gaming studio 

What excites you most about the future of Trust & Safety? 

The acceleration of Community (capital C) in the Trust & Safety space. I absolutely love that so many people are talking, sharing information, and taking advantage of their networks. It’s such a great way to learn and grow, to commiserate, to explore different ways of approaching problems, etc. TSPA is just one way that’s happening, but it’s happened on multiple fronts. The #TSCollective is another great example. There’s just so much power that comes from being open and vulnerable with each other. 

What keeps you up at night? 

The pace of technological change in content creation is astonishing.

From image generation to voice simulation, almost everyone has access to powerful tools to create incredible content — and harm. It’s an abuse vector that we’re just starting to see make its way into gaming. And I say “just” knowing that abusive behaviors with content (e.g., nonconsensual image sharing or revenge porn) and frauds and fakes have been a thing for years. But we’re only at the beginning; it’s going to challenge all of us in T&S in so many ways. But that’s exciting, too.


Want to share your thoughts? Let’s connect at TrustCon!  

Heading into TrustCon, there is clearly a shared sense of excitement and apprehension regarding the role of AI in Trust & Safety. It's thrilling to think about AI shielding our superhero moderators from exposure to harmful content and efficiently sifting through massive amounts of data. It could mean a complete transformation of online safety. 

But this rapid pace of AI advancement brings its own set of concerns. Will AI evolve too quickly for our comfort, taking over roles that should always be left to human judgment?  

The advancements are empowering. But they demand a vigilant and, above all, responsible approach to ensure that the technology we use to keep people safe online remains a force for good.

That’s why I’m so excited to attend #TrustCon this year. It’s an opportunity to share our collective enthusiasm and tackle our worries, together as professionals. If you're feeling a mix of optimism and concern about the future of Trust & Safety, I'm eager to exchange thoughts — maybe we can find answers together?

Reach out on LI or email me at [email protected]

Leah MacDermid

Content Writer, Trust & Safety @ Keywords Studios


Great article, Kaila Jarvis! I share many of the same hopes and concerns. One thing that gives me a great deal of hope is how *much* we're talking about Trust & Safety these days. Today, there are far more Trust & Safety resources, best practices, and experts than I ever imagined possible when I joined the industry in 2007. 💙
