In 2025, we will enter a new era of safety by design in our digital worlds.
Billions of people around the world gather in online gaming spaces to play, connect with others, and unwind. Yet these platforms are also notorious breeding grounds for harassment, hate speech, and covert manipulation that can escalate into violent or sexually exploitative behavior. Today, a majority of online gamers report having been directly targeted by, or having witnessed, harassment, toxicity, and other unacceptable conduct: 82 percent say they have personally experienced it, while 88 percent have witnessed “toxic” behavior in their gaming communities. Recent studies find that sexual harassment and hate speech are disturbingly common, with 70 percent or more of players reporting that they have witnessed such incidents while playing online.
In extreme cases, gamers suffer egregious violations of their privacy and fundamental rights, such as when personal information is deliberately published online to intimidate them – a practice known as doxxing. A controversy in early 2024 surrounding Sweet Baby Inc., a small narrative design consultancy, showed how far this can go. The consultancy’s employees, accused of pushing a “woke agenda” in the video game industry, faced a torrent of threatening messages containing violent sexual references and death threats, raising fears for their personal safety.
Several factors explain why video games and hate and harassment so often go hand in hand. One pressing problem is the scarcity of industry-wide knowledge sharing. Games are rarely part of regulatory conversations about online safety, and while proprietary knowledge is understandably sensitive, companies seldom talk openly about the harms occurring on their platforms or the safety challenges they face. Video games are, at their core, businesses that must keep players entertained; airing your vulnerabilities to shareholders won’t necessarily win their support.
Despite this lingering complacency, 2025 will be the year a concerted push to prioritize player safety finally takes hold across the industry. Some of these changes will come directly from regulators. Video games have traditionally been absent from regulatory debates, but several recent initiatives now reach the industry. In the European Union, gaming companies are required to publish publicly accessible reports detailing how they mitigate online harms and how effective those measures are. For the first time, this will allow methods to be examined, and their efficacy assessed, across the entire gaming industry.
In 2025, the games industry’s own efforts at self-regulation may also start to yield tangible results. Over the past few years, the industry has driven numerous trust-and-safety initiatives that take an ecosystem-wide approach. In 2024, the Thriving in Video Games Group released a suite of educational resources, including step-by-step guides and academic materials for game developers who want to build more resilient online communities and navigate the complex issues of trust, safety, and player engagement. The program also covers content moderation, community management, and guidance on building teams, fostering trust, and cultivating prosocial behavior within gaming communities.
In a landmark development, a collaboration between Epic Games and the International Age Rating Coalition established globally recognized age ratings for all user-generated content in Fortnite. Historically, user-generated content has carried no official ratings, forcing players to rely on incomplete signals such as titles, icons, and descriptions to judge whether something is suitable for a given age group. A rating system for user-generated content lets players, and parents, make better-informed decisions about what and how they play. In 2025, game developers are expected to follow guidelines that empower users to make informed choices about the safety and appropriateness of the billions of pieces of user-generated content available to them.
Safer by design does not mean risk-free: hate, harassment, and other social harms will persist online in some form. But in 2025 the online gaming industry is finally expected to put more comprehensive, integrated safety measures in place to protect players from social harm. As the world’s largest digital entertainment sector, gaming is long overdue for a shift in focus from innovation alone to the safety and well-being of its players. Expect 2025 to be the year that sets a new benchmark for a safer virtual world.