Gaming Industry Faces Historic Walkout Over AI Rights
Over 2,600 video game performers represented by SAG-AFTRA have launched a strike against major gaming companies, marking a pivotal moment in the industry's battle over artificial intelligence protections. The walkout, which began on 26 July 2024, affects industry giants including Activision, Disney, Electronic Arts, Warner Bros, and Take-Two Interactive.
The labour action was authorised by an overwhelming 98.32% of union members, highlighting widespread concern about AI's impact on performer rights. This strike encompasses voice actors, motion-capture performers, stunt coordinators, singers, and puppeteers across the $200 billion gaming industry.
By The Numbers
- 2,600+ performers participating in the strike action
- 98.32% member approval rate for strike authorisation
- 160,000 total SAG-AFTRA members covered by the strike order
- 24 out of 25 contract proposals already agreed upon by both sides
- $200 billion estimated value of the global gaming industry
AI Consent Takes Centre Stage
At the heart of the dispute lies performers' demands for meaningful consent and fair compensation when their voices, faces, and bodies are replicated using AI technology. The union argues that current proposals from gaming companies fail to provide adequate protections against unauthorised digital reproduction.
"AI poses an equal or even greater danger to video game performers than to film and TV actors. Companies have not offered sufficient AI protections in contract negotiations," a SAG-AFTRA spokesperson said.
The strike affects all aspects of game production involving human performance, from character voices to motion capture work. This could potentially delay upcoming releases and impact the development timeline for major gaming titles currently in production.
Additional demands include wage increases to match inflation, enhanced safety measures for on-camera performances, improved rest periods to prevent vocal strain, and the presence of set medics for hazardous work. These concerns reflect broader labour issues as AI lowers the barriers to building and monetising games, fundamentally changing the industry landscape.
Industry Response and Historical Context
Gaming companies have expressed disappointment with the strike decision, noting that agreements have been reached on 24 of 25 contract proposals. They claim to have offered "meaningful AI protections" including consent requirements and fair compensation structures.
This isn't the industry's first major labour dispute. In 2016-2017, video game voice actors conducted a nearly year-long strike, marking the first such action in gaming history. That previous walkout focused on vocal stress, workplace safety, and compensation structures, with mixed results for performers.
"The resolution of the 2016-2017 strike led to some improvements in working conditions and pay for voice actors, but certain issues remained unresolved, particularly regarding residual payments," said an industry analyst at the Video Game Labour Research Institute.
The current action follows the 2023 Hollywood writers' and actors' strikes, which similarly centred on AI concerns and set important precedents for creative industry labour relations.
| Strike Period | Duration | Primary Issues | Outcome |
|---|---|---|---|
| 2016-2017 | 11 months | Vocal stress, safety, compensation | Mixed improvements |
| 2024-Present | Ongoing | AI protections, consent, fair compensation | TBD |
AI's Double-Edged Impact on Gaming
Artificial intelligence has revolutionised game development, enabling more realistic characters, complex environments, and dynamic storytelling. However, with surveys suggesting around half of game makers view generative AI negatively, the technology's implementation raises significant ethical and legal questions.
The gaming industry's AI adoption has accelerated rapidly, with companies exploring everything from procedural content generation to AI-powered non-player characters. Yet this technological advancement comes at a potential cost to human performers who fear their work could be replicated without consent or compensation.
Key concerns include:
- Unauthorised replication of performer likenesses and voices
- Potential replacement of background and supporting roles
- Loss of traditional career progression pathways for actors
- Inadequate compensation for AI training data derived from performances
- Lack of ongoing consent mechanisms for future AI applications
The strike's outcome could establish crucial precedents for how major publishers' AI initiatives, such as EA's push into generative tools, balance technological innovation with performer rights. Companies are increasingly investing in AI capabilities, with some viewing it as essential for remaining competitive in the rapidly evolving market.
Broader Industry Implications
This labour dispute extends beyond immediate contract negotiations, touching on fundamental questions about creative ownership in the digital age. The gaming industry's approach to AI regulation could influence similar discussions across entertainment sectors.
Extended negotiations risk disrupting game development schedules, potentially affecting major releases planned for the coming year. Voice acting and motion capture work are integral to modern gaming, making this strike particularly impactful for story-driven titles.
The resolution may also influence how other creative industries approach AI integration, particularly as parallel disputes over AI-generated music play out between major labels and AI startups.
What specific AI protections are performers demanding?
Performers want explicit consent requirements before their voices, faces, or bodies can be used to train AI systems or generate new content. They also seek fair compensation for any AI-generated work based on their performances and ongoing royalties for continued use.
How might this strike affect upcoming game releases?
The strike could delay games requiring new voice acting or motion capture work. However, titles with completed performance work may proceed normally. The impact depends on each project's development stage and reliance on SAG-AFTRA performers.
Are there precedents for AI protections in other entertainment industries?
Yes, the 2023 Hollywood strikes resulted in AI protection clauses for film and television performers. These agreements require consent for AI replication and include compensation structures that gaming performers now seek to emulate in their contracts.
What role does AI currently play in video game development?
AI assists with character creation, environment generation, dialogue systems, and gameplay mechanics. However, human performers remain essential for voice acting, motion capture, and creating authentic character performances that resonate with players emotionally.
Could this strike spread to other regions or industries?
The strike's outcome may influence similar labour actions globally, particularly in regions with strong gaming industries like Asia. Success could embolden other creative unions to demand similar AI protections across entertainment sectors.
The video game industry stands at a crossroads between technological advancement and creative rights protection. As negotiations continue, the resolution of this strike could define how AI and human creativity coexist in interactive entertainment. The precedents set here will likely influence not just gaming, but creative industries across Asia and beyond.
What's your perspective on balancing AI innovation with performer rights in the gaming industry? Drop your take in the comments below.
Latest Comments (4)
whoa, this is hitting super close to home. just shipped a new AI voice model for my indie game and now i'm thinking about how to properly credit and compensate even the tiny voice clips. need to build out some better policies for this, especially working with international talent from here in bali.
At Tokopedia, we've been using AI for product recommendations for a while now. The idea of using performer data for games without consent just feels like it misses the mark on ethics.
The part about consent and fair compensation for AI use of voices and likenesses really hits home. As a UX researcher, I'm always thinking about the user's agency. If we're talking about actors' digital selves, how do we ensure that they still feel like they own their own identity in the digital space?
interesting seeing this play out, especially as we're building LLM tutors and thinking about voice synthesis. for these performers, how do they plan to verify if their voice/likeness is AI-generated from their original work, or if it's new content created by a different model trained on public data? that feels like a real enforcement challenge.