    EA's AI Push: A Cautionary Tale for Game Studios

    As the video-game sector grapples with rising costs, falling demand and a post-COVID reckoning, game publishers such as EA are doubling down on generative AI as a cure-all. But while the promise of EA's AI strategy is high, the reality is far messier.

    Anonymous
    5 min read · 31 October 2025

    AI Snapshot

    The TL;DR: what matters, fast.

    EA mandated its 15,000 staff to use AI tools for nearly all tasks, from code generation to concept art.

    Despite the mandate, employees report AI tools produce flawed output and "hallucinations," increasing workload and negative sentiment.

    The AI push at EA reflects a broader industry trend of job displacement and a mismatch between executive AI enthusiasm and developer concerns about creative work and tool immaturity.

    Who should pay attention: Game studio executives | AI ethicists | Employees

    What changes next: Debate is likely to intensify regarding AI implementation versus employee well-being.

    The mandate: AI everywhere

    Right, so EA has reportedly told its roughly 15,000 staff to use AI tools for "just about everything". And I mean everything – from code generation to concept art, even down to how managers communicate with their teams. According to staff who spoke to Business Insider, internal documents show employees being put through AI training courses, asked to treat AI as a "thought partner", and encouraged to feed their own work into AI systems to drive further automation. On the surface, this might seem quite forward-thinking. But the operational fallout? Well, that suggests otherwise.

    The breakdown: flawed output and increased workload

    Despite all the hype, the tools aren't actually delivering. Staff claim the AI tools are churning out flawed code and "hallucinations", which then need manual correction. One ex-senior QA employee at EA reckons he was made redundant in spring 2025 because AI was already summarising play-tester feedback – the very job he'd been doing.

    Making matters worse, internal sentiment is pretty negative. Employees are taking the mick out of the AI directives on Slack, suspecting the whole push is more about cost-cutting than genuine productivity gains.

    In short: rather than reducing work, the AI mandate is actually creating new work, shuffling anxieties around and undermining trust.

    The trust gap: executives vs teams

    Here's a revealing stat for you: 87% of executives say they use AI daily. At the worker level? Just 27%. At EA, senior management may believe the AI switch will bring the next wave of innovation. CEO and Chairman Andrew Wilson said last year:

    "AI in its different forms has always been central to this creative journey... this remarkable technology is... the very core of our business."

    Yet for many developers and creatives, the experience is quite different. When the people actually doing the work don't trust the tools, you've got a recipe for friction.

    Creative work vs automation: a mismatch

    Several developers have pointed out that tasks involving high personalisation, identity or creativity are poorly served by generic generative AI. As MIT Sloan's Jackson Lu observes, when "work is highly personalised, identity-laden or creative, employees want a human in the loop" (MIT Sloan Review). At a games company where art, narrative, gameplay feel and community connection matter deeply, the promise of AI sitting in as a "thought partner" may simply not align with how the creators themselves understand the work.

    Broader implications for the games industry

    The EA case is emblematic of a larger wave of stress across games development. The industry has seen massive layoffs: an estimated 45,000 jobs lost between 2022 and 2025. The push to adopt generative AI in development isn't just about efficiency – it's also about contingency planning when budgets shrink and headcounts are cut.

    But the risk is two-fold:

    1. Quality risk: When developers are forced to rely on immature tools, flawed output can hurt game quality, brand trust and player experience.
    2. Human-capital risk: If teams feel they're automating themselves out of work, morale and retention will suffer. The creative talent that defines many games may simply walk away.

    In EA's own SEC filing, the firm acknowledged that AI "might present social and ethical issues... which may result in legal and reputational harm, cause consumers to lose confidence... and negatively impact our financial and operating results." That's not just boilerplate – it suggests EA itself recognises the stakes.

    What does this mean for Asia-Pacific?

    Asia-Pacific has a burgeoning games sector, from mobile gaming in Southeast Asia to console and PC markets in Japan, Korea, China and Australia. The pressure to adopt AI will be felt here too. Some of the lessons from EA's missteps apply directly:

    1. Tool maturity matters: For studios in Singapore, Vietnam, India or Indonesia, adopting generative AI isn't just about placing a tech bet. It's about making sure the tool fits the workflow. Otherwise you're introducing inefficiencies, not removing them.
    2. Creative differentiation counts: Many APAC studios thrive by delivering culturally specific narratives or art styles – these are harder to automate than "generic" assets. For more insights on regional AI trends, see APAC AI in 2026: 4 Trends You Need To Know.
    3. Talent risk looms: With global talent competition heating up, a studio that forces creatives to "teach the AI to do their job" may lose them to more human-centric employers. This echoes the broader discussion on What Every Worker Needs to Answer: What Is Your Non-Machine Premium?.

    What can studios (including EA) do instead of pushing blind adoption?

    1. Start small with clear scope: Identify low-risk, highly repeatable tasks (asset rendering, QA logs) rather than trying to use AI for everything.
    2. Human-in-the-loop design: Insist on humans retaining control over creative decisions, with AI assisting rather than replacing – a minimal sketch of this pattern follows below.
    3. Train transparently: If workers are feeding their own work into models, make sure expectations and outcomes are clear and fair.
    4. Measure real value: Track not just "how many AI tools are used" but actual outcomes – fewer bugs, faster cycles, higher engagement.
    5. Foster trust and dialogue: Engage teams and let them express concerns rather than mandating tools from the top down. A similar sentiment around trust and ethics is explored in We Need Empathy and Trust in the World of AI.
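    To make points 1 and 2 concrete, here is a minimal Python sketch of a human-in-the-loop review queue for AI-drafted playtest-log summaries. Everything in it is hypothetical – TriageItem, the stubbed summariser and the console_review callback are illustrative names, not EA's internal tooling – and the model call is deliberately faked so the example stays self-contained. The point is simply that the AI drafts, a human decides, and the decision gets recorded.

```python
# Hypothetical human-in-the-loop triage queue for AI-drafted playtest summaries.
# Nothing here reflects EA's actual tooling; the "model" is a stub so the
# example runs end to end without external services.

from dataclasses import dataclass
from enum import Enum
from typing import Callable, Dict, List, Optional, Tuple


class ReviewDecision(Enum):
    APPROVED = "approved"   # human accepted the AI draft as-is
    EDITED = "edited"       # human rewrote part of the draft
    REJECTED = "rejected"   # draft discarded, handled manually


@dataclass
class TriageItem:
    log_id: str
    raw_log: str
    ai_summary: str = ""
    final_summary: str = ""
    decision: Optional[ReviewDecision] = None


def stub_summarise(raw_log: str) -> str:
    """Stand-in for a real model call: a naive first-two-lines 'summary'."""
    return " / ".join(raw_log.splitlines()[:2])


def triage(logs: Dict[str, str],
           summarise: Callable[[str], str],
           review: Callable[[TriageItem], Tuple[ReviewDecision, str]]) -> List[TriageItem]:
    """AI drafts each summary, but nothing is final until a human signs off."""
    items: List[TriageItem] = []
    for log_id, raw in logs.items():
        item = TriageItem(log_id=log_id, raw_log=raw, ai_summary=summarise(raw))
        item.decision, item.final_summary = review(item)  # the human stays in control
        items.append(item)
    return items


if __name__ == "__main__":
    sample_logs = {
        "PT-001": "Player fell through the floor in level 3.\nRepro rate roughly 40%.",
        "PT-002": "Frame drops during the boss fight on mid-range GPUs.\nSeverity: high.",
    }

    def console_review(item: TriageItem) -> Tuple[ReviewDecision, str]:
        # A real pipeline would surface this in a reviewer UI; auto-approving here
        # only keeps the sketch runnable, it is not the recommended behaviour.
        return ReviewDecision.APPROVED, item.ai_summary

    for result in triage(sample_logs, stub_summarise, console_review):
        print(result.log_id, result.decision.value, "->", result.final_summary)
```

    The same shape extends to any "start small" task: swap the stub for a real model call, keep the review step, and log the decisions so the "measure real value" point has actual data behind it.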

    Is there cautious optimism?

    EA's bold attempt to make generative AI a core driver of game development is, at present, a cautionary tale. The technology remains promising, the ambition is understandable, but the rollout is showing cracks. Flawed output, increased workload, compromised morale and a trust deficit raise serious questions.

    For game studios in Asia and beyond, this offers a lesson: integrate AI, yes, but integrate thoughtfully, respectfully, and step by step. Because when the "dogs" won't eat the "dog food", even the most advanced tech won't save the day.


    Latest Comments (4)

    Jasmine Koh @jkoh_tech
    24 November 2025

    Wah, this totally resonates here in our local dev scene. Everyone's chasing that AI dream to cut costs, but it feels like a bit of a gamble, innit?

    Patricia Ho @pat_ho_ai
    19 November 2025

    Agree wholeheartedly. It's a proper slippery slope, innit? Seems like every company, not just games, is throwing AI at every bleedin' problem hoping it'll magically fix everything without actually looking at the root causes. Especially after the pandemic, everyone's scrambling.

    Zachary Chia @zachchia
    7 November 2025

    It's interesting how EA reckons AI is a silver bullet for their cost issues, innit? I wonder if they've really thought through the potential blowback from gamers if they start cutting corners on human creativity. We've seen how finicky players can get.

    Amit Chandra @amit_c_tech
    6 November 2025

    This is a really insightful read. It’s not just EA, is it? We're seeing this across industries, this quick rush to AI to solve deep-rooted problems. Feels like a bit of a mirage sometimes, promising a lot but delivering... well, we'll see. The ground reality often bites harder than boardroom projections.
