
AI in ASIA
Business

Google Cuts Contractors, Search Quality Plummets

Google's massive contractor cuts leave thousands jobless while search quality plummets, revealing the hidden human cost of AI automation

Intelligence Desk · 4 min read

AI Snapshot

The TL;DR: what matters, fast.

Google terminated its contract with Appen, cutting thousands of data training workers who earned as little as two cents per task

Tech sector cut 45,363 jobs in early 2026, with US accounting for 68% of global reductions

Search quality declining as human oversight diminishes and AI automation increases

Google's Contractor Cuts Highlight Human Cost of AI Automation

Google's decision to terminate its contract with data training firm Appen has left thousands of workers jobless and coincides with reports of declining search quality. The move underscores growing tensions between AI automation and human labour, particularly in Asia-Pacific markets where tech workers face mounting uncertainty.

The termination affects workers who spent years refining Google's search algorithms through manual data training. Many received as little as two cents per task, according to a Wired investigation that described conditions as "digital slavery."

Mass Tech Layoffs Accelerate Into 2026

The broader tech sector continues aggressive workforce reductions as companies pivot towards AI-driven operations. Recent data reveals the scale of this shift across global markets.


Tech companies are using AI more but trusting human oversight less, creating a paradox where automation increases whilst quality concerns mount. This trend particularly affects contract workers in data training roles who lack traditional employment protections.

The timing of Google's contract termination raises questions about whether human expertise remains valued in AI development. Workers previously responsible for training search algorithms now find themselves replaced by automated systems.

By The Numbers

  • Tech companies announced 45,363 global job cuts in early 2026, with 68% occurring in the United States
  • US-based tech employers cut over 33,000 positions from January to February 2026, up 51% year-on-year
  • 51,330 tech employees impacted across 132 layoff events so far in 2026, averaging 870 job losses daily
  • 244,851 tech sector jobs eliminated in 2025, with momentum continuing into 2026
  • Appen workers reportedly earned as little as two cents per training task before contract termination

Search Quality Concerns Mount as Human Oversight Diminishes

Independent studies suggest Google's search quality has declined following workforce reductions and increased reliance on automated systems. The correlation between human contractor cuts and performance issues raises fundamental questions about AI's readiness to replace human judgement.

"The human workers behind AI are the canaries in the coal mine. Google's decision lacks transparency and fails to provide severance benefits for those who helped build their systems," said Toni Allen, Executive Board Secretary of the Alphabet Workers Union.

Workers who trained Google's algorithms possessed nuanced understanding of search intent and cultural context. Their removal potentially impacts the system's ability to distinguish between reliable and unreliable information sources.

The Asian market shows particular scepticism towards AI replacement of human roles, with workers expressing concerns about job security and system reliability.

Ethical Implications of AI Labour Displacement

The Appen situation exemplifies broader ethical challenges in AI development. Companies benefit from human-trained systems whilst simultaneously eliminating the human element that created their competitive advantage.

Contract workers face particular vulnerability as they lack employment protections available to full-time staff. Many Appen workers operated as independent contractors without access to unemployment benefits or severance packages.

"In 2025, automation, artificial intelligence, and sustained cost-discipline measures drove much of the downsizing, with entire departments restructured or eliminated in favour of leaner, AI-assisted workflows. This trend has continued full steam into 2026," said Alan Cohen, Analyst at RationalFX.

Industry observers note that Google's AI agents are set to transform work by 2026, but question whether current systems can maintain quality without human oversight.

Period     | Tech Job Cuts  | Google Contract Changes   | Search Quality Reports
2024       | 165,000+ cuts  | Initial Appen reductions  | Stable performance
2025       | 244,851 cuts   | Full contract termination | Quality decline reported
2026 YTD   | 51,330+ cuts   | Automation rollout        | Ongoing concerns

Industry Response and Future Implications

The controversy highlights fundamental questions about sustainable AI development practices. Critics argue that companies benefit from human training data whilst abandoning the workers who created that value.

Key concerns include:

  • Lack of transparency in contract termination decisions affecting thousands of workers
  • Absence of retraining programmes for displaced data workers
  • Potential quality degradation as human oversight diminishes
  • Ethical implications of "digital slavery" conditions in AI training roles
  • Long-term sustainability of AI systems without human feedback loops

Labour advocates push for stronger protections for contract workers in AI development. They argue that current practices create unsustainable race-to-the-bottom dynamics that ultimately harm both workers and system quality.

Google's own leadership has described elements of the AI boom as "irrational", suggesting internal recognition of unsustainable practices. However, competitive pressures continue driving workforce reductions across the sector.

What led to Google ending its contract with Appen?

Google terminated Appen's contract as part of broader AI automation initiatives and cost-cutting measures. The decision affects thousands of workers who previously refined search algorithms through manual data training tasks.

How does this impact Google's search quality?

Independent studies report declining Google search quality coinciding with reduced human oversight. The correlation suggests automated systems struggle to replicate nuanced human judgement in content evaluation and search intent understanding.

What conditions did Appen workers face?

Workers reportedly earned as little as two cents per training task, with one describing conditions as "digital slavery." Many operated as independent contractors without traditional employment protections or benefits.

Are other tech companies making similar cuts?

Yes, the tech sector eliminated 244,851 jobs in 2025, with over 51,000 additional cuts in early 2026. Companies cite AI automation and cost discipline as primary drivers for workforce reductions.

What protections exist for contract workers in AI development?

Limited protections exist for contract workers, who typically lack access to unemployment benefits, severance packages, or retraining programmes. Labour advocates push for stronger safeguards in AI development roles.

The AIinASIA View: Google's Appen termination represents a concerning trend where companies exploit human labour to train AI systems, then abandon those same workers when automation becomes viable. This approach risks both ethical violations and quality degradation. We believe sustainable AI development requires ongoing human collaboration, not replacement. Companies must balance automation benefits with worker dignity and system reliability. The correlation between workforce cuts and search quality decline suggests that premature human removal damages the very systems these workers helped create. Responsible AI development demands better.

The Google-Appen situation serves as a cautionary tale for the broader AI industry. As automation advances, companies must balance efficiency gains with ethical obligations to human workers and system quality maintenance.

The question remains whether current AI systems can sustain quality without the human feedback loops that created their initial success. What role should human workers play in our AI-driven future, and how can we ensure their contributions are valued rather than exploited? Drop your take in the comments below.


This is a developing story

We're tracking this across Asia-Pacific and may update with new developments, follow-ups and regional context.



Latest Comments (5)

Oliver Thompson (@olivert) · 24 January 2026

Two pence per task is a rather rum deal, isn't it? We've seen similar pressure on rates for annotators here, especially for more nuanced financial datasets. If you squeeze the pipeline too hard on price, the quality invariably goes pear-shaped. Google shouldn't be surprised.

Maria Reyes (@mariar) · 8 April 2024

This whole Google-Appen situation really shows how important fair labor practices are, even in AI. The "digital slavery" bit is pretty jarring, especially since so many tech jobs in places like Manila are about leveraging AI for growth. We're seeing AI help a lot with financial inclusion here, but if the foundation is built on underpaid workers, it just undermines all that progress. It's not just about the tech, it's about making sure everyone benefits.

Lakshmi Reddy (@lakshmi.r) · 25 March 2024

The Appen worker's "digital slavery" comment really stuck with me. We're seeing similar discussions in India when it comes to data annotation for regional language models. The push for scale often overlooks researcher concerns about ethical sourcing and fair compensation for these critical human inputs into AI. It's a systemic issue.

Ji-hoon Kim (@jihoonk) · 18 March 2024

The Appen situation just makes me think about data sovereignty. If these training tasks were happening on-device with federated learning, Google wouldn't have these outsourcing issues to begin with. Way more control over the data and the worker conditions. That's the real ethical approach.

Maria Reyes (@mariar) · 19 February 2024

It's a shame that Appen workers were apparently paid so little for those training tasks. Here in the Philippines, I've seen how fair wages for data labelling can really uplift communities. It just proves that AI development needs to be done right, with people's well-being at the core, not just cost-cutting.
