
AI in ASIA
News

Siri is Getting Smarter

Apple's major Siri redesign has slipped to spring 2026, but its STEER technology promises a significant leap in conversational AI capabilities.

Intelligence Desk · 4 min read

AI Snapshot

The TL;DR: what matters, fast.

Apple delays Siri's major AI redesign until spring 2026 with iOS 26.4 due to architectural challenges

STEER technology will enable natural conversations and multi-turn queries like ChatGPT competitors

Delay gives Apple time to refine its AI capabilities while iPhone sales remain strong at $85 billion

Apple's AI Revolution Faces Another Delay

Apple's ambitious artificial intelligence overhaul has hit yet another roadblock. The company's highly anticipated Siri redesign, originally promised at WWDC 2024, won't arrive until spring 2026 with iOS 26.4. This represents a significant delay from the initial 2025 timeline, highlighting the complex challenges of integrating advanced AI capabilities into consumer devices.

The postponement stems from architectural issues and internal testing setbacks that have plagued the project. Despite these hurdles, Apple remains committed to delivering a fundamentally transformed digital assistant experience.

STEER Technology Powers Conversational Breakthrough

At the heart of Apple's AI strategy lies STEER (Semantic Turn Extension-Expansion Recognition), a sophisticated system designed to revolutionise how users interact with digital assistants. This technology enables Siri to understand follow-up questions and interpret ambiguous queries with human-like comprehension.


STEER represents Apple's answer to the conversational AI capabilities demonstrated by competitors like ChatGPT and Gemini. The system will allow users to engage in multi-turn conversations without constantly repeating context, making interactions feel more natural and intuitive.

The delayed rollout gives Apple additional time to refine STEER's capabilities, particularly in handling complex queries that span multiple applications and services. This aligns with broader industry trends explored in our analysis of Apple's AI ambitions across Asia.

By The Numbers

  • Siri's major AI overhaul delayed until spring 2026 with iOS 26.4 release
  • Full Siri revamp targeting iOS 27 later in 2026 with multi-step task capabilities
  • iPhone sales reached $85 billion despite AI delays, showing hardware demand remains strong
  • WWDC 2024 preview followed by extended development period due to architectural challenges
  • Apple's partnership with Google's Gemini AI model for enhanced Siri functionality confirmed

Beyond Voice: AI-Powered Visual and Audio Innovation

Apple's AI initiatives extend far beyond conversational improvements. The company is developing MGIE, an AI-powered image editing system that responds to natural language commands. Users could instruct their devices to "make the sky more blue" or "add some rocks" to photographs, democratising complex editing tasks.

The technology builds upon lessons learned from Google's recent advances in AI-powered image editing, but with Apple's characteristic focus on user privacy and on-device processing.

"Apple's restrained artificial intelligence strategy may pay off in 2026 amid the arrival of a revamped Siri and concerns around the AI market 'bubble' bursting," according to reporting by The Information, as cited by MacRumors.

Music remixing capabilities represent another frontier for Apple's AI research. The system can separate vocals from instruments in recorded tracks, enabling users to create personalised versions of their favourite songs. This technology could integrate seamlessly with Apple Music and GarageBand, offering creative tools previously reserved for professional producers.

Ferret: The Context-Aware AI Assistant

Perhaps Apple's most ambitious AI project is Ferret, a system designed to understand user context across applications and devices. Ferret can describe objects viewed through iPhone cameras, assist with app navigation, and interpret visual content for future devices like the Vision Pro.

This contextual awareness represents a significant leap from traditional voice assistants. Ferret aims to anticipate user needs based on current activities, location, and historical behaviour patterns.

"Apple's long-delayed AI-powered Siri upgrade remains on track for a 2026 debut," confirmed CEO Tim Cook during recent investor discussions.

The system's integration with Apple's broader ecosystem could create unprecedented personalisation opportunities. However, the company faces the challenge of balancing intelligent assistance with privacy protection, a cornerstone of its brand identity.

| AI Feature | Current Status | Expected Release | Primary Function |
| --- | --- | --- | --- |
| Enhanced Siri | In Development | iOS 26.4 (Spring 2026) | Conversational AI |
| STEER Technology | Testing Phase | 2026 | Context Understanding |
| MGIE Image Editing | Research Phase | TBD | Voice-Controlled Editing |
| Ferret System | Early Development | Post-2026 | Context-Aware Assistance |
| Music Remixing | Prototype | TBD | Audio Separation |

Privacy-First AI Architecture

Apple's approach to AI development prioritises on-device processing over cloud-based solutions. This strategy aligns with the company's privacy commitments but presents significant technical challenges. Processing large language models on mobile hardware requires innovative optimisation techniques.

The company's collaboration with Google's Gemini model for specific tasks represents a pragmatic compromise. Apple plans to run customised versions of Gemini on its own servers, maintaining control over user data while accessing advanced AI capabilities.

Key benefits of Apple's privacy-focused approach include:

  • Reduced data transmission to external servers
  • Faster response times for common queries
  • Enhanced user privacy protection
  • Offline functionality for core AI features
  • Integration with existing Apple security frameworks

The strategy also positions Apple advantageously as regulatory scrutiny of AI systems intensifies globally. Our coverage of Apple Intelligence developments in Asia highlights the growing importance of privacy-compliant AI solutions.

Biometric Data Integration and Health Applications

Apple's AI research extends into health and fitness applications through advanced biometric data interpretation. The Apple Watch and AirPods generate vast amounts of physiological data that AI systems could analyse for health insights.

Potential applications include early detection of health anomalies, personalised fitness recommendations, and stress management guidance. However, this sensitive data requires careful handling to maintain user trust and regulatory compliance.

The integration of AI with health data represents a significant opportunity for Apple to differentiate its ecosystem. Unlike purely software-based competitors, Apple controls the entire hardware-software stack, enabling deeper integration and more sophisticated analysis.

What makes Apple's AI strategy different from competitors?

Apple emphasises on-device processing and privacy protection, contrasting with cloud-heavy approaches from Google and OpenAI. This creates unique technical challenges but offers better privacy and offline functionality for users.

When will the new Siri features be available to users?

The major Siri overhaul is scheduled for spring 2026 with iOS 26.4, while the complete revamp including multi-step tasks will arrive with iOS 27 later in 2026.

How will Apple's AI improvements affect battery life?

On-device AI processing typically consumes more battery power than cloud-based solutions. Apple is likely developing optimised silicon and software to minimise this impact while maintaining performance.

Will Apple's AI features work in all languages and regions?

Initial rollouts typically focus on English-speaking markets before expanding to other languages and regions. Apple hasn't announced specific localisation timelines for the enhanced Siri features.

How does Apple's partnership with Google affect user privacy?

Apple plans to run customised versions of Google's Gemini model on its own servers rather than sending user data directly to Google, maintaining greater control over privacy and data handling.

The AIinASIA View: Apple's cautious approach to AI deployment reflects both prudent engineering and strategic positioning. While competitors rush feature-rich but potentially unstable AI systems to market, Apple's methodical development cycle could yield more reliable, privacy-compliant solutions. The 2026 timeline allows for thorough testing and optimisation, potentially avoiding the pitfalls that have plagued other AI rollouts. However, this conservative strategy risks ceding market leadership to more aggressive competitors, particularly in Asia where AI adoption accelerates rapidly. Apple's success will ultimately depend on whether users value polish and privacy over cutting-edge capabilities.

The intersection of AI advancement and user privacy will define the next phase of smartphone evolution. As our analysis of AI adoption among Asian professionals demonstrates, the market increasingly demands both intelligence and security. Apple's measured approach may prove prescient as regulatory frameworks mature and user sophistication grows.

Apple's AI journey reflects broader tensions between innovation speed and responsible deployment. The company's willingness to delay features for quality and privacy considerations sets it apart in an industry often criticised for rushing unfinished products to market. As these enhanced capabilities finally reach users in 2026, they could redefine expectations for what constitutes truly intelligent personal assistance.

What aspects of Apple's AI development timeline concern or excite you most? Drop your take in the comments below.


This is a developing story

We're tracking this across Asia-Pacific and may update with new developments, follow-ups and regional context.


Latest Comments (3)

Priya Ramasamy @priyaram
26 July 2024

The STEER project sounds promising for making digital assistants less frustrating, but I keep thinking about how much of this "human-like conversation" will actually translate to languages like Bahasa Malaysia. We still struggle with basic intent recognition for local dialects and slang in our current AI implementations. It feels like a lot of these advancements are still heavily geared towards English and Western contexts. I wonder if Apple is putting enough resources into localizing these complex conversational AI models for diverse markets like ours, where the nuances are just so different. Looking forward to seeing some real-world examples in Malaysia.

Elaine Ng @elaineng
19 July 2024

The "semantic turn expansion-recognition" aspect of STEER is particularly interesting. I'm curious how Apple plans to address the cultural nuances of "ambiguous queries" across different linguistic contexts. Will it be a one-size-fits-all approach, or will the system adapt based on region and language? I'm coming back to this idea.

Pierre Dubois @pierred
21 June 2024

Indeed, the notion of STEER for semantic turn extension is admirable. But the real challenge, from a research perspective, is not just understanding ambiguous queries, but consistently discerning true intent within a continuous, evolving dialogue. Many current LLMs struggle with sustained coherence beyond a few turns. It's a deep problem.
