
AI in ASIA

Singapore AI Ditches Meta, Embraces Alibaba

AI Singapore abandons Meta's Llama for Alibaba's Qwen architecture, creating a powerful Southeast Asian-focused language model that dominates regional benchmarks.

Intelligence Desk • 4 min read

AI Snapshot

The TL;DR: what matters, fast.

AI Singapore switches from Meta's Llama to Alibaba's Qwen3-32B for Sea-Lion v4 language model

New model trained on 36 trillion tokens across 119 languages plus 100 billion Southeast Asian tokens

Qwen-Sea-Lion-v4 tops SEA-HELM benchmark among open-source models under 200B parameters


Singapore's Bold AI Pivot Signals New Chapter for Southeast Asian Language Models

AI Singapore (AISG) has made a strategic shift that's reverberating across the regional tech landscape, abandoning Meta's Llama architecture in favour of Alibaba Cloud's Qwen foundation for its flagship Sea-Lion large language model. The latest iteration, Qwen-Sea-Lion-v4, represents more than just a technical upgrade: it's a calculated bet on regional linguistic supremacy.

This move underscores Singapore's commitment to developing AI that truly understands Southeast Asian languages and cultural nuances. The collaboration between AISG and Alibaba Cloud demonstrates how national AI programmes are increasingly prioritising regional relevance over Western-centric models.

The Technical Powerhouse Behind the Switch

Built on Alibaba Cloud's Qwen3-32B foundation model, the new Sea-Lion variant boasts impressive credentials. The base model underwent pre-training on a staggering 36 trillion tokens spanning 119 languages and dialects, establishing a robust multilingual foundation that extends far beyond English-dominant training datasets.

Alibaba Cloud enhanced this foundation with over 100 billion Southeast Asian language tokens, whilst AISG contributed its regional datasets and handled the crucial evaluation phase. This division of labour played to each partner's strengths: Alibaba's computational resources and Qwen's proven architecture, combined with Singapore's deep understanding of regional linguistic patterns.

The model now excels at handling colloquial speech, mixed-language inputs, and specific ASEAN market requirements like translation tasks. It's designed to navigate the linguistic complexity of a region where code-switching between languages is commonplace in daily conversation.
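To illustrate what a code-switched request of this kind can look like, here is a minimal sketch using the generic role/content chat-message format that most open-weight LLM toolchains accept. The prompt text and the helper function are illustrative assumptions, not taken from the Sea-Lion documentation.

```python
# Illustrative only: a mixed Malay/English ("code-switched") translation request,
# expressed in the generic role/content chat format used by most open LLM runtimes.
# The example sentence and build_translation_request helper are hypothetical.

def build_translation_request(mixed_text: str, target_language: str) -> list[dict]:
    """Wrap a code-switched sentence in a chat-style message list."""
    return [
        {
            "role": "system",
            "content": "You are a translation assistant for Southeast Asian languages.",
        },
        {
            "role": "user",
            "content": f"Translate into {target_language}: {mixed_text}",
        },
    ]

# Everyday Singapore-style code-switching: Malay and English in one sentence.
messages = build_translation_request(
    "Boleh tolong check itinerary saya for next week?", "English"
)
for m in messages:
    print(m["role"], "->", m["content"])
```

A regionally trained model is expected to parse the mixed-language content directly, rather than requiring the caller to separate the languages first.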

By The Numbers

  • 36 trillion tokens across 119 languages in the Qwen3-32B base model
  • Over 100 billion Southeast Asian language tokens added for regional enhancement
  • Alibaba Cloud's Singapore hub supports over 5,000 businesses and 100,000 developers globally
  • Partnership targets training 100,000 AI professionals annually through collaborations with 120+ universities
  • Up to $250,000 in technical credits available for Southeast Asian applications via Model Studio

Performance That Commands Attention

Qwen-Sea-Lion-v4 isn't just another model release: it currently holds the top position among open-source models under 200 billion parameters on the Southeast Asian Holistic Evaluation of Language Models (SEA-HELM) benchmark.

This achievement is particularly significant because SEA-HELM specifically evaluates LLM proficiency in regional languages including Indonesian, Malay, Thai, Vietnamese, and Filipino. Topping this benchmark validates the strategic decision to prioritise regional linguistic competence over sheer model size.

"It embodies our shared vision of accelerating AI innovation across the region and ensuring that developers, enterprises, and public institutions have access to AI that is open, affordable, and locally relevant and is designed to truly understand the languages, cultures, and communities of this region," says Leslie Teo, Senior Director of AI Products at AI Singapore.

The model's accessibility adds another layer of appeal. Available as an open model through the AI Singapore website and Hugging Face hub, it includes lower-precision versions that can run on consumer hardware with 32GB of RAM. This democratisation of access aligns with Singapore's broader AI adoption initiatives, making advanced language capabilities available to smaller developers and organisations.

Strategic Implications for Regional AI Development

The switch from Meta to Alibaba reflects broader geopolitical and technological currents in Asia's AI landscape. Unlike Western models that often treat Asian languages as afterthoughts, this collaboration prioritises regional linguistic authenticity from the ground up.

"By combining the model's multilingual and reasoning strengths with AI Singapore's deep regional expertise, Qwen-SEA-LION-v4 demonstrates how open collaboration can make advanced AI more inclusive and locally relevant," explains Choong Hon Keat, General Manager of Alibaba Cloud Intelligence Singapore.

This partnership also highlights the intensifying competition for AI dominance in Asia. Whilst Meta's Llama family has gained significant traction globally, Alibaba's focused investment in Asian language capabilities is paying dividends. The move comes as Southeast Asia faces significant data challenges in AI development, making strategic partnerships like this increasingly valuable.

Singapore's approach could inspire other ASEAN nations to develop similar regionally-focused AI initiatives. The success of Qwen-Sea-Lion-v4 demonstrates that targeted, collaborative development can produce models that outperform larger, more generalised alternatives in specific contexts.

| Model Generation | Base Architecture | Regional Focus | Performance Benchmark |
| Sea-Lion v1-v3 | Meta Llama | Limited Southeast Asian | Standard multilingual |
| Qwen-Sea-Lion-v4 | Alibaba Qwen3-32B | Enhanced Southeast Asian | Top SEA-HELM ranking |

The collaboration extends beyond just model development. Alibaba Cloud's Singapore innovation hub, launched in July 2025, partners with Nanyang Technological University and Singapore University of Social Sciences to develop AI talent and solutions. This ecosystem approach suggests a long-term commitment to regional AI development that goes well beyond a single model release.

Open Access Driving Innovation

The open-source nature of Qwen-Sea-Lion-v4 represents a significant advantage in Asia's competitive AI landscape. Developers can download and modify the model freely, fostering innovation across the region's diverse startup ecosystem.

This accessibility is particularly important given Singapore's challenges with SME AI adoption. By providing a high-quality, regionally optimised model at no cost, AISG is removing one of the key barriers preventing smaller organisations from experimenting with advanced AI capabilities.

The model's efficiency also matters. Lower-precision versions enable deployment on modest hardware configurations, making it viable for organisations without massive computational budgets. This democratisation aligns with Singapore's broader strategy of empowering workers with AI tools.
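To see why lower-precision releases matter for a 32-billion-parameter model, a back-of-the-envelope weight-memory calculation helps. This is a rough sketch: the per-parameter byte counts are standard quantisation sizes, not figures from the Sea-Lion release, and activations plus KV cache add further overhead on top.

```python
# Rough weight-only memory footprint for a 32B-parameter model at common precisions.
# Activation and KV-cache memory come on top, so treat these as lower bounds.

PARAMS = 32e9  # Qwen3-32B parameter count

BYTES_PER_PARAM = {
    "fp16/bf16": 2.0,  # full-precision release
    "int8": 1.0,       # 8-bit quantisation
    "int4": 0.5,       # 4-bit quantisation
}

for precision, nbytes in BYTES_PER_PARAM.items():
    gb = PARAMS * nbytes / 1e9  # gigabytes of weights alone
    fits = "fits" if gb < 32 else "exceeds"
    print(f"{precision:>9}: ~{gb:.0f} GB of weights ({fits} a 32 GB machine)")
```

The arithmetic shows why only the lower-precision variants are practical on the 32 GB consumer hardware the article mentions: full-precision weights alone need roughly 64 GB, whilst a 4-bit build drops to around 16 GB.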

Why did Singapore switch from Meta's Llama to Alibaba's Qwen?

The switch prioritised regional linguistic capabilities over global reach. Qwen3-32B's extensive multilingual training and Alibaba's specific enhancements for Southeast Asian languages offered superior performance for AISG's regional focus compared to Meta's more Western-centric approach.

How does Qwen-Sea-Lion-v4 compare to other regional AI models?

It currently ranks first among open-source models under 200 billion parameters on the SEA-HELM benchmark, which specifically evaluates Southeast Asian language proficiency. This performance validates its regional optimisation approach over larger, less targeted alternatives.

What are the hardware requirements for running the model?

Lower-precision versions can run on consumer hardware with 32GB of RAM, making it accessible to smaller developers. Higher-precision versions require more substantial computational resources but offer enhanced performance for production deployments.

Is the model available for commercial use?

Yes, Qwen-Sea-Lion-v4 is released as an open model, available for free download and commercial use through the AI Singapore website and Hugging Face hub, enabling widespread adoption across the regional business ecosystem.

What languages does the model support?

Built on a foundation of 119 languages, the model includes enhanced support for key Southeast Asian languages including Indonesian, Malay, Thai, Vietnamese, and Filipino, with additional training on regional dialects and colloquial expressions.

The AIinASIA View: Singapore's pivot to Alibaba's Qwen architecture represents shrewd strategic thinking rather than mere technological opportunism. By prioritising regional linguistic authenticity over Western AI hegemony, AISG is positioning Southeast Asia as a player rather than a consumer in global AI development. This collaboration could catalyse similar initiatives across ASEAN, creating a network effect that challenges the dominance of US and European AI models in Asian markets. We expect other nations to follow Singapore's lead, potentially reshaping the global AI landscape towards more regionally-optimised solutions.

The success of Qwen-Sea-Lion-v4 demonstrates that targeted, collaborative AI development can produce superior results for specific markets compared to one-size-fits-all global models. As Singapore continues its ambitious AI investments, this partnership with Alibaba Cloud suggests a mature understanding of how to leverage international expertise whilst maintaining regional relevance.

What impact do you think this shift will have on Southeast Asia's AI ecosystem, and should other ASEAN countries follow Singapore's collaborative approach? Drop your take in the comments below.



This is a developing story

We're tracking this across Asia-Pacific and may update with new developments, follow-ups and regional context.


This article is part of the This Week in Asian AI learning path.


Latest Comments (5)

Le Hoang (@lehoang) • 25 December 2025

hey, this is really interesting! the 100 billion extra southeast asian language tokens alibaba added for qwen-sea-lion-v4, how does that actually get integrated? like, is it fine-tuning, or more of a pre-training step on top of the 36 trillion? trying to understand the technical side.

Natalie Okafor (@natalieok) • 20 December 2025

The shift to Alibaba Cloud's Qwen architecture with its 119-language training and added SEA language tokens is interesting. For us in healthcare AI, multilingual capabilities are critical, especially when considering patient safety across diverse populations. It makes me wonder about the full implications for clinical applications and data privacy with a non-Meta foundation.

Tony Leung (@tonyleung) • 17 December 2025

Moving to Alibaba makes sense for regional language capability, but I'd be looking closely at the data governance. Especially in a place like Singapore where regulatory compliance is so critical, the shift from a US-based Meta to a Chinese cloud provider introduces a whole new layer of scrutiny for data residency and access, which could create complexities down the line.

Lakshmi Reddy (@lakshmi.r) • 12 December 2025

It's interesting how they boosted Qwen's training with 100 billion Southeast Asian language tokens. I wonder if they'll publicly share the specifics of that dataset, it could be really valuable for other regional language models, especially for Indic languages.

James Clarke (@jamesclarke) • 11 December 2025

This is a solid move for Singapore, tapping into that Alibaba focus on SEA languages. Makes you wonder if a similar regional specialisation could boost UK models, maybe with a nod to our diverse Northern dialects!
