
    Powering the Future: How AI and LLMs in Asia are Revolutionising Data Centre Efficiency

    AI and LLM growth in Asia impact data centre energy consumption.

Anonymous · 12 July 2024 · 3 min read

    AI and Large Language Models (LLMs) are driving increased energy consumption in data centres, with estimates suggesting a potential doubling of global electricity consumption between 2022 and 2026. Singapore's National Multimodal LLM Programme (NMLP) aims to develop a base model accounting for Southeast Asia's multilingual environment, potentially increasing data centre energy consumption in the region. Advanced cooling systems like Huawei's FusionCol8000-C and power supply solutions like FusionPower6000 3.0 are helping to improve data centre energy efficiency and sustainability.

    The Rise of AI and Large Language Models in Asia

Artificial Intelligence (AI) and Large Language Models (LLMs) have led to a surge in energy consumption in the computing world. Since the introduction of ChatGPT in November 2022, several countries have announced plans to develop their own LLMs to support applications across various industries. One such initiative is Singapore's National Multimodal LLM Programme (NMLP), launched by the Infocomm Media Development Authority (IMDA) together with local research institutions. The NMLP aims to create a base model that accounts for Southeast Asia's multilingual environment, supporting national-level strategies in AI research and development.

    The Energy Consumption Challenge

The growing adoption of generative AI and LLMs will lead to increased energy consumption in data centres in the coming years. Data centres are significant energy users because of the large amounts of power needed to run and cool servers. The International Energy Agency (IEA) estimates that data centres and data transmission networks each account for roughly 1 to 1.5 per cent of global electricity use. The IEA also projects that combined electricity consumption from data centres, cryptocurrencies, and AI could double between 2022 and 2026.
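As a back-of-the-envelope illustration (the figures below are hypothetical, not IEA data), a doubling over the four years from 2022 to 2026 implies a steep compound annual growth rate:

```python
def implied_annual_growth(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start/end consumption pair."""
    return (end / start) ** (1 / years) - 1

# A doubling of consumption over four years (2022 -> 2026)
cagr = implied_annual_growth(1.0, 2.0, 4)
print(f"{cagr:.1%}")  # roughly 18.9% per year
```

In other words, meeting the IEA's doubling scenario requires demand to grow by nearly a fifth every year, which underscores why efficiency gains alone may struggle to keep pace.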

    Power Usage Effectiveness (PUE)

Power Usage Effectiveness (PUE) is a measure of data centre energy efficiency first introduced in 2007. It is defined as the ratio of total facility energy to the energy delivered to IT equipment, so an ideal PUE is 1.0. It has become the most widely used metric for governments and organisations seeking to monitor energy use trends and maximise operational efficiency. The U.S. Environmental Protection Agency provides comprehensive resources on data centre energy efficiency metrics, including PUE.
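The metric itself is a simple ratio. A minimal sketch (the facility figures are hypothetical):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.

    A PUE of 1.0 means every kilowatt-hour goes to IT equipment;
    real facilities sit above 1.0 because of cooling and other overheads.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1.5 GWh total draw, 1.0 GWh consumed by IT equipment
print(round(pue(1_500_000, 1_000_000), 2))  # 1.5
```

The closer a facility's PUE is to 1.0, the less energy is being spent on overheads such as cooling and power conversion.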

    Optimised Cooling with Chilled Water

Data centre cooling systems play a crucial role in maximising operational efficiency and reducing energy consumption. One such solution is the in-room horizontal-airflow chilled-water cooling system for medium and large data centres, such as Huawei's FusionCol8000-C. This system supports higher water temperatures without the need for a raised floor, cutting the chilled-water system's overall energy consumption by more than 20 per cent.
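To see how a cooling saving of that size flows through to the headline PUE figure, here is an illustrative calculation (the load split between IT, cooling, and other overheads is assumed, not taken from Huawei's specifications):

```python
def pue_after_cooling_savings(it_kwh: float, cooling_kwh: float,
                              other_kwh: float, cooling_reduction: float):
    """Return (PUE before, PUE after) cutting cooling energy by a fraction."""
    before = (it_kwh + cooling_kwh + other_kwh) / it_kwh
    after = (it_kwh + cooling_kwh * (1 - cooling_reduction) + other_kwh) / it_kwh
    return before, after

# Hypothetical split: cooling draws 30% of the IT load, other overheads 10%
before, after = pue_after_cooling_savings(1000, 300, 100, 0.20)
print(round(before, 2), round(after, 2))  # 1.4 1.34
```

Even under these assumed proportions, a 20 per cent cooling saving translates into a meaningful drop in PUE, which compounds into large absolute savings at data-centre scale.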

    Uninterrupted Power with a Lower Footprint

Data centres must deliver exceptional performance with high levels of reliability and availability for their customers while remaining sustainable. Traditional power supply systems in large data centres often combine complex devices from different manufacturers, leading to intricate installation and elevated safety risks. Prefabricated solutions such as Huawei's FusionPower6000 3.0 provide uninterrupted power for data centres while minimising the footprint of power supply and distribution systems.

    Improving Sustainability for Data Centres

    As AI and LLM developments demand ever-greater computing power and increase energy consumption, governments worldwide are looking at improving data centre sustainability. Working with solution partners like Huawei, which has a track record of developing sustainable data centre solutions, can help achieve innovation goals while staying on track for climate targets. Initiatives like Singapore's partnership with Microsoft for AI growth underscore this commitment.

    Comment and Share

    What steps do you think governments and organisations should take to ensure the sustainable growth of AI and LLMs in Asia? Share your thoughts below and don't forget to Subscribe to our newsletter for updates on AI and AGI developments.

