
AI in ASIA
Google Project Suncatcher

Google's Moonshot?

Google's "Project Suncatcher" is a real moonshot, aiming to put powerful AI computing capability into orbit. Imagine an entire constellation of solar-powered satellites, all kitted out with Google's special AI chips and linked together by super-fast laser beams. The idea is to create an AI data centre in space.

Anonymous · 6 min read

AI Snapshot

The TL;DR: what matters, fast.

Project Suncatcher proposes building AI compute infrastructure in space using satellite constellations equipped with Google TPUs.

Space-based solar panels are more efficient and can generate continuous power, potentially making space ideal for scaling AI compute.

The project addresses challenges such as high-bandwidth inter-satellite communication, orbital dynamics, and radiation effects on hardware.

Who should pay attention: AI researchers | Space engineers | Sustainability advocates

What changes next: Further research and debate on the feasibility of space-based AI infrastructure is expected.

Why on Earth (or off it!) Would We Do This?

AI is a game-changer, isn't it? It's pushing the boundaries of scientific discovery and helping us tackle some massive global challenges. The big question is, where can we really let AI reach its full potential?

Here's the clever bit: the Sun is our solar system's ultimate power source, churning out far more energy than humanity uses. And a solar panel up in space, in the right orbit, can produce up to eight times more energy per year than the same panel on Earth.

Plus, it can generate power almost continuously, meaning you don't need huge, heavy batteries. This hints at a compelling idea: perhaps space is actually the best place to scale up AI compute.
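To see where a figure like "up to eight times" can come from, here's a back-of-envelope sketch. The capacity factors and irradiance gain below are illustrative assumptions for the sake of the arithmetic, not numbers from Google's paper:

```python
# Back-of-envelope comparison of annual energy yield for a 1 kW solar panel
# on the ground versus in a dawn-dusk sun-synchronous orbit.
# All three constants below are illustrative assumptions.

GROUND_CAPACITY_FACTOR = 0.15   # typical terrestrial solar (night, clouds, angle)
ORBIT_CAPACITY_FACTOR = 0.99    # near-continuous sunlight in a dawn-dusk SSO
ORBIT_IRRADIANCE_GAIN = 1.3     # no atmosphere: ~1361 W/m^2 vs ~1000 W/m^2 peak

HOURS_PER_YEAR = 8766

def annual_yield_kwh(rated_kw, capacity_factor, irradiance_gain=1.0):
    """Annual energy produced by a panel of the given rated power."""
    return rated_kw * irradiance_gain * capacity_factor * HOURS_PER_YEAR

ground = annual_yield_kwh(1.0, GROUND_CAPACITY_FACTOR)
orbit = annual_yield_kwh(1.0, ORBIT_CAPACITY_FACTOR, ORBIT_IRRADIANCE_GAIN)
print(f"ground: {ground:.0f} kWh/yr, orbit: {orbit:.0f} kWh/yr, "
      f"ratio: {orbit / ground:.1f}x")
```

With these assumed inputs, the orbital panel comes out roughly eight to nine times more productive; the continuous sunlight, not the panel itself, does most of the work.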

Project Suncatcher takes this idea and runs with it. It envisions these compact fleets of satellites, each carrying those Google TPUs and communicating via "free-space optical links", which are essentially laser connections. This isn't just about massive scale; it's also about minimising the impact on our terrestrial resources. It's a genuinely fresh approach.

Google has shared its early research in a preprint paper, "Towards a future space-based, highly scalable AI infrastructure system design."

This paper dives into how they're tackling the big challenges, like incredibly high-bandwidth communication between satellites, understanding orbital dynamics, and how radiation affects computers in space. By designing things in a modular way, with smaller, interconnected satellites, they're laying the groundwork for an AI infrastructure that could grow enormously in space.

This isn't Google's first rodeo with moonshots, either. They've got a history of taking on huge, seemingly impossible scientific and engineering problems. Think about how they started building a large-scale quantum computer a decade ago when many thought it was a pipe dream, or how they envisioned autonomous vehicles over 15 years ago, which eventually led to Waymo, now serving millions of passenger trips globally. Project Suncatcher fits right into this tradition of ambitious thinking.

Getting Down to Brass Tacks: The System and Its Hurdles

So, what does this system look like, and what are the main technical challenges they're facing?

The plan is to have a constellation of networked satellites, probably in a special "dawn-dusk sun-synchronous low Earth orbit". This particular orbit means they'd be exposed to sunlight almost constantly, which is brilliant for continuous power generation and cuts down on the need for heavy batteries.

But to make this work, there are a few significant hurdles to clear:

1. Super-Fast Links Between Satellites

Imagine a huge data centre on Earth: the connections between all its computers are incredibly fast, with very low lag. AI workloads need this kind of performance, with tasks spread across many accelerators. For space-based AI to compete, the links between satellites need to handle tens of terabits per second. Google's analysis suggests this is actually possible using advanced optical transceivers and something called spatial multiplexing.

However, getting this kind of bandwidth means the power levels received need to be thousands of times higher than what's typical for long-range communication. Since signal strength drops off rapidly with distance, the solution is for the satellites to fly very, very close together – think kilometres or even less. This "closes the link budget", essentially making sure enough signal gets through. The team's already shown this works with a bench-scale demonstrator, achieving a whopping 800 Gbps each way (1.6 Tbps total!) with just one pair of transceivers. Pretty impressive, right?
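To get a feel for why proximity matters so much, here's a rough inverse-square sketch of an optical link budget. The aperture size, wavelength and transmit power are illustrative assumptions, not figures from the paper:

```python
import math

# Received optical power falls off with the square of the distance between
# transmitter and receiver, so shrinking the inter-satellite distance raises
# the received power dramatically. All parameters here are illustrative
# assumptions, not figures from Google's paper.

WAVELENGTH_M = 1.55e-6      # common telecom laser wavelength
APERTURE_M = 0.05           # 5 cm transmit/receive aperture diameter
TX_POWER_W = 1.0

def received_power_w(distance_m):
    """Approximate received power for a diffraction-limited optical link."""
    # Beam divergence of a diffraction-limited aperture, and the resulting
    # spot size at the receiver.
    divergence_rad = 1.22 * WAVELENGTH_M / APERTURE_M
    spot_radius_m = divergence_rad * distance_m
    rx_area = math.pi * (APERTURE_M / 2) ** 2
    spot_area = math.pi * spot_radius_m ** 2
    return TX_POWER_W * min(1.0, rx_area / spot_area)

close = received_power_w(1_000)       # satellites ~1 km apart
far = received_power_w(1_000_000)     # a more typical 1000 km crosslink
print(f"1 km: {close:.3e} W, 1000 km: {far:.3e} W, gain: {close / far:.0f}x")
```

Flying at 1 km instead of 1000 km boosts the received power by a factor of a million in this toy model, which is the intuition behind packing the satellites so tightly.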

2. Controlling Huge, Tightly Packed Satellite Formations

Those super-fast inter-satellite links mean these satellites have to fly much closer together than any current system. To figure out how to manage this, they've developed numerical and analytical physics models. They started with well-known orbital equations and then refined them with more advanced models to account for all sorts of gravitational quirks and even atmospheric drag at their planned altitude.

Their models show trajectories for an illustrative 81-satellite constellation. The satellites would be just hundreds of metres apart. This close proximity means they'd only need relatively small "station-keeping" manoeuvres to keep the constellation stable in its sun-synchronous orbit. It's a delicate dance!
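The textbook starting point for this kind of relative-motion analysis is the Clohessy-Wiltshire (Hill) equations, which Google's team then refined with higher-order gravity terms and drag. A minimal numerical sketch, with illustrative orbit parameters, shows why initial conditions matter: pick them right and a neighbouring satellite stays on a bounded relative orbit, pick them wrong and it drifts away:

```python
import math

# Linearised relative motion of one satellite about a reference satellite,
# using the Clohessy-Wiltshire (Hill) equations. This is a textbook model
# only; Google's paper refines it with higher-order gravity terms and drag.
# x: radial offset (m), y: along-track offset (m), N: mean motion (rad/s).

MU = 3.986004418e14           # Earth's gravitational parameter, m^3/s^2
ALT_M = 650e3                 # illustrative low-Earth-orbit altitude
R_M = 6371e3 + ALT_M
N = math.sqrt(MU / R_M**3)    # mean motion of the reference orbit

def simulate(x0, y0, vx0, vy0, dt=1.0):
    """Integrate the CW equations for one orbit; return max |y| excursion."""
    steps = int(2 * math.pi / N / dt)
    x, y, vx, vy = x0, y0, vx0, vy0
    max_y = abs(y)
    for _ in range(steps):
        ax = 3 * N**2 * x + 2 * N * vy   # radial CW acceleration
        ay = -2 * N * vx                  # along-track CW acceleration
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        max_y = max(max_y, abs(y))
    return max_y

# Start 200 m radially offset. With vy0 = -2*N*x0 the secular along-track
# drift cancels and the neighbour stays on a bounded relative orbit.
bounded = simulate(200.0, 0.0, 0.0, -2 * N * 200.0)
drifting = simulate(200.0, 0.0, 0.0, 0.0)
print(f"bounded: {bounded:.0f} m, drifting: {drifting:.0f} m")
```

In the well-chosen case the neighbour oscillates within a few hundred metres; in the naive case it drifts kilometres away within a single orbit, which is why station-keeping and careful constellation design are essential.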

3. Making TPUs Tough Enough for Space

For these AI chips to actually work in space, they need to survive the harsh environment of low-Earth orbit, especially the radiation. Google put its Trillium TPUs (the v6e Cloud TPUs) through some pretty rigorous testing, blasting them with a 67 MeV proton beam to check for "total ionizing dose" (TID) and "single event effects" (SEEs).

The results were really positive. While the memory (HBM subsystems) was the most sensitive part, it only started showing issues after a cumulative dose of 2 krad(Si). That's nearly three times the expected radiation dose over a five-year mission! Crucially, there were no "hard failures" (meaning permanent damage) from TID up to the maximum tested dose of 15 krad(Si). This suggests that Trillium TPUs are surprisingly robust for space applications.
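Those figures imply comfortable margins. A quick sanity check, treating the five-year mission dose as an estimate backed out of the article's "nearly three times" claim rather than a measured value:

```python
# Margin check based on the radiation figures quoted above. The five-year
# mission dose is an estimate derived from "nearly three times", not a
# number stated directly in the source.

HBM_FIRST_ANOMALY_KRAD = 2.0      # HBM showed first issues at 2 krad(Si)
MAX_TESTED_KRAD = 15.0            # no hard failures up to 15 krad(Si)
FIVE_YEAR_MISSION_KRAD = 0.75     # ~2 krad divided by ~2.7

print(f"HBM margin over mission dose: "
      f"{HBM_FIRST_ANOMALY_KRAD / FIVE_YEAR_MISSION_KRAD:.1f}x")
print(f"No-hard-failure margin: "
      f"{MAX_TESTED_KRAD / FIVE_YEAR_MISSION_KRAD:.1f}x")
```

Even the most radiation-sensitive component would, on these numbers, tolerate well over twice the dose of a five-year mission before showing anomalies.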

4. The Nitty-Gritty of Money: Economic Feasibility and Launch Costs

Historically, launching things into space has been incredibly expensive, which has been a major roadblock for large-scale space systems. However, Google has looked at historical and projected launch costs and reckons that if the current rate of improvement continues, prices could drop below $200/kg by the mid-2030s.

If that happens, the cost of getting a space-based data centre up and running, and then operating it, could become comparable to the energy costs of an equivalent terrestrial data centre on a per-kilowatt-year basis.
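Here's a heavily simplified sketch of that comparison. Apart from the $200/kg projection, every number below (mass per kilowatt, lifetime, electricity price, PUE) is an assumption purely for illustration:

```python
# Rough per-kilowatt-year comparison of launch cost versus terrestrial
# data-centre energy cost. Only the $200/kg launch-price projection comes
# from the article; everything else is an illustrative assumption.

LAUNCH_COST_PER_KG = 200.0        # projected mid-2030s price, per the article
SATELLITE_KG_PER_KW = 10.0        # assumed satellite mass per kW of capacity
MISSION_YEARS = 5.0               # assumed operating lifetime

GROUND_PRICE_PER_KWH = 0.08       # assumed industrial electricity price, $/kWh
HOURS_PER_YEAR = 8766
PUE = 1.2                         # assumed power usage effectiveness

launch_per_kw_year = LAUNCH_COST_PER_KG * SATELLITE_KG_PER_KW / MISSION_YEARS
ground_per_kw_year = GROUND_PRICE_PER_KWH * HOURS_PER_YEAR * PUE

print(f"launch: ${launch_per_kw_year:.0f} per kW-year")
print(f"energy: ${ground_per_kw_year:.0f} per kW-year")
```

Under these assumed inputs the amortised launch cost lands in the same order of magnitude as the terrestrial energy bill, which is the shape of the argument the paper makes; the real answer hinges on mass per kilowatt and how far launch prices actually fall.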

What's Next for Project Suncatcher?

This initial analysis is very encouraging, suggesting that the core idea of space-based AI computing isn't blocked by fundamental physics or insurmountable financial hurdles. However, there are still some big engineering challenges ahead. Think about things like keeping these systems cool in space (thermal management), getting all that AI data back down to Earth quickly (high-bandwidth ground communications), and making sure the entire system is super reliable once it's in orbit.

To start tackling these challenges, the next big step is a "learning mission" in partnership with Planet, an Earth imaging company. They're planning to launch two prototype satellites by early 2027. This experiment will be crucial for seeing how their models and TPU hardware perform in real space conditions and for validating their optical inter-satellite links for distributed AI tasks. This isn't the first time Google has explored AI's potential in unusual places; earlier this year, Google's AI landed on a tiny Aussie island.

Looking further ahead, for constellations that might eventually generate gigawatts of power, we might see even more radical satellite designs. This could involve new computing architectures specifically designed for space, combined with mechanical designs where power collection, computing, and thermal management are all tightly integrated. This kind of advanced integration is also being explored in other areas, such as Elon Musk's big bet on data centres in orbit.

Just like how modern smartphones pushed the boundaries of system-on-chip technology, the sheer scale and integration needed for space-based AI will undoubtedly advance what's possible. It's truly exciting stuff!

