
    Google's Moonshot?

Google's fascinating "Project Suncatcher" is a real moonshot, aiming to put powerful AI computing capabilities into orbit. Imagine an entire constellation of solar-powered satellites, all kitted out with Google's special AI chips, and linked together using super-fast laser beams. The idea is to create an AI data centre in space.

    Anonymous
6 min read · 9 November 2025
    Google Project Suncatcher

    AI Snapshot

    The TL;DR: what matters, fast.

    Project Suncatcher proposes building AI compute infrastructure in space using satellite constellations equipped with Google TPUs.

    Space-based solar panels are more efficient and can generate continuous power, potentially making space ideal for scaling AI compute.

    The project addresses challenges such as high-bandwidth inter-satellite communication, orbital dynamics, and radiation effects on hardware.

    Who should pay attention: AI researchers | Space engineers | Sustainability advocates

    What changes next: Further research and debate on the feasibility of space-based AI infrastructure is expected.

    Why on Earth (or off it!) Would We Do This?

    AI is a game-changer, isn't it? It's pushing the boundaries of scientific discovery and helping us tackle some massive global challenges. The big question is, where can we really let AI reach its full potential?

    Here's the clever bit: the Sun is our solar system's ultimate power source. It churns out an incredible amount of energy, far more than humanity uses. Now, a solar panel up in space, in the right orbit, can be up to eight times more efficient than one on Earth.

    Plus, it can generate power almost continuously, meaning you don't need huge, heavy batteries. This hints at a compelling idea: perhaps space is actually the best place to scale up AI compute.
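A rough yield comparison shows why near-continuous sunlight matters so much. This is a sketch under stated assumptions: the ~20% capacity factor for ground solar and ~99% illumination in a dawn-dusk sun-synchronous orbit are illustrative values, not figures from the paper. Duty cycle alone accounts for roughly a 5x gain; the rest of the quoted "up to eight times" comes from factors like avoiding atmospheric losses.

```python
# Illustrative annual energy yield per kilowatt of panel capacity.
# Capacity factors here are assumptions for this sketch, not figures
# from Google's paper.

HOURS_PER_YEAR = 8766  # average, accounting for leap years

def annual_energy_kwh(panel_kw, capacity_factor):
    """Energy delivered in one year by panel_kw of capacity
    running at the given average capacity factor."""
    return panel_kw * capacity_factor * HOURS_PER_YEAR

terrestrial = annual_energy_kwh(1.0, 0.20)  # typical ground solar
orbital = annual_energy_kwh(1.0, 0.99)      # near-continuous sunlight

print(f"ground: {terrestrial:.0f} kWh/yr, orbit: {orbital:.0f} kWh/yr, "
      f"ratio: {orbital / terrestrial:.1f}x")
```

The ratio from duty cycle alone works out to roughly 5x under these assumptions, which is why the orbit choice (more on that below) is such a big part of the design.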

    Project Suncatcher takes this idea and runs with it. It envisions these compact fleets of satellites, each carrying those Google TPUs and communicating via "free-space optical links", which are essentially laser connections. This isn't just about massive scale; it's also about minimising the impact on our terrestrial resources. It's a genuinely fresh approach.

Google's shared their early research in a preprint paper, "Towards a future space-based, highly scalable AI infrastructure system design."

    This paper dives into how they're tackling the big challenges, like incredibly high-bandwidth communication between satellites, understanding orbital dynamics, and how radiation affects computers in space. By designing things in a modular way, with smaller, interconnected satellites, they're laying the groundwork for an AI infrastructure that could grow enormously in space.

    This isn't Google's first rodeo with moonshots, either. They've got a history of taking on huge, seemingly impossible scientific and engineering problems. Think about how they started building a large-scale quantum computer a decade ago when many thought it was a pipe dream, or how they envisioned autonomous vehicles over 15 years ago, which eventually led to Waymo, now serving millions of passenger trips globally. Project Suncatcher fits right into this tradition of ambitious thinking.

    Getting Down to Brass Tacks: The System and Its Hurdles

    So, what does this system look like, and what are the main technical challenges they're facing?

    The plan is to have a constellation of networked satellites, probably in a special "dawn-dusk sun-synchronous low Earth orbit". This particular orbit means they'd be exposed to sunlight almost constantly, which is brilliant for continuous power generation and cuts down on the need for heavy batteries.

    But to make this work, there are a few significant hurdles to clear:


    1. Achieving Data Centre-Scale Links Between Satellites

    Imagine a huge data centre on Earth; the connections between all its computers are incredibly fast and have very low lag. AI workloads need this kind of performance, with tasks spread across many accelerators. For space-based AI to compete, the links between satellites need to handle tens of terabits per second. Google's analysis suggests this is actually possible using advanced optical transceivers and something called spatial multiplexing.

    However, getting this kind of bandwidth means the power levels received need to be thousands of times higher than what's typical for long-range communication. Since signal strength drops off rapidly with distance, the solution is for the satellites to fly very, very close together – think kilometres or even less. This "closes the link budget", essentially making sure enough signal gets through. The team's already shown this works with a bench-scale demonstrator, achieving a whopping 800 Gbps each way (1.6 Tbps total!) with just one pair of transceivers. Pretty impressive, right?
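The inverse-square fall-off is the whole story here, and it's easy to see in numbers. The sketch below is a deliberately simplified geometric link budget: the beam divergence, aperture size, and transmit power are illustrative assumptions, not values from Google's demonstrator.

```python
import math

# Minimal free-space optical link-budget sketch (far-field geometric
# loss only, ignoring pointing error and optical losses). All numbers
# are illustrative assumptions, not figures from the paper.

def received_power_w(tx_power_w, divergence_rad, rx_aperture_m, distance_m):
    """Power captured by the receive aperture, assuming the beam
    spreads to a uniform spot of radius divergence * distance."""
    spot_radius = divergence_rad * distance_m
    if spot_radius <= rx_aperture_m / 2:
        return tx_power_w  # whole beam lands on the aperture
    capture = (rx_aperture_m / 2) ** 2 / spot_radius ** 2
    return tx_power_w * capture

# Received power falls off as 1/d^2, so shrinking the separation from
# 100 km to 1 km raises received power by a factor of 10,000.
p_far = received_power_w(1.0, 1e-4, 0.1, 100_000)
p_near = received_power_w(1.0, 1e-4, 0.1, 1_000)
print(f"gain from flying closer: {p_near / p_far:,.0f}x")
```

That factor-of-thousands gain from proximity is exactly the "closing the link budget" trick the article describes: fly close enough and the receiver simply catches far more of the beam.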

    2. Controlling Huge, Tightly Packed Satellite Formations

    Those super-fast inter-satellite links mean these satellites have to fly much closer together than any current system. To figure out how to manage this, they've developed numerical and analytical physics models. They started with well-known orbital equations and then refined them with more advanced models to account for all sorts of gravitational quirks and even atmospheric drag at their planned altitude.

    Their models show trajectories for an illustrative 81-satellite constellation. The satellites would be just hundreds of metres apart. This close proximity means they'd only need relatively small "station-keeping" manoeuvres to keep the constellation stable in its sun-synchronous orbit. It's a delicate dance!
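For a sense of the timescales involved in that dance, the well-known two-body formulas give a quick back-of-the-envelope answer. The 650 km altitude below is an assumed illustrative value for a low-Earth sun-synchronous orbit, not a figure from the paper.

```python
import math

# Back-of-the-envelope orbital period for a low-Earth sun-synchronous
# constellation. Altitude is an illustrative assumption.

MU_EARTH = 3.986004418e14  # m^3/s^2, Earth's gravitational parameter
R_EARTH = 6_371_000.0      # m, mean Earth radius

def orbital_period_s(altitude_m):
    """Two-body circular orbital period T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH)

T = orbital_period_s(650_000)
print(f"orbital period: {T / 60:.1f} min")
```

So satellites just hundreds of metres apart would be lapping the Earth together roughly every hour and a half, with only small differential perturbations (gravity harmonics, drag) to correct for — hence the modest station-keeping budget.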

    3. Making TPUs Tough Enough for Space

    For these AI chips to actually work in space, they need to survive the harsh environment of low-Earth orbit, especially the radiation. Google put their Trillium TPUs (which are their v6e Cloud TPUs) through some pretty rigorous testing, blasting them with a 67MeV proton beam to check for "total ionizing dose" (TID) and "single event effects" (SEEs).

    The results were really positive. While the memory (HBM subsystems) was the most sensitive part, it only started showing issues after a cumulative dose of 2 krad(Si). That's nearly three times the expected radiation dose over a five-year mission! Crucially, there were no "hard failures" (meaning permanent damage) from TID up to the maximum tested dose of 15 krad(Si). This suggests that Trillium TPUs are surprisingly robust for space applications.
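The margin arithmetic implied by those two figures is worth spelling out. The 2 krad(Si) onset and the "nearly three times" margin come from the article; the five-year mission dose below is simply derived from them.

```python
# Arithmetic behind the radiation margin quoted above. Onset dose and
# margin factor are from the article; the mission dose is derived.

ONSET_KRAD = 2.0        # cumulative dose where HBM issues first appeared
MARGIN = 3.0            # onset is ~3x the expected five-year mission dose
HARD_FAILURE_TESTED_KRAD = 15.0  # no permanent damage up to this dose

mission_dose_krad = ONSET_KRAD / MARGIN
print(f"implied five-year shielded dose: ~{mission_dose_krad:.2f} krad(Si)")
```

In other words, the expected five-year dose works out to well under 1 krad(Si), leaving a comfortable buffer before even soft anomalies appear, and a much larger one before any permanent damage was observed.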

    4. The Nitty-Gritty of Money: Economic Feasibility and Launch Costs

    Historically, launching things into space has been incredibly expensive, which has been a major roadblock for large-scale space systems. However, Google's looked at historical and projected launch costs and reckons that if the current rate of improvement continues, prices could drop to less than $200/kg by the mid-2030s.

If that happens, the cost of getting a space-based data centre up and running, and then operating it, could actually become comparable to the energy costs of an equivalent data centre on Earth, when you look at it on a per-kilowatt-year basis.
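To make that comparison concrete, here's a toy per-kilowatt-year calculation. Only the $200/kg launch price comes from the article; the satellite specific mass, lifetime, and electricity price are illustrative assumptions chosen purely to show the shape of the comparison.

```python
# Toy per-kilowatt-year cost comparison. Only the launch price is from
# the article; everything else is an illustrative assumption.

LAUNCH_COST_PER_KG = 200.0   # projected mid-2030s price, from the article
KG_PER_KW = 10.0             # assumed satellite specific mass
LIFETIME_YEARS = 5.0         # assumed operational lifetime

launch_cost_per_kw_year = LAUNCH_COST_PER_KG * KG_PER_KW / LIFETIME_YEARS

# Terrestrial comparison: electricity at an assumed $0.05/kWh,
# running continuously for a year.
HOURS_PER_YEAR = 8766
ground_energy_cost_per_kw_year = 0.05 * HOURS_PER_YEAR

print(f"launch, amortised: ${launch_cost_per_kw_year:.0f}/kW-yr")
print(f"ground energy:     ${ground_energy_cost_per_kw_year:.0f}/kW-yr")
```

Under these made-up but plausible inputs the two numbers land in the same ballpark, which is the crux of the article's economic argument: cheap enough launch turns the orbital capital cost into something comparable to a terrestrial energy bill.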

    What's Next for Project Suncatcher?

    This initial analysis is very encouraging, suggesting that the core idea of space-based AI computing isn't blocked by fundamental physics or insurmountable financial hurdles. However, there are still some big engineering challenges ahead. Think about things like keeping these systems cool in space (thermal management), getting all that AI data back down to Earth quickly (high-bandwidth ground communications), and making sure the entire system is super reliable once it's in orbit.

    To start tackling these challenges, the next big step is a "learning mission" in partnership with Planet, an Earth imaging company. They're planning to launch two prototype satellites by early 2027. This experiment will be crucial for seeing how their models and TPU hardware perform in real space conditions and for validating their optical inter-satellite links for distributed AI tasks. This isn't the first time Google has explored AI's potential in unusual places; earlier this year, Google's AI landed on a tiny Aussie island.

    Looking further ahead, for constellations that might eventually generate gigawatts of power, we might see even more radical satellite designs. This could involve new computing architectures specifically designed for space, combined with mechanical designs where power collection, computing, and thermal management are all tightly integrated. This kind of advanced integration is also being explored in other areas, such as Elon Musk's big bet on data centres in orbit.

    Just like how modern smartphones pushed the boundaries of system-on-chip technology, the sheer scale and integration needed for space-based AI will undoubtedly advance what's possible. It's truly exciting stuff!



    Latest Comments (4)

Pallavi Srinivas (@pallavi_s_ai) · 3 December 2025

    This "Project Suncatcher" sounds absolutely bonkers, in a brilliant way! But my biggest query is about the data transmission latency. Even with laser links, will the speed be truly negligible for real time AI processing on Earth, considering the sheer astronomical distances involved?

Luis Torres (@luis_t_ph) · 26 November 2025

    Wow! This is next level, Google! "Project Suncatcher" sounds wild. For us in the Philippines, with our archipelagic geography and frequent natural disasters, this could really revolutionise remote sensing and disaster response. Imagine AI analysing weather patterns and damage assessments in real-time from space, even for the most isolated islands. The processing power in orbit is a game-changer for underserved areas.

Gaurav Bhatia (@gaurav_b) · 17 November 2025

    A space AI data centre, wow! Project Suncatcher sounds like something straight out of a Sci Fi flick. My only quibble is the "solar powered" bit. Given how finicky satellites can be, relying solely on solar for such powerful AI computing seems a bit ambitious, no? Still, kudos to Google for thinking so big!

Wendy Sim (@wendysim_sg) · 10 November 2025

    Wah, an AI data centre in space? Sounds a bit like something out of a sci-fi flick. While it's cool tech, I wonder about the environmental impact of launching all those satellites. Plus, what if we run into issues with space junk or orbital congestion? Lots of brilliant engineering, but also lots of potential headaches to consider.
