
The ultimate solution to the "energy bottleneck"? Musk, Bezos, and Google are all eyeing "space data centers"

A paper from Alphabet points out that the sun's output power reaches 3.86 × 10^26 watts, over 100 trillion times humanity's total electricity generation. At some point in the future, the best way to power artificial intelligence may well be to tap this enormous energy source more directly. Although sending data centers into orbit faces severe challenges in cost, heat dissipation, and reliability, the move may ultimately become a necessary option as demand for AI computing power grows exponentially.
AI's endless appetite for energy is drawing ever closer to the supply ceiling of Earth's resources. In response, tech giants are turning their attention to a field that seems to belong to science fiction but could become the ultimate solution: building data centers in space.
On November 4, Google announced "Project Suncatcher," under which it plans to launch two prototype satellites carrying its in-house TPU artificial-intelligence chips in early 2027, developing the hardware in partnership with satellite company Planet Labs. Google CEO Sundar Pichai said preliminary research shows the chips can withstand the radiation environment of low Earth orbit.
Just a week before Google's announcement, Musk publicly declared that his company SpaceX "will do this," saying that a space-based data center could be built by scaling up the V3 version of its Starlink satellites.
Meanwhile, Amazon founder Jeff Bezos has predicted that gigawatt-scale data centers will appear in space within ten years. The startup Starcloud has already launched a test satellite carrying an NVIDIA GPU, and the field of contenders keeps growing.
The common goal of these industry leaders is to harness the sun's immense energy to power AI computing, thereby sidestepping the increasingly strained supplies of energy, land, and water on Earth.
Although sending data centers into orbit faces severe challenges in cost, heat dissipation, and reliability, as demand for AI computing power grows exponentially, the move may ultimately cease to be merely an option and become a necessity.
Why Space? The Answer is Energy
The seemingly science-fictional idea of sending data centers into space has one core driving force: energy.
With the surge in demand for AI model training and inference, the scale, power consumption, and cooling costs of ground data centers are expanding at an unprecedented rate, putting immense pressure on Earth's resources such as land, water, and electricity.
Space theoretically provides the ultimate solution. Google pointed out in its research paper:
The sun is by far the largest energy source in the solar system, with an output power of 3.86 × 10^26 watts, over 100 trillion times humanity's total electricity generation. A constellation of satellites equipped with solar panels could "harvest energy" in space under nearly constant sunlight.
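The "over 100 trillion times" figure is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes global electricity generation of roughly 30,000 TWh per year, a figure that is not given in the article:

```python
# Rough sanity check of the ratio quoted above. The ~30,000 TWh/year figure for
# global electricity generation is an assumption, not taken from the article.
SUN_OUTPUT_W = 3.86e26                    # solar output power, watts (from the paper)
GLOBAL_ELECTRICITY_TWH_PER_YEAR = 30_000  # assumed annual world electricity generation
HOURS_PER_YEAR = 8766

# Convert annual generation into an average power draw in watts.
global_electricity_w = GLOBAL_ELECTRICITY_TWH_PER_YEAR * 1e12 / HOURS_PER_YEAR

ratio = SUN_OUTPUT_W / global_electricity_w
print(f"Average human electricity demand: {global_electricity_w:.2e} W")
print(f"Sun's output / human electricity: {ratio:.2e}")  # ~1e14, i.e. over 100 trillion
```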
Google believes that in the long run, space-based data centers "may be the most scalable solution," while minimizing the impact on Earth's resources.
Philip Johnston, CEO of startup Starcloud, expressed a similar view, claiming:
Setting aside the environmental cost of the launch itself, a space-based data center would deliver roughly a tenfold saving in carbon emissions compared with running the same data center on Earth.
Tech Giants' Space Blueprint
Despite sharing a common goal, the companies are taking different paths, each building its space blueprint on its own strengths.
Google's blueprint is a solar-powered, interconnected satellite network forming an AI computing cluster in orbit. Google plans to validate the core technologies with prototype satellites by 2027, including having the constellation of satellites carrying its custom TPU chips exchange data over space-based laser links.

In contrast, Musk's plan builds on SpaceX's existing, massive infrastructure.
He argued that it would require only "simply scaling up" the upcoming Starlink V3 satellites, which carry high-speed laser links designed for gigabit-class internet service. Realizing the plan hinges on whether SpaceX's next-generation heavy-lift rocket, Starship, succeeds and drives down launch costs.
At the same time, startups represented by Starcloud are also actively entering the field.
The company recently launched a test satellite carrying an NVIDIA H100 GPU, aiming to provide roughly 100 times the computing power of any previous space-based system. Its ultimate goal is an orbital data center 2.5 miles across with 5 gigawatts of power.
Severe Challenges: Cost, Heat Dissipation, and Reliability
Despite the enticing prospects, deploying massive computing systems in space still requires overcoming a series of severe technical and economic obstacles.
First is the launch cost.
Google's analysis indicates that only if launch prices keep falling, dropping below $200 per kilogram by the mid-2030s, will the launch and operating costs of a space-based data center become roughly comparable to the energy costs of an equivalent data center on Earth.
Achieving this goal heavily relies on the success of companies like SpaceX in reusable launch technology.
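The structure of that cost comparison can be illustrated with a rough amortization model. The sketch below uses hypothetical parameters (satellite specific power, hardware lifetime, ground electricity price) that are not from Google's paper; it only shows the shape of the trade-off, not the paper's actual figures:

```python
# Illustrative cost-parity sketch. Every parameter below is an assumption made
# for this example; Google's paper performs its own, far more detailed analysis.
LAUNCH_PRICE_USD_PER_KG = 200          # the mid-2030s launch-price target cited above
SPECIFIC_POWER_W_PER_KG = 150          # assumed delivered power per kg of launched mass
LIFETIME_YEARS = 5                     # assumed on-orbit hardware lifetime
GROUND_ELECTRICITY_USD_PER_KWH = 0.08  # assumed industrial electricity price
HOURS_PER_YEAR = 8766

# Launch cost amortized per watt of space-based capacity, per year.
launch_cost_per_w = LAUNCH_PRICE_USD_PER_KG / SPECIFIC_POWER_W_PER_KG
space_cost_per_w_year = launch_cost_per_w / LIFETIME_YEARS

# Energy cost of running one watt continuously on the ground for a year.
ground_cost_per_w_year = GROUND_ELECTRICITY_USD_PER_KWH * HOURS_PER_YEAR / 1000

print(f"Space (launch only): ${space_cost_per_w_year:.2f} per W-year")
print(f"Ground (energy only): ${ground_cost_per_w_year:.2f} per W-year")
```

With these assumed numbers the two figures land within the same order of magnitude, which is the sense in which "roughly comparable" is meant; a real analysis would also have to account for hardware, operations, and ground-link costs.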
Second is thermal management, one of the biggest technical challenges for in-orbit computing.
Space is a vacuum: with no air to carry heat away by convection, waste heat can only be rejected by radiation, which makes cooling equipment extremely difficult. Google's paper mentions only briefly that cooling would rely on "thermal systems of heat pipes and radiators," without giving a detailed design, underscoring how hard the problem is.
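To get a feel for the scale of the problem, radiative cooling can be estimated with the Stefan-Boltzmann law. The sketch below uses illustrative values (1 MW of waste heat, a 300 K radiator, emissivity 0.9) that do not come from Google's paper, and it ignores absorbed sunlight and view-factor effects:

```python
# Rough radiator sizing via the Stefan-Boltzmann law: P = emissivity * sigma * A * T^4.
# All inputs are illustrative assumptions, not figures from Google's paper.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W / (m^2 * K^4)
EMISSIVITY = 0.9         # assumed radiator emissivity
RADIATOR_TEMP_K = 300.0  # assumed radiator surface temperature
WASTE_HEAT_W = 1e6       # assumed waste heat to reject: 1 MW of compute

# Net flux radiated per square metre of radiator (deep-space background ignored).
flux_per_m2 = EMISSIVITY * SIGMA * RADIATOR_TEMP_K**4
area_m2 = WASTE_HEAT_W / flux_per_m2

print(f"Radiated flux: {flux_per_m2:.0f} W/m^2")
print(f"Radiator area for 1 MW: {area_m2:.0f} m^2")  # roughly 2,400 m^2
```

Even this simplistic estimate implies radiators of a few thousand square metres per megawatt of compute, which illustrates why heat rejection is regarded as one of the hardest parts of the design.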
Finally, system reliability, high-bandwidth communication with the ground, and radiation protection must also be addressed.
Electronic devices are prone to errors in the space radiation environment. Although the initial test results for Google's TPUs are encouraging, ensuring that an entire data center operates stably in orbit over the long term will still require extensive technical validation and engineering innovation.
The ambition on display in space data centers is astonishing, blurring the boundary between science fiction and reality. But this is more than a technical showcase: as AI's capabilities and societal utility continue to grow, converting energy into computing power may become one of the core tasks of future society.
If AI's demand for computing power keeps growing exponentially at its current rate, Earth's finite resources will inevitably become the ultimate bottleneck on its development. At that point, looking to space and using the solar system's "main engine" to drive humanity's "thinking machines" may shift from a distant vision to a logical, and ultimately necessary, step.
As noted by Google Research, this may be the only path with sufficient scalability to meet the needs of future AI civilization.

