As of November 2025, there were more than 4,100 data centers operating in the United States, the most of any country in the world. These server farms are the backbone of the digital economy, powering everything from video streaming to AI model training. But they’re also becoming one of the planet’s biggest environmental liabilities, devouring electricity, draining freshwater reserves, and pushing local grids to their limits.
A startup in Redmond, Washington, believes the solution lies off the planet.
StarCloud’s vision is to build orbital data centers: GPU-powered satellites that run on solar energy and radiate heat into deep space instead of evaporating fresh water on Earth. Founded less than two years ago by Philip Johnston, the company aims to move computing beyond the constraints of land, power grids, and freshwater supplies.
Yesterday, StarCloud became the first company to launch an Nvidia H100 GPU into orbit, marking the birth of data centers in space.
A Data Center That Orbits Earth
StarCloud’s first satellite, StarCloud-1, weighs about 60 kilograms and is roughly the size of a small refrigerator. It launched aboard a SpaceX Bandwagon mission and has successfully separated from the rocket, entering a 325-kilometer orbit. The spacecraft, built on Astro Digital’s Corvus-Micro bus, will operate for around 11 months before deorbiting and burning up in the atmosphere.
The onboard H100 GPU provides 100 times more computing power than any previous space-based system. Johnston says the satellite will run demonstration workloads, including Google’s Gemini model, and will even attempt to fine-tune and train AI models in space, something no one has done before.
The purpose of the mission is to prove that data center-grade hardware can operate efficiently in orbit, surviving radiation exposure and dissipating heat in a vacuum.
From Idea to Orbit
StarCloud was founded just 21 months before the launch of its first satellite. The startup designs and assembles spacecraft components at its facility in Redmond, including aluminum frames, compute modules, and large radiators. Each system undergoes vibration and environmental tests before flight.
What typically takes aerospace startups four years from founding to deployment, StarCloud accomplished in under two. Johnston credits the team’s multidisciplinary expertise.
StarCloud’s founding team combines deep expertise across disciplines. CTO Ezra holds a PhD in engineering and previously designed deployable structures for NASA’s lunar pathfinder mission, while Addi, a veteran of Microsoft and SpaceX, leads compute integration and ensures reliability in radiation-heavy environments.
The team’s goal is to develop massive orbital data centers, eventually scaling up to 5-gigawatt clusters powered by solar arrays spanning several square kilometers. Their next satellite, StarCloud-2, scheduled for October 2026, will feature Nvidia’s upcoming Blackwell GPUs, higher compute density, and optical communication terminals for 24/7 connectivity.
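For a sense of scale, the array size implied by a 5-gigawatt cluster follows from simple arithmetic. The sketch below is a rough check, not a StarCloud figure; the cell efficiency and packing factor are illustrative assumptions.

```python
# Back-of-envelope estimate of the solar array area a 5 GW orbital cluster
# would need. The solar constant is a physical value; the 25% cell
# efficiency and 90% packing factor are illustrative assumptions,
# not StarCloud specifications.

SOLAR_CONSTANT_W_PER_M2 = 1361   # sunlight intensity above the atmosphere
CELL_EFFICIENCY = 0.25           # assumed photovoltaic conversion efficiency
PACKING_FACTOR = 0.90            # assumed fraction of array area covered by cells

TARGET_POWER_W = 5e9             # 5 gigawatts of delivered electrical power

usable_w_per_m2 = SOLAR_CONSTANT_W_PER_M2 * CELL_EFFICIENCY * PACKING_FACTOR
area_m2 = TARGET_POWER_W / usable_w_per_m2

print(f"Power per square meter: {usable_w_per_m2:.0f} W")
print(f"Array area needed: {area_m2 / 1e6:.1f} km^2")
# ~306 W/m^2  ->  roughly 16 km^2, an array a few kilometers on a side
```

Under these assumptions, that works out to an array a few kilometers on each side, the kilometer scale the company describes.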
Why Space?
On Earth, data centers face three growing problems: land scarcity, grid strain, and water consumption. Large facilities consume millions of liters of water annually for evaporative cooling. StarCloud’s orbital systems instead use a closed-loop cooling system that radiates heat into space as infrared emission, eliminating water use entirely.
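The physics behind that claim is radiative cooling: with no air or water available, the only way to shed heat in orbit is to emit it as infrared radiation, following the Stefan-Boltzmann law. The sketch below is a back-of-envelope estimate; the radiator temperature, emissivity, and per-GPU power are illustrative assumptions, not StarCloud design values.

```python
# Radiative heat rejection in vacuum follows the Stefan-Boltzmann law:
#   P = emissivity * sigma * A * T^4
# The emissivity, radiator temperature, and per-GPU power below are
# illustrative assumptions, not StarCloud design data.

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
EMISSIVITY = 0.9         # assumed for a high-emissivity radiator coating
RADIATOR_TEMP_K = 330    # assumed radiator surface temperature (~57 C)
GPU_POWER_W = 700        # approximate power draw of one H100-class GPU

# Heat radiated per square meter of one-sided radiator facing deep space
flux_w_per_m2 = EMISSIVITY * SIGMA * RADIATOR_TEMP_K**4

area_per_gpu_m2 = GPU_POWER_W / flux_w_per_m2
print(f"Radiated flux: {flux_w_per_m2:.0f} W/m^2")
print(f"Radiator area per GPU: {area_per_gpu_m2:.2f} m^2")
# ~605 W/m^2, i.e. roughly 1.2 m^2 of radiator per 700 W GPU
# (ignoring absorbed sunlight and Earth infrared)
```

A square meter or two of radiator per kilowatt of heat is why the company treats deployable thermal panels as a core engineering problem.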
Operating in a sun-synchronous orbit, the satellites receive constant solar exposure, enabling uninterrupted power generation without batteries or fossil fuels. Johnston says energy costs in space could be one-tenth of those on Earth, even after accounting for launch expenses.
“In space, you get almost unlimited, low-cost renewable energy,” he said. “The only cost on the environment will be the launch. After that, we can achieve tenfold carbon-dioxide savings over the life of the data center compared with running it terrestrially.”
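The arithmetic behind that claim comes down to capacity factor: a panel in continuous sunlight generates around the clock, while a ground-based panel loses output to night, weather, and the atmosphere. The comparison below uses typical illustrative values rather than figures from StarCloud.

```python
# Rough comparison of annual energy yield per square meter of solar panel
# in continuous-sunlight orbit versus on the ground. Efficiencies and the
# terrestrial capacity factor are illustrative assumptions.

HOURS_PER_YEAR = 8766

SOLAR_CONSTANT_W_PER_M2 = 1361   # irradiance above the atmosphere
GROUND_PEAK_W_PER_M2 = 1000      # typical peak irradiance at the surface
CELL_EFFICIENCY = 0.22           # assumed, same panel technology in both cases
GROUND_CAPACITY_FACTOR = 0.20    # assumed average for a good terrestrial site

orbit_kwh = SOLAR_CONSTANT_W_PER_M2 * CELL_EFFICIENCY * HOURS_PER_YEAR / 1000
ground_kwh = (GROUND_PEAK_W_PER_M2 * CELL_EFFICIENCY
              * GROUND_CAPACITY_FACTOR * HOURS_PER_YEAR / 1000)

print(f"Orbit:  {orbit_kwh:.0f} kWh per m^2 per year")
print(f"Ground: {ground_kwh:.0f} kWh per m^2 per year")
print(f"Ratio:  {orbit_kwh / ground_kwh:.1f}x")
# ~2625 vs ~386 kWh/m^2/yr: a factor of roughly 7 under these assumptions
```

A gap in that range is the kind of advantage behind both Johnston’s cost claim and the efficiency figures competitors cite.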
The company didn’t start with the idea of space data centers. Initially, Johnston and his team explored space-based solar power, designing satellites that would beam energy down to Earth. But their analysis revealed that such systems lose around 95 percent of transmitted energy, and would only become viable at a launch cost of $50 per kilogram, far below today’s prices.
Instead, they pivoted: why not move computation to where the power is? At an estimated $500 per kilogram, the economics of placing compute hardware in orbit made far more sense.
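The reasoning is easy to reproduce: if a power-beaming satellite delivers only about 5 percent of the energy it generates, it needs roughly 20 times the solar array, and thus roughly 20 times the launched mass, per useful watt compared with a satellite that consumes its power on board. A rough sketch, with the array mass per kilowatt an illustrative assumption:

```python
# Why moving compute to orbit beats beaming power down, under the numbers
# cited in the article: ~95% transmission loss for beaming, and an estimated
# $500/kg launch cost versus the ~$50/kg the team calculated beaming would
# require. The solar-array mass per kilowatt is an illustrative assumption.

ARRAY_KG_PER_KW = 10.0        # assumed launched mass per kW of generation
LAUNCH_COST_PER_KG = 500.0    # $/kg, figure cited in the article
BEAMING_DELIVERY = 0.05       # ~95% of beamed energy is lost en route

# Launch cost per kilowatt of *useful* power in each architecture
compute_in_orbit = ARRAY_KG_PER_KW * LAUNCH_COST_PER_KG
beam_to_ground = ARRAY_KG_PER_KW * LAUNCH_COST_PER_KG / BEAMING_DELIVERY

print(f"Compute in orbit:  ${compute_in_orbit:,.0f} per useful kW launched")
print(f"Beam power down:   ${beam_to_ground:,.0f} per useful kW launched")
# Beaming needs ~20x the array per useful kW, which is why it only pencils
# out if launch prices fall by an order of magnitude (toward ~$50/kg).
```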
Competitors to StarCloud
While StarCloud was once alone in pursuing the idea, major technology firms are now moving in the same direction.
In early November, Google announced Project Suncatcher, an initiative to deploy satellites equipped with high-performance processors for machine learning in space. The company said solar panels in orbit can be up to eight times more efficient than those on Earth, offering continuous power without large battery systems.
Elon Musk has said SpaceX plans to deploy data centers within its Starlink network, while Jeff Bezos of Blue Origin predicted gigawatt-scale orbital data centers within the next decade. Even Eric Schmidt, former CEO of Google, recently acquired Relativity Space with a stated goal of enabling compute infrastructure in orbit.
Johnston acknowledges that skepticism remains high, especially around the challenge of heat dissipation, but insists StarCloud’s proprietary radiator technology, designed by co-founder Ezra, addresses the problem through lightweight, deployable thermal panels. Half of StarCloud’s engineering team focuses on refining these systems.
The company’s second satellite will serve as its first commercial offering, with GPU clusters, persistent storage, and continuous access for other satellites and research institutions. StarCloud is also partnering with Crusoe, a U.S. compute infrastructure company, to offer limited access beginning in 2027.
If the concept scales, orbital data centers could dramatically reduce the resource footprint of computing on Earth. By operating beyond land and water constraints, such systems would relieve stress on power grids and cut the carbon intensity of AI workloads.
Still, Johnston acknowledges that the path forward is long. “Anything worth doing is going to be hard,” he said. “If something is too easy, it probably doesn’t have the same potential outcome. So we decided to do the biggest, most ambitious thing we could: build data centers in space.”
For now, StarCloud-1 is simply circling above the planet, running the first GPU-accelerated workloads in orbit. But its success could mark the start of a radical shift in how computing infrastructure is built.
As Johnston puts it, “Within ten years, nearly all new data centers could be built in outer space.”