SpaceX’s recent filing with the FCC for a vast million-satellite data center network, initially met with skepticism, is quickly proving to be a serious undertaking by Elon Musk.
A pivotal development underscoring this ambition was the formal merger between SpaceX and xAI on Monday. The consolidation of Musk’s space and artificial intelligence ventures points to a coordinated infrastructure push, and makes their combined efforts far easier to read as a single plan.
Beyond the merger, the concept of orbital AI data clusters—essentially, interconnected computing networks operating in space—is rapidly solidifying into a concrete proposal. The FCC officially accepted the filing on Wednesday, initiating a public comment period. While this is typically a routine procedural step, FCC Chairman Brendan Carr notably shared the filing on X. Given Chairman Carr’s established record of supporting allies and challenging adversaries, the proposal is widely anticipated to proceed smoothly, provided Musk maintains a favorable relationship with the current administration.
Concurrently, Elon Musk has begun publicly articulating the rationale behind orbital data centers. During a recent episode of Stripe co-founder Patrick Collison’s podcast, “Cheeky Pint,” featuring guest Dwarkesh Patel, Musk presented the fundamental case for relocating a substantial portion of AI computing power into space. His primary argument centers on the fact that solar panels generate significantly more power in space, thereby reducing one of the major operational expenditures for data centers.
“It’s harder to scale on the ground than it is to scale in space,” Musk stated on the podcast. He elaborated, “Any given solar panel is going to give you about five times more power in space than on the ground, so it’s actually much cheaper to do in space.”
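The “about five times” figure is plausible from first principles. Here is a minimal back-of-envelope sketch, not from the article itself: it assumes the standard solar constant (~1,361 W/m²), a near-continuous sun-facing orbit, and illustrative ground-side values (1,000 W/m² peak irradiance, a ~22% capacity factor), and compares average electrical output per square meter of panel.

```python
# Back-of-envelope comparison of average solar power per m^2 of panel.
# All numeric values below are illustrative assumptions, not figures
# from the article.
SOLAR_CONSTANT = 1361.0          # W/m^2, sunlight intensity above the atmosphere
PANEL_EFFICIENCY = 0.20          # fraction of sunlight converted to electricity

# In a dawn-dusk sun-synchronous orbit a panel can face the sun almost
# continuously; on the ground, night, weather, and sun angle cut average
# output to a capacity factor of roughly 20-25% of rated peak.
SPACE_DUTY_CYCLE = 0.99
GROUND_PEAK_IRRADIANCE = 1000.0  # W/m^2, standard test condition at sea level
GROUND_CAPACITY_FACTOR = 0.22

space_avg_w = SOLAR_CONSTANT * PANEL_EFFICIENCY * SPACE_DUTY_CYCLE
ground_avg_w = GROUND_PEAK_IRRADIANCE * PANEL_EFFICIENCY * GROUND_CAPACITY_FACTOR

print(f"space:  {space_avg_w:.0f} W/m^2 average")
print(f"ground: {ground_avg_w:.0f} W/m^2 average")
print(f"ratio:  {space_avg_w / ground_avg_w:.1f}x")
```

With these assumed inputs the ratio lands near six, in the same ballpark as the quoted fivefold claim; the exact number depends heavily on orbit, panel temperature, and ground site.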
A closer look at the economic premise, however, raises questions. Solar panels do produce more power in space, but that alone does not establish that orbital operations are cheaper: power is not the only cost of running a data center, nor are solar panels the only available power source. Dwarkesh Patel raised this point on the podcast, along with concerns about maintaining and replacing GPUs that fail during intensive AI model training.
Despite these challenges, Musk remained undeterred, designating 2028 as a critical tipping point for orbital data centers. He confidently asserted, “You can mark my words, in 36 months but probably closer to 30 months, the most economically compelling place to put AI will be space.”
Musk went further with a bolder forecast: “Five years from now, my prediction is we will launch and be operating every year more AI in space than the cumulative total on Earth.”
For context, global data center capacity is projected to reach an estimated 200 GW by 2030, representing approximately a trillion dollars’ worth of infrastructure if developed solely on Earth.
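Those two figures imply a useful benchmark. A quick sanity check, using only the numbers quoted in this article, gives the implied capital cost per watt of Earth-based capacity:

```python
# Implied capital cost per watt, from the figures quoted in the article.
projected_capacity_w = 200e9   # 200 GW of global data center capacity by 2030
projected_capex_usd = 1e12     # ~one trillion dollars of infrastructure

cost_per_watt = projected_capex_usd / projected_capacity_w
print(f"${cost_per_watt:.0f} per watt of data center capacity")  # $5 per watt
```

Any orbital alternative would need to beat that roughly $5-per-watt benchmark, launch costs included, to become “the most economically compelling place to put AI.”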
This strategic alignment is undeniably advantageous for Musk, particularly as SpaceX’s core business revolves around launching payloads into orbit, now complemented by its integrated AI company. With the newly formed SpaceX-xAI conglomerate reportedly heading for an IPO in the coming months, increased discourse surrounding orbital data centers is highly probable. Given that technology companies continue to invest hundreds of billions of dollars annually in data center infrastructure, there is a distinct possibility that a portion of this substantial investment could eventually transition beyond Earth’s atmosphere.
The Editorial Staff at AIChief is a team of professional content writers with extensive experience in AI and marketing. Founded in 2025, AIChief has quickly grown into the largest free AI resource hub in the industry.