Claude Code Doubles Rate Limits for Paid Plans After SpaceX/xAI Compute Deal


On May 6, 2026, Anthropic announced a landmark compute agreement with SpaceX and xAI granting access to the full capacity of the Colossus 1 data center in Memphis, comprising more than 300 megawatts of GPU infrastructure. The immediate user-facing result is that Claude Code rate limits for Pro and Max subscribers have been doubled, peak-hour usage caps have been removed, and request volume for Claude Opus models has increased sharply. The partnership also includes a roadmap to co-develop multi-gigawatt compute capacity deployed in space, positioning Anthropic for long-term infrastructure independence.


A Compute Deal at Unprecedented Scale

On May 6, 2026, Anthropic announced that it had secured access to the entire capacity of the Colossus 1 data center in Memphis, Tennessee, operated by SpaceX and xAI, through a new long-term compute agreement. The facility comprises more than 300 megawatts of GPU infrastructure, making it one of the largest AI compute deployments in the world.

The deal gives Anthropic immediate access to significantly more inference capacity than it has previously operated, and it comes with a forward-looking commitment: Anthropic, SpaceX, and xAI intend to co-develop multi-gigawatt compute installations designed to be deployed in orbit, a move that would place AI inference infrastructure beyond the constraints of terrestrial power grids and cooling systems.

What Changes for Claude Code Users

The most tangible impact is on rate limits. Effective immediately:

  • Claude Code Pro subscribers see their rate limits doubled across all model tiers
  • Claude Code Max subscribers receive equivalent capacity increases
  • Request volume for Claude Opus models, previously the most constrained tier, has increased sharply, making sustained use of the most capable model significantly more accessible
  • Peak-hour usage caps, which had throttled requests during high-demand windows, have been lifted for all paid subscribers

For developers running long autonomous sessions, multi-agent pipelines, or high-frequency tool calls, these changes translate directly into fewer interruptions and lower latency on Opus-class requests.
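Even with higher limits, well-behaved clients should still handle the occasional rate-limit response gracefully. The standard mitigation is retrying with exponential backoff and jitter. Below is a minimal sketch in Python; it assumes a hypothetical client that signals HTTP 429 by raising an exception, and the names `RateLimitError` and `call_with_backoff` are illustrative, not part of any official SDK:

```python
import time
import random

class RateLimitError(Exception):
    """Raised by the (hypothetical) client when a 429 response comes back."""

def call_with_backoff(fn, max_retries=5, base_delay=1.0):
    """Retry fn() on rate-limit errors with exponential backoff plus jitter.

    fn is any zero-argument callable that raises RateLimitError on HTTP 429.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            # Exponential backoff: base, 2x, 4x, ... plus a small random jitter
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)

# Usage: simulate an endpoint that rejects the first two calls
calls = {"n": 0}
def flaky_request():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError("429 Too Many Requests")
    return "ok"

print(call_with_backoff(flaky_request, base_delay=0.01))  # prints "ok"
```

The jitter term prevents many clients from retrying in lockstep after a shared throttling event, which would otherwise re-trigger the limit.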

Strategic Context: Musk, xAI, and the SpaceX Merger

The deal carries notable context. Earlier in 2026, Elon Musk β€” whose SpaceX now provides the compute β€” was publicly critical of Anthropic and its safety-focused positioning in the AI industry. SpaceX and xAI completed a corporate merger earlier this year, consolidating Musk's AI and space infrastructure under a unified entity.

The Anthropic-SpaceX/xAI agreement is therefore striking both for its scale and its counterintuitive pairing. Anthropic's CEO Dario Amodei framed the partnership as pragmatic: access to the world's largest available GPU cluster accelerates Claude's capabilities and availability, regardless of competitive dynamics at the organizational level.

For users, the subtext matters less than the outcome: more compute means better access, higher throughput, and fewer rate-limit interruptions during critical work sessions.

The Road to Gigawatt-Scale Space Compute

The most forward-looking element of the announcement is the joint roadmap for orbital compute infrastructure. Current large-scale AI inference is constrained by the availability of power and cooling on the ground, both resources that data centers compete for intensely. Orbital deployment eases these constraints: solar power is abundant in space, waste heat can be radiated directly to space, and latency to ground-based users can be managed with low-earth-orbit satellite constellations.

SpaceX's Starship and Starlink infrastructure provide a credible path to deploying and operating data centers in orbit at scale, something no other entity in the world currently has the launch capacity to attempt. For Anthropic, co-developing this infrastructure represents a long-term hedge against terrestrial compute scarcity and a potential competitive advantage as AI model sizes and inference demands continue to grow.

Availability

The rate limit increases and removal of peak-hour caps are live for all paid Claude Code subscribers as of May 6, 2026. No action is required: existing Pro and Max subscriptions automatically receive the expanded capacity.