A certain type of announcement slips quietly into the morning news cycle and keeps reverberating for days. The news from Argonne National Laboratory this past week was one of those. On paper, the Department of Energy, NVIDIA, Oracle, and a few supercomputers with names borrowed from astronomy make it look like just another government collaboration.
But sit with it for a minute and the scale starts to feel different. A single system with 100,000 Blackwell GPUs is not a lab upgrade. That’s a claim.
| Key Information | Details |
|---|---|
| Institution | Argonne National Laboratory |
| Location | Lemont, Illinois (near Chicago) |
| Parent Agency | U.S. Department of Energy |
| Director | Paul Kearns |
| Major Partners | NVIDIA, Oracle, HPE, World Wide Technology |
| Flagship System | Solstice — 100,000 NVIDIA Blackwell GPUs |
| Secondary System | Equinox — 10,000 Blackwell GPUs |
| Expected Delivery | Equinox in 2026 |
| Additional Systems | Minerva, Janus, Tara |
| Software Stack | NVIDIA Megatron-Core, TensorRT |
| Connected Facilities | Advanced Photon Source, DOE scientific instruments |
| Strategic Purpose | Sovereign AI capability, scientific discovery, national competitiveness |
The flagship machine, named Solstice, will be the biggest AI supercomputer in the DOE lab complex. Its smaller sibling Equinox, still enormous by any standard at 10,000 Blackwell GPUs, is already under construction at the Argonne Leadership Computing Facility. Equinox is expected to arrive in 2026, with Solstice to follow. Even though the deadlines are ambitious, there’s a feeling that the participants genuinely think they can meet them, which isn’t always the case with projects this size.
It’s not just the hardware that makes the partnership intriguing. It’s the framework. While the larger machines are being constructed, Oracle is giving researchers immediate access to NVIDIA Hopper and Blackwell systems, so they won’t have to wait two years to begin working. In federal science, such a bridge arrangement was once uncommon. It is now being used as a model. Chris Wright, the Energy Secretary, called it a “commonsense approach to computing partnerships,” which is a neat way to put it, but it ignores how uncommon this level of speed is for government work.
The political timing is difficult to ignore. The announcement frames the project as part of a larger national push and specifically references President Trump’s executive orders on data center permitting and AI leadership. Depending on your point of view, you might interpret that as clever industrial policy or branding. In any case, physics doesn’t give a damn about politics; the machinery is being built.
Jensen Huang, who is always full of metaphors, referred to it as an “AI factory,” his go-to phrase for these kinds of systems these days. The phrase has begun to sound less like marketing and more like a literal description. These facilities actually produce something: trained models, reasoning capabilities, simulations of materials and molecules, and weather runs that used to take years to complete. Argonne intends to connect Solstice and Equinox directly into existing experimental hardware, such as the Advanced Photon Source, so that AI can direct experiments in real time rather than merely analyze them after the fact.
Despite being overshadowed by the headline numbers, there are three other systems worth discussing. Minerva, built with World Wide Technology, is optimized for inference. Janus, developed in collaboration with HPE, focuses on workforce development, which is a polite way of saying that someone needs to train the next generation of researchers who will actually use these machines. Tara completes the group. These systems aren’t glamorous, but in the long run they might matter more than the massive machines people are tweeting about.
The deeper question, which the press release never fully addresses, is whether this level of investment will produce discoveries quickly enough to justify itself. Supercomputing has a long history of both remarkable outcomes and costly setbacks. It’s easy to feel hopeful watching this develop. It is harder to predict whether science will advance at the same rate as silicon.
