Nvidia is launching open-source AI models to speed up quantum computers
The Ising model family targets quantum error correction and processor calibration, two of the biggest obstacles to practical quantum computing

Nvidia has launched a family of open-source AI models called Ising, designed to help researchers and companies build quantum processors capable of running real-world applications. The announcement coincides with World Quantum Day.
The Ising family launches with two model domains. Using a vision-language model architecture, Ising Calibration handles the continuous tuning of quantum processors automatically, reducing a process that previously took days to one measured in hours, according to Nvidia. For error correction, the Ising Decoding component offers two distinct versions of a 3D convolutional neural network — one built to prioritize processing speed, the other to maximize accuracy.
Benchmarked against PyMatching — the prevailing open-source standard for this work — the combined model suite achieves error correction decoding that is up to 2.5 times faster and three times more accurate, Nvidia said.
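To give a sense of what "error correction decoding" means here, the sketch below implements a lookup-table decoder for the toy 3-qubit repetition code. This is not Nvidia's model or PyMatching itself — those handle vastly larger codes with matching algorithms or neural networks — but the job is the same: map a measured syndrome (which parity checks fired) to the most likely physical error, so it can be undone.

```python
# Illustrative only: a lookup-table syndrome decoder for the 3-qubit
# repetition code. Production decoders like PyMatching, or Nvidia's
# Ising Decoding models, do this for much larger codes in real time.

# Parity checks: check 0 compares qubits 0 and 1; check 1 compares qubits 1 and 2.
CHECKS = [(0, 1), (1, 2)]

def syndrome(error):
    """Which parity checks a given error pattern triggers."""
    return tuple((error[a] + error[b]) % 2 for a, b in CHECKS)

# Assume single-qubit flips are the most likely errors; precompute
# the syndrome each one produces.
SINGLE_ERRORS = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
LOOKUP = {syndrome(e): e for e in SINGLE_ERRORS}
LOOKUP[(0, 0)] = [0, 0, 0]  # no checks triggered -> assume no error

def decode(measured_syndrome):
    """Return the most likely error pattern for a measured syndrome."""
    return LOOKUP[tuple(measured_syndrome)]

# A flip on the middle qubit triggers both checks; the decoder finds it.
print(decode([1, 1]))  # -> [0, 1, 0]
```

The speed-versus-accuracy split Nvidia describes is the real-world version of a trade-off visible even here: a lookup table is instant but only handles the errors it precomputed, while heavier decoders recover more error patterns at higher latency.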
"AI is essential to making quantum computing practical," Nvidia CEO Jensen Huang said in a statement. "With Ising, AI becomes the control plane — the operating system of quantum machines — transforming fragile qubits to scalable and reliable quantum-GPU systems."
The two model domains address what Nvidia describes as quantum computing's core bottleneck: qubits are inherently noisy. Sam Stanwyck, Nvidia's director of quantum product, told reporters that current top-tier quantum processors achieve roughly one error per thousand operations — impressive by today's standards, but far short of the one-in-a-trillion threshold that would make them genuinely useful accelerators, according to CIO.
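The gap Stanwyck describes can be made concrete with back-of-the-envelope arithmetic. The two error rates below are the figures quoted above; the million-gate circuit size is a hypothetical round number chosen for illustration.

```python
# Rough arithmetic on the error-rate gap. If each operation fails
# independently with probability p, a circuit of n operations runs
# error-free with probability roughly (1 - p) ** n.

def success_probability(error_rate, n_operations):
    return (1 - error_rate) ** n_operations

n = 1_000_000  # a hypothetical million-gate circuit

today = success_probability(1e-3, n)    # ~1 error per 1,000 operations
target = success_probability(1e-12, n)  # one-in-a-trillion threshold

print(f"today:  {today:.2e}")   # vanishingly small
print(f"target: {target:.6f}")  # essentially certain
```

At today's rates a million-gate circuit essentially never finishes cleanly, while at the one-in-a-trillion threshold it almost always does — which is why Nvidia frames calibration and decoding as the path from noisy qubits to useful accelerators.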
Calibration reduces noise in each processor, while error correction decoding catches and fixes errors in real time faster than they accumulate, Nvidia said. The Ising Calibration model, trained on data spanning multiple qubit types including superconducting qubits, quantum dots, ions, and neutral atoms, outperformed several general-purpose AI models on QCalEval, a benchmark Nvidia developed with quantum partners to evaluate calibration tasks.
Nvidia is pairing the model release with a set of supporting resources: a cookbook covering quantum computing workflows and training data, plus NIM microservices developers can use to adapt the models to their specific hardware configurations. The models can also run locally on researchers' systems to protect proprietary data.
Institutions already adopting Ising include Fermi National Accelerator Laboratory, Harvard John A. Paulson School of Engineering and Applied Sciences, Lawrence Berkeley National Laboratory's Advanced Quantum Testbed, IQM Quantum Computers, Infleqtion, Cornell University, Sandia National Laboratories, and the U.K. National Physical Laboratory, among others, Nvidia said.
Ising joins Nvidia's broader open model portfolio, which includes Nemotron for agentic systems, Cosmos for physical AI, Alpamayo for autonomous vehicles, Isaac GR00T for robotics, and BioNeMo for biomedical research. Model weights are available on Hugging Face and build.nvidia.com, with training frameworks on GitHub under the Apache 2.0 license.
AI and quantum computing have been converging as each technology proves useful to the other. AI tools have been improving quantum circuit design and error correction, while quantum processors show promise for specific AI tasks such as fraud detection and generating synthetic training data. The infrastructure requirements for the two technologies differ significantly, however — quantum systems require extreme cooling and specialized facilities that have kept the technology largely confined to laboratories.
A report published the same day by the Quantum Economic Development Consortium put the global quantum market at $1.9 billion for 2025, per CIO. Looking ahead, the consortium projects annual growth of 30%, with the market expected to hit $3 billion by 2028.