By Jane Lanhee Lee
(Reuters) – Nvidia Corp, whose chips power the bulk of artificial intelligence computing, is positioning itself as a key player in quantum computing with the launch of new software and hardware.
On Tuesday at its GTC developer conference, Nvidia unveiled CUDA Quantum, a platform for building quantum algorithms in the popular classical programming languages C++ and Python. The platform routes each algorithm across quantum and classical processors depending on which system is more efficient at solving the problem.
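For a sense of what that looks like in practice, a minimal sketch in the platform's Python interface might read as follows. The kernel decorator, gate calls and sampling entry point follow Nvidia's published cudaq examples; the snippet is illustrative rather than a definitive account of the product.

```python
import cudaq

# A quantum kernel written as an ordinary Python function:
# a two-qubit Bell-pair circuit.
@cudaq.kernel
def bell():
    qubits = cudaq.qvector(2)       # allocate two qubits
    h(qubits[0])                    # put the first qubit in superposition
    x.ctrl(qubits[0], qubits[1])    # entangle the pair with a controlled-X
    mz(qubits)                      # measure both qubits

# Sample the kernel. Depending on the configured target, cudaq runs
# the circuit on a GPU-accelerated simulator or an attached quantum
# processor, which is the hybrid dispatch the platform advertises.
counts = cudaq.sample(bell, shots_count=1000)
print(counts)  # expected: a roughly even mix of '00' and '11' outcomes
```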
The new platform is named after CUDA, the software most AI developers use to program Nvidia's graphics processing units (GPUs) and which has given Nvidia's chips a huge competitive edge.
“CUDA Quantum will do the same for quantum computing, enabling domain scientists to seamlessly integrate quantum into their applications and gain access to a new disruptive computing technology,” said Tim Costa, Nvidia’s director of high-performance computing (HPC) and quantum.
One difference, Costa said, is that while CUDA is proprietary, CUDA Quantum is open source and was developed with input from many quantum computing companies.
Nvidia also launched a new hardware system called DGX Quantum to connect quantum computers with classical computers. It was designed in partnership with the Israel-based startup Quantum Machines, whose hardware communicates with quantum processors.
“We see more and more demand to integrate these quantum computers with standard computers,” said Itamar Sivan, co-founder and CEO of Quantum Machines.
While quantum computers could eventually perform some calculations millions of times faster than the fastest supercomputers, it is still uncertain when that will happen. And even when they become good enough to be useful, they will have to be paired with powerful classical computers to operate, said Sivan.
“All quantum today is research, not production, and that isn’t going to change next week,” said Costa. With DGX Quantum, researchers will be able to develop hybrid applications and critical methods for quantum computing’s future, he added.
(Reporting by Jane Lanhee Lee; Editing by Richard Chang)