
Saturday, December 9, 2017

Why Google Is Poised to Hit the Next Critical Milestone in Quantum Computing

MOUNTAIN VIEW, CA—Earlier this week, representatives from automotive and airline companies, big banks, software companies, and the military met to learn the basics of quantum mechanics at NASA. And that was only a small part of it. Quantum computing software startup QC Ware hosted the first Quantum Computing for Business, or Q2B, conference at NASA Ames. Businesspeople seriously considering quantum computing met with experts, entrepreneurs, and professors to learn what the technology might one day have in store for their businesses. There has been plenty of progress and a lot of hope, even though quantum computing is still in its 1950s, room-sized, punch-card phase. Importantly, conference participants also learned how to realistically judge how "good" a quantum computer really is.

Rumors have swelled that Google will soon announce "quantum supremacy": a demonstration that its quantum processing device can solve a problem provably and unequivocally faster than a regular computer can. That would be a major milestone in the quantum computing world, and Google hinted at some of the details at the conference. "Press releases always talk about quantum space race in number of qubits," said John Martinis, a University of California, Santa Barbara physicist working with Google on its quantum supremacy project. "It's more than just quantity, it's qubit quality." More on that in just a bit.

As a reminder, conventional computers perform calculations using a series of bits: any physical system whose most basic parts can take on one of two discrete values, like a coin showing heads or tails. A quantum computer instead uses quantum bits, or qubits. During a calculation, a qubit has some probability of simultaneously being zero or one. Quantum computing algorithms perform calculations by manipulating these qubits via the mathematics of quantum mechanics.
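The coin analogy can be made concrete with that freshman-level math: a qubit's state is a pair of complex "amplitudes," and the squared magnitude of each amplitude gives the probability of measuring 0 or 1. Here is a minimal sketch in plain Python (a toy illustration written for this post, not code from Google or QC Ware):

```python
import math

# A single-qubit state is a pair of complex amplitudes (a, b)
# with |a|^2 + |b|^2 = 1. |a|^2 is the probability of measuring 0,
# and |b|^2 is the probability of measuring 1.

def probabilities(state):
    """Return (P(measure 0), P(measure 1)) for a single-qubit state."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

def hadamard(state):
    """Apply the Hadamard gate, a 2x2 matrix that turns a definite
    0 or 1 into an equal superposition of both."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1 + 0j, 0 + 0j)      # a qubit that is definitely 0
plus = hadamard(zero)        # an equal superposition of 0 and 1

print(probabilities(zero))   # (1.0, 0.0)
print(probabilities(plus))   # roughly (0.5, 0.5)
```

Gates like the Hadamard are just matrix multiplications on these amplitude vectors, which is why linear algebra is the working language of quantum algorithms.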
At its core, that math is just probability combined with complex numbers and the linear algebra you may have learned during your freshman year of college.

The next big quantum computing milestone is for some company to show unequivocally that its machine can easily solve problems that are difficult for a classical computer. Not only does that company need a good quantum computer, it also needs the right problem to stump a classical one. That problem must be hard, require a lot of high-quality qubits, and be generalizable to any quantum computer, said Martinis.

In Google's case, the team will set up a quantum circuit by entangling its qubits (essentially, establishing quantum links between them) and then letting the system evolve over time. The rules of quantum mechanics govern how the qubits evolve, but the final measurement can take on different values with different probabilities. Figuring out the possible outcomes, along with the probability of measuring each one, is so complex that a classical computer must simulate the quantum computer outright, and might take weeks to do what the quantum computer does in minutes.

This problem gives the researchers a metric demonstrating not only that the quantum computer has too many qubits to simulate classically, but also that the qubits are good ones: they don't produce wrong values or collapse into regular bits by interacting with the environment or with each other. There are other quantum supremacy problems, too. They aren't especially useful for industry, but they test the limits of regular computing and even the limits of quantum mechanics. And since they require the most advanced classical computers running their fastest algorithms for comparison, these problems will help push the boundaries of regular computing as well.
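The reason classical simulation gets hopeless so quickly is bookkeeping: an n-qubit state has one complex amplitude for every one of the 2^n possible bit strings, so each added qubit doubles the memory a simulator must track. A toy sketch (my own illustration, not Google's actual benchmark circuit):

```python
import math

def uniform_superposition(n):
    """State vector after applying a Hadamard gate to each of n qubits
    starting from the all-zeros state: 2**n amplitudes, each 1/sqrt(2**n).
    A classical simulator must store every one of these amplitudes."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

for n in (1, 10, 20):
    state = uniform_superposition(n)
    print(n, "qubits ->", len(state), "amplitudes")

# Each added qubit doubles the vector: 1 qubit -> 2 amplitudes,
# 10 qubits -> 1,024, 20 qubits -> 1,048,576. Somewhere around 50
# qubits, the state vector alone outgrows any existing supercomputer's
# memory, which is the gap a supremacy experiment exploits.
```

Real supremacy circuits entangle the qubits with random gates rather than leaving them in this simple uniform state, but the exponential storage cost is the same.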

https://gizmodo.com/why-google-is-poised-to-hit-the-next-critical-milestone-1821121798
