Hacker News
Quantum Virtual Machine to accelerate research and learning (blog.google)
96 points by jedwhite on July 19, 2022 | hide | past | favorite | 52 comments



I just tried IBM Quantum Composer[1] after reading through this Colab and finding I didn't know enough about quantum circuits to do anything besides clicking play. Quantum Composer gave me a super simple drag and drop GUI for getting familiar with basic circuits/building blocks.

I made it 20 minutes before having to look up the Bloch sphere (that happens when you start experimenting with 'S' and 'Z' blocks, which add phase shifts). I don't directly use a lot of IBM products, and I had a great experience with this one!

Link: https://quantum-computing.ibm.com/composer
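The S/Z phase-shift behaviour that sends you to the Bloch sphere can be sketched in a few lines of plain Python (a toy illustration of the math, not anything Composer actually runs):

```python
# Minimal single-qubit picture: a state is a pair of complex amplitudes.
# Start in |+> = (|0> + |1>)/sqrt(2), the state H produces from |0>.
r = 1 / 2 ** 0.5
plus = [r, r]

def z(state):
    """Z gate: flips the sign (phase) of the |1> amplitude."""
    return [state[0], -state[1]]

def s_gate(state):
    """S gate: rotates the |1> amplitude by 90 degrees (multiplies by i)."""
    return [state[0], 1j * state[1]]

def probs(state):
    """Measurement probabilities in the computational basis."""
    return [abs(a) ** 2 for a in state]

# All three print ~[0.5, 0.5]: the phase shift never shows up in these
# probabilities, which is why you need the Bloch sphere to see what S and Z do.
print(probs(plus), probs(z(plus)), probs(s_gate(plus)))
```
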


Have you looked into https://qiskit.org ?


Yup, from my understanding and experience IBM is by far the furthest and easiest to use. As in, I can throw up my python shell, write 20 lines of code, and that thing will run my circuit on a real quantum computer in the next 10 minutes.


So this is a tool for developing and experimenting with quantum algorithms without needing access to an actual quantum computer. It runs in Colab and simulates the expected results on a regular computer. I don’t know whether there are other existing tools in this space, but that seems really cool since it lets regular people explore the ideas behind quantum computing, and makes it faster for developers to iterate on algorithms.
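For intuition on what such a simulator actually does: at its simplest, a QVM just tracks the 2^n complex amplitudes of an n-qubit state on a classical machine. A toy two-qubit version in plain Python (a sketch only; real simulators are vastly more sophisticated):

```python
import random

# Two qubits -> four amplitudes, for |00>, |01>, |10>, |11>.
state = [1, 0, 0, 0]  # start in |00>
r = 1 / 2 ** 0.5

def h_on_q0(s):
    """Hadamard on the first qubit: mixes the |0x> and |1x> amplitudes."""
    a, b, c, d = s
    return [r * (a + c), r * (b + d), r * (a - c), r * (b - d)]

def cnot(s):
    """CNOT (first qubit controls the second): swaps the |10> and |11> amplitudes."""
    a, b, c, d = s
    return [a, b, d, c]

def sample(s, shots=1000):
    """Simulate measurement by sampling outcomes with probability |amplitude|^2."""
    labels = ["00", "01", "10", "11"]
    weights = [abs(a) ** 2 for a in s]
    return random.choices(labels, weights=weights, k=shots)

bell = cnot(h_on_q0(state))  # the Bell state (|00> + |11>)/sqrt(2)
shots = sample(bell)
print({k: shots.count(k) for k in sorted(set(shots))})  # ~500 each of '00', '11'
```

The exponential catch is visible right in the data structure: every extra qubit doubles the length of `state`.
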


Not exactly new, no. QC simulators have been in use since before the actual machines existed. Even a schmuck like me developed a QVM for a PhD nearly a decade ago. https://github.com/yvdriess/qvm

Their QVM is probably simulating the exact machine behaviour of their hardware more accurately.


Doesn't Amazon AWS already have this?


We don't want our monthly invoice to be uncertain.


Sorry you must be thinking of classical AWS


That took like 20 seconds. Great catch.



> Several decades ago, quantum computers were only a concept — a distant idea discussed mostly in lecture halls. Flash forward to today, and the race is on to build fault-tolerant quantum computers and discover new algorithms to apply them in useful ways.

Uhh, they’re still a concept. Fast forward to today and that’s why you only have a “quantum virtual machine” - more like quantum vapor ware.


Uhhh no they aren’t, we have working quantum computers today


From https://www.technologyreview.com/2022/03/28/1048355/quantum-... :

> The qubit systems we have today are a tremendous scientific achievement, but they take us no closer to having a quantum computer that can solve a problem that anybody cares about.

-- Sankar Das Sarma, Distinguished University Professor, Condensed Matter Theory Center, Univ. of Maryland

From https://www.linkedin.com/pulse/quantum-computing-hype-bad-sc... :

> Crazy headlines abound: "quantum computing will change life as we know it," "quantum computing will solve global warming," "Quantum computing will revolutionize science and industry," etc etc. These statements are not based on any research or reality at all, they are not even wishful thinking. The number of known quantum algorithms, which promise advantage over classical computation, is just a few (and none of them will "solve global warming" for sure). More importantly, exactly zero such algorithms have been demonstrated in practice so far and the gap between what’s needed to realize them and the currently available hardware is huge, and it's not just a question of numbers. There are qualitative challenges with scaling up, which will likely take decades to resolve (if ever).

-- Victor Galitski, Professor, Joint Quantum Institute, Univ. of Maryland


This can be true, and the part quoted by the OP ("Flash forward to today, and the race is on to build fault-tolerant quantum computers and discover new algorithms to apply them in useful ways.") can also be true.

The reason we don't have "a quantum computer that can solve a problem that anybody cares about" is because of that lack of fault tolerance, not because we don't have some experimental quantum computers that work in laboratory tests.


What has that got to do with the comment you’re replying to?


Depends on how generously you define "working". No one has yet implemented even a single logical qubit.


Well that is incorrect.

Google's experimental quantum computer has been used to build logical qubits using up to 21 physical qubits.

However "The team believes that mature quantum computers will need 1000 qubits to make each logical qubit – Sycamore currently has just 54 physical qubits."

(A "mature quantum computers" is one where the fault tolerance is high enough for it to be generally useful outside experimental settings)

https://www.newscientist.com/article/2283945-google-demonstr...


A logical qubit is by definition one which has a low enough error rate to perform quantum computations. That article is misleadingly using the term "logical qubits" to refer to a grouping of physical qubits which merely attempts to reduce the error rate. The resulting rate is still too high to perform computations, so it cannot yet be called a logical qubit in the accepted sense.


Well "perform quantum computations" is being done by the Google quantum computer, so... they have done it?


If you have to verify it on a classical computer then no.


Good thing then that you don't!

Eg: https://scottaaronson.blog/?p=5122


“A somewhat similar story can be traced back to the 13th century when Nasreddin Hodja made a proposal to teach his donkey to read and obtained a 10-year grant from the local Sultan. For his first report he put breadcrumbs between the pages of a big book, and demonstrated the donkey turning the pages with his hoofs. This was a promising first step in the right direction. Nasreddin was a wise but simple man, so when asked by friends how he hopes to accomplish his goal, he answered: “My dear fellows, before ten years are up, either I will die or the Sultan will die. Or else, the donkey will die.”

Had he the modern degree of sophistication, he could say, first, that there is no theorem forbidding donkeys to read. And, since this does not contradict any known fundamental principles, the failure to achieve this goal would reveal new laws of Nature. So, it is a win-win strategy: either the donkey learns to read, or new laws will be discovered.”


No, we don't. There exists no physical working quantum computer today, and there never has been one. What exists is a theoretical model simulated on classical computers, which is not a physical working quantum computer.

https://scottlocklin.wordpress.com/2019/01/15/quantum-comput...

I wish it weren't vaporware, truly.


I read this post when it came out, nodded to myself and assumed it was correct because there seemed to have been no progress in quantum computing in my lifetime.

Turns out he is wrong: https://www.nature.com/articles/d41586-019-03213-z

I'd invite you to read the blog of that other HN favorite (Scott Aaronson) on the topic:

> For me, though, the broader point is that neither party here—certainly not IBM—denies that the top-supercomputers-on-the-planet-level difficulty of classically simulating Google’s 53-qubit programmable chip really is coming from the exponential character of the quantum states in that chip, and nothing else.

https://scottaaronson.blog/?p=4372

And later, when Jianwei Pan and Chao-Yang Lu got BosonSampling working on a quantum computer:

> While directly verifying the results of n-photon BosonSampling takes ~2^n time for any known classical algorithm, I said, surely it should be possible with existing computers to go up to n=40 or n=50? A couple weeks later, the authors responded, saying that they’d now verified their results up to n=40, but it burned $400,000 worth of supercomputer time so they decided to stop there. This was by far the most expensive referee report I ever wrote!

https://scottaaronson.blog/?p=5122
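A back-of-the-envelope way to see why that verification burned $400k: brute-force classical simulation has to track 2^n complex amplitudes. (BosonSampling uses photons rather than qubits, but the qubit statevector gives the same flavor of exponential blow-up.)

```python
# Each qubit doubles the number of amplitudes a classical simulator must
# track, at 16 bytes per complex amplitude (two 64-bit floats).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n, unit, scale in [(30, "GiB", 2 ** 30), (40, "TiB", 2 ** 40), (50, "PiB", 2 ** 50)]:
    print(f"{n} qubits: {statevector_bytes(n) / scale:.0f} {unit}")
# 30 qubits: 16 GiB -- a beefy workstation
# 40 qubits: 16 TiB -- supercomputer territory
# 50 qubits: 16 PiB -- beyond any machine today
```
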


https://www.nature.com/articles/s41586-022-04725-x

> Quantum computational advantage with a programmable photonic processor

There is a programmable photonic computer that has an advantage over classical computers


Oh, where can I buy one? See it in use at least, physically?


My wife and I have been exploring the current state of quantum computing to apply it to procedural generation and games. I don't think we'll be able to create something we couldn't with classical computing, but just moving from a PRNG to what I always call "quantum chaos" is fun.
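A toy sketch of the idea (the `quantum_coin` helper is made up for illustration; a simulated amplitude plus a PRNG stands in for real hardware randomness):

```python
import math
import random

def quantum_coin(theta, shots=1):
    """Simulate measuring cos(theta)|0> + sin(theta)|1>: each shot yields 1
    with probability sin(theta)^2. On real hardware the randomness would be
    physical; here a PRNG stands in for it."""
    p1 = math.sin(theta) ** 2
    return [1 if random.random() < p1 else 0 for _ in range(shots)]

# A fair coin at theta = pi/4, or a biased one for, say, sparse map features.
fair = quantum_coin(math.pi / 4, shots=1000)     # p(1) = 0.5
sparse = quantum_coin(math.pi / 8, shots=1000)   # p(1) = sin^2(pi/8), ~0.15
```

The nice part for procgen is that the bias knob is a rotation angle rather than a magic probability constant, which maps naturally onto circuit diagrams.
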


have a blog, or any writeups/screenshots/videos of your quantum chaos? I'm interested!


Nothing yet - but I'll be sure to write up something as we go!


I hope the Haskell-based Quipper [1][2] quantum programming language will get more attention and support through this QVM. The linear types[3] recently added to Haskell make it a very good fit for quantum computing, unlike many other existing languages.

[1] https://www.mathstat.dal.ca/~selinger/quipper/

[2] https://hackage.haskell.org/package/quipper

[3] https://ghc.gitlab.haskell.org/ghc/doc/users_guide/exts/line...


I have often wanted to know whether quantum computing is near or one of those technologies that's always 'just around the corner'. Should I commit some time to learning the basics so that I am one of the few with crossover knowledge of, say, quantum computing and agricultural data?


It's the 1950s for it. It's real, people use it, but it's more of a toy and has some breakthroughs ahead of it before it works at scale.

FWIW, after following it closely for about 3 years now, I wouldn't speculatively load up on it. Even the theoretical list of problems that could benefit from it, if it existed at scale, only has a few entries.


I thought so too, until I waded into a post-quantum crypto algorithm paper.

It turns out you need to understand how a quantum computer works in order to design an algorithm resistant to it.


Ok thanks. I appreciate that insight.


https://www.microsoft.com/en-us/research/video/quantum-compu...

^ This is a good intro that doesn't do much hand-wavy pop-sci.


Off-topic but what ever happened to this [1] Google quantum computing breakthrough?

Major buzz happened in 2019 but since then, it feels, crickets. Did the NSA swoop in and silence all publication of further development or something?

[1] https://www.nytimes.com/2019/10/23/technology/quantum-comput...



Major buzz happened because Google made big claims. Were those actually justified, or were the critics cited in the article correct?


If they were shown to be false, I would have expected that discussion to surface occasionally. However, all I've heard is silence.


Seems rather strange to me that this is an emulated process, when the whole point of quantum is that it can do things classical computing can't?


It's not true that quantum computers can compute anything beyond what classical computers can; anything quantum hardware can do, a classical machine can simulate. The advantage is only in resources: for some problems, quantum algorithms have better time complexity.


I'm going to use this to make AI to figure out the killer app for blockchain, then use the infinite funds I generate to build a fully self driving car.


Is there a term for always pointing out that something is overhyped? It's becoming a thing now.


People thought that cars were overhyped. Napoleon mocked steamships. It’s not a new phenomenon.


How much of this is historical survivorship bias, though? There are also Segways.


in Rust or in Go?


Javascript obviously since the language itself operates in a quantum domain even when running on a deterministic x86/x64 architecture.


What types of problems are solvable now that this sort of infra is possible?


It seems pretty clear that a QVM isn't going to magically offer better performance than the underlying hardware. The main benefit, I'd guess as a layman, is that they can write more meaningful quantum logic now and run it somewhere while they wait for the real deal to scale.


Sorry, I'm referring to the actual quantum computing hardware that this is emulating. My question is poorly phrased considering the topic.


I think one of the biggest applications for quantum computers which isn't well appreciated is simulation of quantum systems: chemistry and materials science. Similar to the old analog computers which were used to simulate differential equations long before they could be fully "virtualized" (simulated deterministically without noise) using digital algorithms.


There are several categories of computational problems that quantum computers will be able to solve at better computational complexity than classical computers. One famous example is Shor's algorithm, which is considered a practical threat to much public-key cryptography that depends on integer factorization being hard.
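For intuition, here is the classical skeleton of Shor's approach in plain Python (a sketch only; a real quantum computer would replace the brute-force period search with the quantum Fourier transform):

```python
from math import gcd

def factor_via_period(N, a):
    """Factor N from the multiplicative order of a mod N. Finding that
    period r is the step Shor's algorithm speeds up exponentially; the
    brute-force loop below is what's intractable classically at scale."""
    r, x = 1, a % N
    while x != 1:                 # smallest r > 0 with a^r = 1 (mod N)
        x = (x * a) % N
        r += 1
    if r % 2:
        return None               # need an even period; retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root of 1; retry
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_via_period(15, 7))   # (3, 5)
```

For 15 this is instant; for a 2048-bit RSA modulus the period search dwarfs the age of the universe, which is exactly the gap Shor's quantum subroutine closes.
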



