This seems very different from the view I picked up in the 90s, when Dan Simon's discussion of using a QC for factorization introduced me to the field, and by 2000 there were single-qubit computers. Sure, it may have been theorized in the 80s, but execution is key. This optical computer has been demonstrated and is well beyond the theoretical stage. Also, our ability to prototype circuits of any kind is ridiculously more powerful than 80s tech.
If it's economically viable, I'd bet we'll see it within 15 years. There are libraries for quantum algorithms in multiple languages; I remember trying to learn them in Haskell pre-2010, on the assumption that by the time I wrapped my head around it, a decade or two later, there would be computers to run it on. I gave up on that, but from an investor's perspective, a game-changing improvement in classical compute tech is worth considering exposure to.
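For flavor, this is roughly the kind of exercise those libraries involve. A minimal sketch in Haskell, assuming nothing beyond base (a toy state-vector simulation, not any particular library's API): a single qubit as a pair of amplitudes, with a Hadamard gate putting |0> into equal superposition.

    -- Toy single-qubit simulation (illustrative, not a real library's API):
    -- a qubit is a pair of complex amplitudes for the |0> and |1> states.
    import Data.Complex (Complex(..), magnitude)

    type Qubit = (Complex Double, Complex Double)

    -- Hadamard gate: |0> -> (|0>+|1>)/sqrt 2, |1> -> (|0>-|1>)/sqrt 2
    hadamard :: Qubit -> Qubit
    hadamard (a, b) = ((a + b) / s, (a - b) / s)
      where s = sqrt 2 :+ 0

    -- Measurement probabilities are the squared magnitudes of the amplitudes.
    probs :: Qubit -> (Double, Double)
    probs (a, b) = (magnitude a ^ 2, magnitude b ^ 2)

    main :: IO ()
    main = print (probs (hadamard (1, 0)))  -- prints (0.5, 0.5)

The hard part, of course, was never the single-qubit case; it was scaling that mental model up, which is why I assumed the hardware would take decades to catch up.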
Are you seriously confident this optical compute model, if economical, is 30+ years away?