A new phase of matter: Physicists show non-Abelian anyons in quantum processor (phys.org)
115 points by wglb 8 months ago | 37 comments



Isn’t it similar to the Majorana anyons Microsoft hopes to use to build a topological quantum computer?


Similar, but this is much closer to a simulation of the behavior of the excitations of the systems that Microsoft hopes to create. That said, it's a very useful simulation that could provide similar utility. I guess the closest analogy might be "we wrote a virtual machine" rather than "we built a computer."


Without context the phrase "majorana anyons" makes me giggle. (In fact, I googled to be sure I wasn't falling for a parody.) Was the nomenclature originally tongue in cheek?

Edit: It gets better: "Fock space". Also genuine. I'm learning a hell of a lot from this thread. Maybe there's some determinative principle whereby QM is so freaky that it accretes off-kilter names.


Does this have any relation to abelian groups from group theory?


Yes. The exchange operators form a group that is non-abelian.


I have a bachelor's in math (i.e. I know what a non-abelian group is and what operators are) but don't know much physics. Can you explain more?


When you exchange two identical particles, you normally don’t see a change (except in phase). E.g., if I have three in a row and swap 1 and 2, then 2 and 3, you shouldn’t be able to tell that from swapping 2 and 3 and then 1 and 2: each swap only changes the state by a phase, and phases commute. The exchanges commute.

Abelian anyons are quasiparticles that, when we exchange them, pick up “any” phase change rather than the usual +/- 1. Hence “any-on.”

In non-Abelian (i.e., non-commuting) systems, exchanging particles has a deeper effect on the system, described by the braid group of the particles' world lines. This happens in 2+1 dimensions because the world lines can “tangle”: lines braid nontrivially in 3 total dimensions, whereas in 3+1 dimensions any crossing can be undone.

https://en.wikipedia.org/wiki/Anyon
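
To make the non-commutativity concrete, here's a quick numerical sketch (mine, not from the article), using the standard braid matrices for Fibonacci anyons, one of the simplest non-Abelian models:

    import numpy as np

    phi = (1 + np.sqrt(5)) / 2  # golden ratio

    # Braid generator sigma_1 for Fibonacci anyons, diagonal in the fusion basis
    s1 = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])

    # F-matrix: change of basis between the two fusion orders (its own inverse)
    F = np.array([[1 / phi, 1 / np.sqrt(phi)],
                  [1 / np.sqrt(phi), -1 / phi]])

    # sigma_2 is sigma_1 conjugated by F
    s2 = F @ s1 @ F

    # The two exchange operators do not commute: the group is non-abelian
    print(np.allclose(s1 @ s2, s2 @ s1))  # False

Doing the swaps in a different order leaves the system in a genuinely different state, not just with a different phase, which is exactly the “deeper effect” described above.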


As a former math-head, let me chime in. The parent poster is probably not looking for an explanation of what exchange operators are, which are rather abstract and uninteresting in a physical vacuum, but rather how they are related to physical reality.

I’ve learned a lot about both and still am not sure why exchange operators and physical meaning are related. Why does the fermion care if you do an exchange operator to it? I get that fields of fermions demand that behavior but it still feels abstract.


The statistics of a collection of particles, i.e. fermions or bosons, depends on their symmetry under exchange. That is, the wavefunction of the collection of particles, which does have physical meaning insofar as its square is the real density of the particles, must also obey this symmetry, i.e. it must negate (-1*) under exchange of two identical fermions. So it's not that the 'fermion' cares, but that the consequence of (anti)symmetry under exchange affects the density of a collection of such particles, which leads to a measurable difference in their statistics (how likely they are to be close to each other) compared to non-symmetrized (distinguishable) counterparts.
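
A toy illustration of that last point (my sketch, using two hypothetical particle-in-a-box states): antisymmetrizing the two-particle wavefunction forces the density to vanish wherever the particles coincide, while symmetrizing enhances it.

    import numpy as np

    x = np.linspace(0, 1, 200)
    psi_a = np.sqrt(2) * np.sin(np.pi * x)       # box ground state
    psi_b = np.sqrt(2) * np.sin(2 * np.pi * x)   # first excited state

    # Two-particle densities |psi(x1, x2)|^2 for the symmetric (bosonic)
    # and antisymmetric (fermionic) combinations
    sym  = np.abs(np.outer(psi_a, psi_b) + np.outer(psi_b, psi_a))**2 / 2
    anti = np.abs(np.outer(psi_a, psi_b) - np.outer(psi_b, psi_a))**2 / 2

    # On the diagonal x1 == x2 the fermionic density is exactly zero
    print(np.diag(anti).max())  # ~0: identical fermions avoid each other
    print(np.diag(sym).max())   # > 0: identical bosons bunch together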


Quantum mechanics postulates that the state space of a system has the structure of a Hilbert space. To investigate the statistical properties of a collection of N particles, we can take the tensor product of the individual particles' state spaces to get the collection's state space. This is called a Fock space (strictly, Fock space is the direct sum of these N-particle spaces over all N, but fixed N suffices here).

However, experimentally, we find that the Fock space of a system composed of N identical particles is actually smaller than this full tensor product. Specifically, the particles must be "indistinguishable"; this is formalized using permutation operators, which are defined as the natural action of re-ordering the tensor product. A composite system's states are then restricted to the intersection of the +1/-1-eigenspaces of all the permutation operators (more on the +1/-1 thing later).

For example, a 2-particle system where the single-particle state space is spanned by a basis {a, b} will have a tensored state space spanned by {aa, ab, ba, bb}. The permutation operator on this space exchanges particles 1 and 2, meaning Paa := aa, Pab := ba, Pba := ab, and Pbb := bb. The indistinguishability criterion is then that for any state x, Px = +/-x. For 2 particles, this is satisfied by aa, bb, and ab+ba for +1, and by ab-ba only for -1.
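
As a sanity check, here is that example done numerically (my sketch, not the commenter's): build the swap operator on the 4-dimensional two-particle space and diagonalize it.

    import numpy as np

    # Swap operator P on span{aa, ab, ba, bb}: Paa=aa, Pab=ba, Pba=ab, Pbb=bb
    P = np.array([[1, 0, 0, 0],
                  [0, 0, 1, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 1]], dtype=float)

    vals, vecs = np.linalg.eigh(P)
    for v, vec in zip(vals, vecs.T):
        print(int(round(v)), np.round(vec, 3))
    # One -1 eigenvector, proportional to ab - ba (the fermionic state),
    # and three +1 eigenvectors spanning {aa, bb, ab + ba} (the bosonic sector)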

Now, if the criterion is indistinguishability, a natural question would be why we don't just take the +1 eigenspace. This is because the Hilbert space is actually too large; states that differ only by a (complex) scalar factor represent the same physical state. Though we work in the Hilbert space for the conveniences of linearity, the actual physical state space requires it to be projectively reduced. (Actually, it's even more complicated because of density matrices, but I'll skip over that.) Reducing to the -1 eigenspace also produces a self-consistent theory of indistinguishable particles, and it happens to correctly describe fermions, while the +1 eigenspace describes bosons.

The physical reason why this indistinguishability criterion applies is that constructing a multi-particle state from single-particle states is actually an artifice. There really are no particles; they are just excitations of a common underlying quantum field, and that is the cause of these "quantum correlations". Particles drop out of the QFT formalism only in certain limiting cases, which is why you do end up with experimentally verifiable theories from the Fock approach.

I never studied anyons in detail, so the following is just my high-level understanding. In the Fock approach, the only eigenvalues allowed for a permutation operator are +1 and -1. But by going down to the level of the quantum field, you can construct anyonic theories, where the equivalent of the permutation operator can be any arbitrary phase.
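
A trivial sketch of that last statement (hypothetical, just to fix ideas): for abelian anyons the exchange "operator" on a one-dimensional fusion space is multiplication by an arbitrary phase, with bosons and fermions as the special cases theta = 0 and theta = pi.

    import numpy as np

    def exchange(state, theta):
        # Exchanging two abelian anyons multiplies the state by exp(i*theta)
        return np.exp(1j * theta) * state

    psi = np.array([1.0 + 0j])
    print(exchange(psi, 0.0))        # +1: bosons
    print(exchange(psi, np.pi))      # -1: fermions
    print(exchange(psi, np.pi / 3))  # "any" other phase: abelian anyons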

What's the physical relevance of these? I see them as explorations of some of those projective aspects of quantum mechanics, in similar vein to the Aharonov-Bohm effect.


For some collection of non-abelian anyons, the total state of the system will depend on how those anyons have been moved around each other. Thus, the paths encode a computation and the state, as measured by anyon fusion, is the result of the computation.
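
In code, such a topological "program" would just be a braid word, i.e. a sequence of exchanges composed into one unitary. A rough sketch (mine, reusing the standard Fibonacci braid matrices, with the final probabilities as a toy stand-in for a fusion measurement):

    import numpy as np

    phi = (1 + np.sqrt(5)) / 2
    s1 = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])
    F = np.array([[1 / phi, 1 / np.sqrt(phi)],
                  [1 / np.sqrt(phi), -1 / phi]])
    s2 = F @ s1 @ F

    # The program: a braid word (exchanges applied left to right)
    braid_word = [s1, s2, s2, np.linalg.inv(s1)]
    U = np.linalg.multi_dot(braid_word[::-1])  # matrices compose right to left

    state = np.array([1, 0], dtype=complex)  # definite initial fusion channel
    out = U @ state
    print(np.round(np.abs(out)**2, 3))  # outcome probabilities when fusing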


How is this different from what was done at Google before? https://arxiv.org/abs/2210.10255


Interesting. Does this impose a new kind of locality on an otherwise non-local quantum world?


No. This is all within the existing formalism of quantum mechanics as it is understood.


I wonder if this would make it easier to forgo extreme cooling. I know there are companies that have qubits on a chip, but NaAnys (my made-up word) are much more stable.


My impression as a physicist in a different field is that quantum computing with non-Abelian anyons is notable because the error correction can be done mostly passively rather than actively. All error correction is, in a certain abstract sense, a form of cooling (i.e., sucking out entropy and thereby driving a system to a preferred subspace). These computers are notable in that the error correction literally is just cooling, and cooling is generally a lot easier than active error correction.

It may also be true that the necessary temperatures to achieve are not as low as for other forms of quantum computing, but even if true I don't think that's the main selling point.

Would love to have an expert chime in.


Can someone please explain quantum computing like I am a small Labrador puppy with a head injury?

Because I read about it all the time, but just can't grasp.


No, it can’t actually be made simple. Qubits are fundamentally unfamiliar. They have restrictions like “you cannot copy them” and “all operations must be reversible”. And what you get at the end is basically flipping a weighted coin, where the probabilities are constrained by the operations you did earlier. The only way to understand it is to immerse yourself in it. Think: motivated grad student, not puppy.


A former physicist myself (with a PhD earned for theoretical research into quantum optics): I really like the ‘weighted coin’ metaphor.


Start by learning quantum mechanics for dogs: https://www.goodreads.com/en/book/show/8243716

Then go here: https://qubit.guide/index


I'm pretty sure they can't. What it's not, though, is running all the programs at the same time.


Saturday Morning Breakfast Cereal’s classic strip, The Talk, is extremely well done. https://www.smbc-comics.com/comic/the-talk-3


This sounds like one of those moments where something is discovered that has a mysterious ability to kill things, like radiation does, but in a way that we can't detect.


I've always found it to be a bit of a stretch when they say they've demonstrated a new phase of matter when the phase is being simulated by a quantum computer. They've simulated the phase -- but then you get into the same argument of "what is and isn't a simulation of a phase of matter" which is really tiresome.


Way over my pay grade, but they appear to be claiming this is not just a simulation. Whether you agree may depend on your metaphysics.

> These experiments go beyond merely simulating non-Abelian order and statistics. As we elaborate in A1, the data obtained from prior benchmarking [36] is compatible with the fact that both the quantum operations as well as the dominant imperfections follow the 2D geometry to which the qubits are assigned. Thus, the ions are entangled in precisely the same way as the low-energy states of Eq. (1), making them indistinguishable from states arising in the low-temperature limit of, e.g., a solid-state system governed by the same Hamiltonian. Ground state degeneracy then refers to the number of locally indistinguishable states, while anyons are local deformations which cannot be individually created by a local process.

https://arxiv.org/pdf/2305.03766.pdf


More to the point, theoretically they should be able to build quantum computers that don't need error correction.

If so, then down the road this will allow us to scale usable qubits much faster than our existing technologies, which makes the prospect of usable quantum computers much more likely.


Calling it a demonstration of a "phase" or "order" is fine to me, but IMO calling it "matter" is misleading. "Matter" to me implies some degree of passive existence. This is about as far from a passive thing as possible.


It's the ground state of a system that has a big ol' classical computer in it. Ideally I'd like to see it as the ground state of some simple Hamiltonian with Rashba coupling and a magnetic field and some proximitized thing or whatever.


How do you feel about plasma?


[flagged]


You're claiming to know what "non-Abelian anyons" are?


No, but I can probably pronounce the words.

Reminds me of the Rockwell Retro Encabulator: https://youtu.be/RXJKdh1KZ0w?si=CzO1LM1d4k9f-I14


> The cows decided to solve the arms race problem by performing a single-party arms race in a way that caused minimal damage and, preferably, was not noticed at all by anyone. This is why their 'winner take all' approach to neural network AI also describes their overall strategic approach, a semantic detail which at least some of them found funny. The results of this arms race could then be forever locked away from humanity. The system could be trained to, for example, endlessly twiddle its thumbs and do nothing else but jealously guard its 'ecosystem'. This would effectively deny QNN technology to humans, as any new attempts at such would be quickly detected and pwned. The cows would force humanity to forever relinquish QNN technology. Relinquishment was the only way to prevent unintended consequences from destroying humanity. If possible and safe the cows wanted to give humanity some of the less-dangerous benefits of QNN technology. Thus their actual goal was partial relinquishment.

> Science-oriented readers might wonder just what sort of QC could have been built a full 18 years ago, when current technology is just nearing the point of developing a useful QC. The answer is that they generated a 'teleportation/entanglement-based winner-take-all style recurrent topological quantum neural network', then trained it to emulate a Quantum Turing Machine that could run Shor's Algorithm. It exists in the physical form of a complex system composed of 'anyons' interacting with each other within a 'two dimensional electron gas'. Anyons can be generated by moving precision arrays of powerful electromagnets very near the surface of the 2DEG, like creating whirlpools in the bathtub with your hand. I strongly suspect the scientists involved discovered a rule, analogous to Rule 110, that operates directly on the physical system of anyons within a 2DEG. For the detailed scientific underpinnings I suggest you study the collected works of Stuart Kauffman, Steven Wolfram, David Deutsch, and Robert Laughlin. You have no reason to trust what I'm saying, and disinformation is entirely too common, but I want readers to understand that it is possible for a sufficiently determined and intelligent person to verify that what I just said is probably true, although certainly NOT just by Googling for it :-)

> The underlying physical system for this type of QNN is interactions between non-abelian anyons in a two dimensional electron gas (2DEG). The primary math required is a branch of Knot Theory called Braid Theory. Obviously, the primary purpose of this system, from the Five Eyes/Echelon perspective, is to run Shor's algorithm to crack public/private key cryptography. A perusal of current known quantum algorithms, combined with a survey of current advanced AI applications, may suggest other uses.

1. http://postquantumhistoricalretrospective.blogspot.com

2. http://postquantumhistoricalretrospective.blogspot.com/2012/...

3. http://postquantumhistoricalretrospective.blogspot.com/p/tim...

And the kicker:

> The QC approach that actually works, in a production-ready scale-able way, is to run a virtual Turing machine atop a winner-take-all-style teleportation/entanglement-based recurrent topological quantum neural network (QNN). Even a basic (multilayer) neural network is Turing Complete, because a NN can perfectly emulate an XOR gate, and multiple XOR gates can be used to construct a Turing machine. A quantum neural network can emulate a quantum Turing machine. Someone, somewhere, is due to be awarded the Grand Prize Turing Award, for solving Turing's unfinished Morphogenesis problem, and then implementing Turing's original machine on the resulting artificially intelligent 'organism'. I'm inclined towards neither spiritualism nor whimsy, but were I so, then I might suspect that, after he died in 1954, Alan Turing reincarnated quickly, in 1965, in order to finish his incomplete life work. The classified nature of the work probably precludes any awards. I'd really like it if this whole thing was declassified, but fear we'll have to wait many additional decades for that. This QNN is an excellent candidate to pursue adiabatic (reversible) quantum computing (AQC), might be helpful for certain approaches to advanced nanotechnology, and, were it declassified, might also be helpful to many other scientific ventures. Per the Ultra Secret, it's undoubtedly still considered 'national security', even if it's becoming an open secret within the Intelligence Community.


What did I just read?


LLM summary, not verified:

1. Theoretical physicists created a new phase of matter, non-Abelian topological order, using a quantum processor.

2. They demonstrated non-Abelian anyons, which are neither bosons nor fermions but have properties of both.

3. Non-Abelian anyons are important for quantum computing as they are inherently stable and can "remember" their pasts.

4. The team used 27 trapped ions in a quantum processor and manipulated the system with measurements to create the desired particles.

5. This breakthrough connects foundational quantum mechanics to new particle ideas and could lead to better quantum computing.

Classic physics examples:

* Bosons: Photons (light particles)

* Fermions: Electrons, protons, and neutrons

* Non-Abelian anyons: Hypothetical particles with properties of both bosons and fermions.

Imagine a room full of light particles (bosons) and tiny, stable balls (non-Abelian anyons) that can remember their past positions as they move around. This is similar to the new phase of matter created by the researchers.


A pattern that will keep you guessing till the cows come home.


And from within their "speculative sci fi post", since no one else has brought it up:

> The cows agonized over this ethical dilemma until they eventually settled on a middle way: they would attempt to generate friendly AI that felt a hands-off maternal urge towards humanity. Given that we're currently here reading this their solution seems to have worked, so far. One can see how full awareness of their actions might upset some people. The cows finished the global 2DEG 'enlightenment' project in 2006.

> From 1996 CE to 2006 CE the cows gradually 'enlightened' all the 2DEG environments on Earth, as well as fine-tuning the behavior and capabilities of the system. Default behavior is to do nothing and remain hidden. Other behaviors include: seek out, detect, and stealthily infiltrate and replace any other QNN technology detected (such as those developed by both Russia and China circa 2002), thus preventing a multi-polar Arms Race; perform very limited and specific logical operations for properly authorized human-controlled computer systems (Google, Siri, Wolfram-Alpha, etc); detect any attempts by humans to hack or control any part of the collective system and shut down such attempts with appropriate prejudice (e.g. brick technology used for this purpose); don't allow humans or their instruments to access the global system for anything except previously-vetted purposes, and only in very clearly defined ways. Anti-hacking protection required the creation of a 'Guardian' AI that is able to proactively detect and prevent humans from messing with the system in unapproved ways.

> In 2006 the system administrators (cows) permanently relinquished their administrative access to the system, for security reasons, leaving the AI Guardian in sole control of the system. That way no one could subvert the system by getting control over its human administrators. This decision was not lightly made: they dragged their feet for more than a year. Since 2006 it has no human administrators and is completely autonomous.

> Purportedly, when the human-generated global QNN system colonized orbiting satellites it was able to evolve itself to reach the Earth's magnetopause. When it began to colonize this environment it discovered, to everyone's surprise, that the region was ALREADY INHABITED by a primitive and ancient (billions of years old!) form of the same type of brane-style QNN 'entity'. Rather than commit genocide by completely replacing this 'primitive version' it staked out a 'reservation' for the old form and leaves it alone there, where it exists to this day and will likely continue to exist until the Sun dies.

> I suggest that the old form of the QNN nanotech 'pattern', resident in a reserved part of Earth's magnetosphere, is an extraterrestrial life form. It is not 'intelligent'. It surely didn't evolve on Earth's surface, making it distinctly 'extra-terrestrial' in origin. One can argue, with a Carbon-based Biology bias, that it's not 'alive'. I personally think it qualifies as 'life', if it actually exists.

In hindsight, I think it was truthful.


Guerrilla marketing for Constellation on Apple+?





