Hacker News
Landauer's Principle (wikipedia.org)
80 points by layer8 on May 28, 2023 | 32 comments



Inspired by this, during my PhD we actually used a quantum computer to perform classical logic, to investigate whether that can lead to energy savings [1]. Quantum machines are (in principle) reversible, so they may avoid Landauer's cost.

However, there are subtle energy costs that you can hit before getting to Landauer's. The most interesting to me is that the qubits can become entangled with the wires that control them! This reduces the quality of information, and one way around it is to use a lot of energy [2].

[1] https://arxiv.org/abs/2210.10470

[2] https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.89...


Interesting, I suppose you'd use reversible logic for the computation? Non-reversibility is pervasive in ordinary computing, but surely a significant fraction of operations could be done with reversible logic. Fourier transforms are a good example. Sorting is a simple example of an operation that destroys information (log2(n!) bits, roughly n log2(n)). I suppose algorithms would need to be redesigned significantly if reversible computing ever became energy-advantageous.
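A quick back-of-the-envelope sketch of that information count (function name is mine, just for illustration):

```python
import math

def bits_erased_by_sorting(n: int) -> float:
    """Bits of permutation information a sort discards: log2(n!),
    which Stirling's approximation puts near n*log2(n) - n*log2(e)."""
    return math.log2(math.factorial(n))

n = 1000
exact = bits_erased_by_sorting(n)
stirling = n * math.log2(n) - n * math.log2(math.e)
print(f"log2({n}!) = {exact:.0f} bits, Stirling approx = {stirling:.0f} bits")
```

For n = 1000 the exact value and the Stirling estimate agree to within about 0.1%.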


You could also think in terms of reversible blocks. For example, if you had a half-adder, each constituent gate could be thought of as erasing information in normal operation. If instead the half-adder were reversible, you could perform the operation, save the outputs, reverse it, and then erase the inputs; then you only pay the Landauer tax on the input bits (rather than on each intermediate bit). The bigger the block, the more you save.
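The compute / save outputs / uncompute scheme described above (Bennett's trick) can be sketched with a toy bit-level simulation. The gate decomposition and names here are my own illustrative choices, not anything from the comment:

```python
def toffoli(state, i, j, k):
    # flip bit k iff bits i and j are both 1 (reversible, self-inverse)
    if state[i] and state[j]:
        state[k] ^= 1

def cnot(state, i, k):
    # flip bit k iff bit i is 1 (reversible, self-inverse)
    if state[i]:
        state[k] ^= 1

def half_adder_bennett(a, b):
    # bits: [a, b, sum_ancilla, carry_ancilla, sum_out, carry_out]
    s = [a, b, 0, 0, 0, 0]
    forward = [("tof", 0, 1, 3), ("cnot", 0, 2), ("cnot", 1, 2)]

    def run(gates):
        for name, *args in gates:
            toffoli(s, *args) if name == "tof" else cnot(s, *args)

    run(forward)                    # compute sum and carry into ancillas
    cnot(s, 2, 4); cnot(s, 3, 5)    # save the outputs to fresh bits
    run(list(reversed(forward)))    # uncompute: ancillas return to 0 for free
    assert s[2] == 0 and s[3] == 0  # no Landauer tax on intermediate bits
    return s[4], s[5]               # only the inputs a, b remain to be erased

for a in (0, 1):
    for b in (0, 1):
        assert half_adder_bennett(a, b) == (a ^ b, a & b)
```

Because every gate is self-inverse, running the gate list in reverse order restores the ancillas to 0 without any erasure; only the two input bits ever need to be paid for.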


All of quantum computing is done with strictly reversible logic, but this is no loss of computational power. The Toffoli gate is an example of a reversible 3-bit logic gate that is universal: you can transform any classical circuit into a reversible one using this gate.

The trick? You have some input/output wires in your circuit that start with initial value 0 or 1, and some that end up in a 'garbage' state. You don't care about what they end up with, they strictly exist to maintain reversibility.

As a concrete example, the binary operator a OR b is not reversible - it destroys information. But if we make it a three-in, three-out operator f(a, b, c) -> (a, b, c XOR (a OR b)), then it is reversible. Here f is its own inverse, because applying it twice you'd end up with c XOR (a OR b) XOR (a OR b) = c.

Yet if you start with c in an initial 0 state, you can now compute the binary OR of two variables a, b in a reversible manner.
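That construction is small enough to check exhaustively:

```python
def f(a, b, c):
    # reversible 3-bit embedding of OR: passes a, b through,
    # XORs (a OR b) into the third wire
    return a, b, c ^ (a | b)

for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert f(*f(a, b, c)) == (a, b, c)   # f is its own inverse
        assert f(a, b, 0)[2] == (a | b)          # with c = 0, wire 3 holds a OR b
```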


Is there any reason to think that quantum computing can fundamentally do more computation than classical computers? Consider this a leading question rather than a stupid one.


Here’s a recent quote from Scott Aaronson:

> In truth, anything that a QC can calculate, a classical computer can calculate as well, given exponentially more time: for example, by representing the entire wavefunction, all 2^n amplitudes, to whatever accuracy is needed. That’s why it was understood from the very beginning that quantum computers can’t change what’s computable, but only how efficiently things can be computed.

https://scottaaronson.blog/?p=7321
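A minimal sketch of that brute-force classical simulation: store all 2^n amplitudes explicitly and apply gates as tensor contractions (function names are mine). The exponential memory cost is the whole point.

```python
import numpy as np

def apply_one_qubit_gate(state, gate, target, n):
    """Apply a 2x2 gate to qubit `target` of an n-qubit statevector.
    The vector has length 2**n, so memory grows exponentially in n."""
    psi = state.reshape([2] * n)               # one axis per qubit
    psi = np.moveaxis(psi, target, 0)
    psi = np.tensordot(gate, psi, axes=([1], [0]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                  # start in |000>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
for q in range(n):
    state = apply_one_qubit_gate(state, H, q, n)
# uniform superposition: all 2**n amplitudes equal 1/sqrt(2**n)
print(np.round(state.real, 3))
```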


No. Scott Aaronson makes this point repeatedly in his blog. Quantum computers are much faster than conventional computers on a few problems, and practically useless for all the others. That's the whole story.

Quantum computers are not hyper-turing.


There are many problems thought to be classically hard for which efficient quantum algorithms are known.


How much energy do you spend cooling your quantum computer?


ELI5: there's a lower bound (kT ln 2) on the heat that must be dissipated to erase one bit of information.

It's cool because it creates a relation between pure math (information) and physics (entropy).

Maxwell's demon is useful context [1]. In short, it's a thought experiment about trying to "cheat" the 2nd law of thermodynamics, and Landauer's Principle resolves it: the demon must eventually erase its memory, and that erasure costs energy.

[1] https://en.wikipedia.org/wiki/Maxwell%27s_demon
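The bound itself is E = k_B · T · ln 2 per erased bit; a minimal sketch of the arithmetic:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_limit(T):
    """Minimum heat dissipated by erasing one bit at temperature T (kelvin)."""
    return k_B * T * math.log(2)

E = landauer_limit(300)  # room temperature
print(f"{E:.3e} J per bit erased")  # ~2.87e-21 J
```

For comparison, real CMOS today dissipates orders of magnitude more than this per bit operation.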


As a kid I always wondered where the name “Maxwell’s Maniac” came from.

https://gamicus.fandom.com/wiki/Maxwell%27s_Maniac


How is reversible computing related to it?


It is a lower-bound on the cost of erasing a bit. Reversible computing retains enough information to run the computation backwards, resulting in the erasure of no bits.


However

> On the other hand, recent advances in non-equilibrium statistical physics have established that there is no a priori relationship between logical and thermodynamic reversibility.[19] It is possible that a physical process is logically reversible but thermodynamically irreversible. It is also possible that a physical process is logically irreversible but thermodynamically reversible. At best, the benefits of implementing a computation with a logically reversible system are nuanced.[20]


The idea of a logically non-reversible, physically reversible process blows my (admittedly layman) mind. But, the paper Wikipedia linked to on that topic was around 100 pages, so any understanding will have to wait…


Reversible computing cannot lose information or energy. No dissipation allowed.


I'm bothered by the characterization of "erasing" information. Black holes notwithstanding, you can't erase information.

If you rephrase it as transmitting information to your environment it makes far more sense (and becomes rather trivial).


You can't erase it in the physics sense, but you can erase information in an engineering sense, such that it is irretrievable in practice. NAND gates erase information this way; given an output, there's no way to determine in all cases what the inputs were.


A NAND gate doesn't erase information; you still have the inputs. In quantum computers it's more a matter of "overwriting" a qubit with a new value.


Every gate that has more inputs than outputs erases information. (NOT gates don't erase information.)

Quantum gates generally have the same number of outputs as inputs, so they don't erase.


Ahh, clarity. Thanks for posting.

In other words, Landauer's Principle might be rephrased as: "all possible physical realizations of information erasure are (by conservation of quantum information) actually forms of information transmission in disguise."


If I understand this correctly, this puts an upper bound on the computational ability of a Dyson sphere surrounding a star.


For irreversible computation, yup. If reversible computation turns out to be practical, things are a little weirder, and the bounds are... further away.

Stuff like: https://en.wikipedia.org/wiki/Margolus%E2%80%93Levitin_theor... https://en.wikipedia.org/wiki/Bremermann%27s_limit

The Landauer limit is dependent on temperature, though. If you fast forward the universe a little bit to let the CMB cool down, even irreversible computing can get pretty efficient in theory.
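A rough sketch of that temperature dependence, comparing room temperature with today's CMB (~2.7 K); both temperatures are round-number assumptions:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer(T):
    # minimum heat per erased bit at temperature T (kelvin): k_B * T * ln 2
    return k_B * T * math.log(2)

room, cmb = 300.0, 2.725
ratio = landauer(room) / landauer(cmb)
print(f"erasure at today's CMB temperature is ~{ratio:.0f}x cheaper than at 300 K")
```

Since the bound is linear in T, a far-future computer bathed in an ever-colder CMB pays ever less per erased bit.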


dyson spheres compute?


They don't have to. You can interpret the original comment as "puts an upper bound on the computational ability of a Dyson sphere if it dedicated all captured energy to computation".


Depends. For useful computing? See comment by 'icegreentea2. For more abstract take - everything computes something all the time.


One interesting thing that this applies to is brute forcing ciphers. Using this you can calculate how many times over oceans _have to_ be boiled before someone can brute force your encryption key.

The concept of "Universal Security" is also discussed by Lenstra et al in https://eprint.iacr.org/2013/635.pdf
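A rough sketch of that comparison: assume even just cycling a counter through a 256-bit keyspace pays the Landauer minimum once per state change, and compare against the energy to boil Earth's oceans (the ocean figures are round-number assumptions):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 2.725            # be generous: run the computer at the CMB temperature

# Lower bound: cycling a counter through all 2**256 key values costs
# at least one Landauer erasure (k_B * T * ln 2) per state change.
E_count = 2**256 * k_B * T * math.log(2)

# Rough energy to boil Earth's oceans from ~15 C (assumed figures:
# 1.4e21 kg of seawater, 4186 J/(kg*K) heat capacity,
# 2.26e6 J/kg latent heat of vaporization)
E_boil = 1.4e21 * (4186 * 85 + 2.26e6)

print(f"ocean-boilings needed just to count to 2^256: {E_count / E_boil:.1e}")
```

Even at the coldest temperature the universe currently offers, the counting alone dwarfs the energy needed to boil the oceans many times over, before a single key is actually tested.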


There's a hilarious Stack Exchange post about the work required to brute force AES-128/256 [0]. One of the answers quotes Applied Cryptography, which Schneier himself quotes here [1]. Summarizing the strength of AES-256, he writes:

> [The amount of energy required has] nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.

[0] https://security.stackexchange.com/questions/82389/calculate...

[1] https://www.schneier.com/blog/archives/2009/09/the_doghouse_...


A computer is a watched clock with complications. Reading a clock requires dissipation; energy is needed to do anything of interest - that's nature's way. Let me know if you have a better idea. There is plentiful energy anyway, just dig deep. Thermodynamics is essentially right; there are some tricks, but they are not really of any actual profit, due to complexity vs. gain. (For compute, maybe acceleration.)


The Wikipedia article mentions that many prominent figures claim that this principle doesn't exist in physical reality. Oof.


As a non-prominent layman, I’ll point out that experiments have yet to contradict it, and not for lack of trying.


Does this say that at absolute zero temperature the bound is zero?



