The second law of thermodynamics isn't really a fundamental physical law, but rather a promise based on statistics that says "disorder will increase or stay constant in a closed physical system".
That said, it's entirely possible for entropy to spontaneously decrease in a closed system; the probability of this happening is just astronomically small for typical macroscopic systems.
Example:
Suppose you have a system consisting of two compartments separated by a wall, where one of the compartments contains N particles. The system has low entropy because all the particles are on one side of it. Now, if you remove the barrier between the two compartments, the particles will distribute themselves evenly on both sides, increasing the entropy of the system. If we assume that the position of each particle is random, the probability of finding all of them on one side of the system is (1/2)^N, which quickly converges to zero for macroscopic systems, which often contain more than 10^23 particles.
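To get a feel for how fast (1/2)^N dies off, here's a quick back-of-the-envelope sketch (the particle counts are arbitrary examples):

```python
from math import log10

# P(all N independently, uniformly placed particles end up in the left half) = (1/2)^N.
# Work in log10 so the result doesn't underflow to zero.
for n in [10, 100, 1000, 6 * 10**23]:  # the last is roughly a mole of particles
    log10_p = n * log10(0.5)
    print(f"N = {n:.1e}: P(all on one side) ~ 10^{log10_p:.3g}")
```

Already at N = 1000 the exponent is about -301; for anything like 10^23 particles the probability is, for all practical purposes, zero.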
aren't people (and all living organisms) the quintessential example of a local decrease in entropy that results in greater overall entropy? We are highly ordered groupings of matter, but we're really good at churning through the matter and energy around us, and we eventually decompose too.
> aren't people (and all living organisms) the quintessential example of a local decrease in entropy that results in greater overall entropy
Complexity is orthogonal to entropy. The aforementioned barrier-separated box is low in entropy but simple. Upon lifting the barrier, a description of the gas front moving into the vacuum is enormously complex; that state is also higher in entropy than the previous, barrier-separated one. Finally, when the gas is diffuse and entropy is at its maximum, complexity dips back down.
Life is complex. Plants turn solar radiation into complex living structures; they also diffuse waste heat. Animals eat those plants, make more complex stuff, and generate more diffuse waste heat.
> The aforementioned barrier-separated box is low in entropy but simple.
This isn't true in any meaningful objective sense; representing the exact state of the system in some basis (e.g. the position or momentum states of all particles in the system) requires a huge amount of information, corresponding to the entropy. The GP only said it's a "simple" system because we (for arbitrary reasons) don't really care very much about the position of each gas molecule. We as humans are satisfied with describing the system in terms of bulk statistical characteristics like temperature and pressure. However, we would be unsatisfied doing the same thing with a similarly information-rich system such as a microprocessor, because now the bulk characteristics of the system aren't sufficient information to describe the characteristics that humans care about.
"Simplicity" or "complexity" as described by the GP is not a physical quantity; it's a reflection of which precise dynamical behavior we happen to care about. You could probably find all sorts of heuristics that generally match with human intuition, but in the end it's up to opinion.
A living organism is of course a noble endeavor standing against the tide of entropy. But it can do so only with significant influx of energy extracted from its external environment. The cost of constructing such an elaborate order of matter is paid by disorder elsewhere.
As we know, the living stance can be maintained for only a relatively short time, after which it rapidly decomposes to background entropy.
Interesting to speculate: as living beings we are chock full of "micro-environments", and there could be exceptions to the laws of nature as we've understood them that could account for certain mysteries of biological nature.
We're not standing against the tide of entropy. If we're going to use a water metaphor, let's talk about a dam or a water wheel. Life is capturing energy and redirecting it-- we're still increasing entropy, but we're directing it in such a way that allows us to have some semblance of order. We aren't trying to resist anything, we dance in the expenditure of entropy.
Any living organism is high entropy: there are considerably lower-entropy states available to the molecules that make up your body than the arrangement that makes you, you.
A living organism also increases the entropy of the environment.
Yes absolutely! Life is actually a way to increase entropy as it harnesses chemical energy in systems that otherwise wouldn't be able to reach a lower energy state by themselves.
> aren't people (and all living organisms) the quintessential example of a local decrease in entropy that results in greater overall entropy?
Yes, but that's not what the person you replied to was saying. They were pointing out that there can be a global, absolute decrease in entropy; it's just statistically very unlikely.
Sure, but the above point was indicating that even a closed system is capable of decreasing in entropy while remaining consistent with all other physical laws; it's just that the larger the system is, the more rapidly the probability of such a decrease falls toward zero.
The second law of thermodynamics is as fundamental as the uncertainty principle: The former is a result from markov chains and information theory, the latter is a result from fourier analysis of conjugate variables.
What would you consider a "fundamental physical law"?
Would you mind sharing a link to a proof of the second law using Markov chains and information theory? I'd like to learn more about this approach.
I'm surprised that Markov chains would be involved when the laws of physics are deterministic.
The Poincare recurrence theorem has always suggested to me that the second law is not as fundamental as other laws. For a finite system with finite phase space, the state of a system will traverse closed loops, repeating forever with no steady increase or decrease in entropy. (Edit: to be clear, I'm not claiming that what I just described is the Poincare recurrence theorem or that it applies to our universe. But it is worth considering systems where the second law doesn't apply and trying to figure out how and if they differ critically from reality.)
Not that my background is worth anything, but just so you know where I'm coming from, I have a PhD in physics, spent years thinking about the entropy of computation, and wrote parts of the Wikipedia entry on Maxwell's demon. I think much of the disagreement over entropy and the second law comes from how we frame the problem.
I am on mobile now and can't provide a simple link, but it is given in Cover & Thomas, "Elements of Information Theory", in the section that discusses the entropy of Markov processes. I can find pages in Google Books, but they won't zoom big enough to read...
IIRC, the proof requires the Markov chain to be irreducible, and extends to the general case by summing over the irreducible parts; it shows that entropy will stay the same or increase while the chain converges to the stationary distribution over states.
(Although it has now been 20 years since I dealt with these things, so I might be misremembering. Time to reread Cover & Thomas, I guess...)
Cover & Thomas, 2nd Edition, Jul 2006, pg 81, section 4.4 - entropy rate of Markov processes. I did not remember all the conditions needed for this to hold, so please read it if you are interested.
[0] staff.ustc.edu.cn/~cgong821/Wiley.Interscience.Elements.of.Information.Theory.Jul.2006.eBook-DDU.pdf seems to have a copy indexed by Google. I suspect it is not legitimate.
I believe a distinction could be made around the second law of thermodynamics being emergent rather than, well, fundamental.
Put another way: the second law of thermodynamics is not required to fully describe a universe that acts like ours appears to. It will "take care of itself" based on more fundamental descriptions of matter and its interactions.
I'm also comfortable with what appears to be fundamental today no longer being fundamental tomorrow.
Especially as the subject is quantum physics and thermodynamics, this is a very interesting viewpoint. Though I agree with you in principle, many centuries of inquiry about our universe by some of the most intelligent humans to date seems to say that mathematical formulations and ideas are the best (if not only, as with QM) way to think about our universe. No other way has really been as incredibly useful in terms of prediction as a mathematical one. I think you do have a point, but I will continue to take the so-called Copenhagen Interpretation of QM: Shut Up and Calculate.
The uncertainty principle is an absolute and inviolable consequence of pure mathematics; the second law of thermodynamics just makes predictions that are "very very overwhelmingly likely". The probability of the application of the uncertainty principle producing an incorrect prediction (internal to the theory) is zero, whereas the probability of the application of the second law producing an incorrect prediction (again internal to the theory) is non-zero. There is an infinity of difference between the internal "absoluteness" of these theories.
The probability of either theory producing an incorrect prediction external to the theory is much higher, and will be about the same for each theory. That is, it's more likely we're wrong about all of physics than that the second law makes a bad prediction in a bulk system.
The statement of the uncertainty principle that I am aware of is that the product of the standard deviations of conjugate variables (e.g., time and frequency; position and momentum) is bounded from below. This is a statement about the sample space. Note that the standard deviation is the square root of the second central moment, which is an expectation over the ensemble.
I did not remember the exact statement when I posted earlier, but here it is: the (Markov) statement of the second law of thermodynamics is also a statement about the sample space. It says that, in expectation over the ensemble, the relative entropy between the distribution over states and the stationary distribution is non-increasing (and in most systems the stationary distribution is the highest-entropy distribution possible, so absolute entropy is non-decreasing). That is, unless the system starts in a state with higher entropy than the stationary distribution, entropy will not decrease.
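In symbols, the statement I'm recalling from Cover & Thomas is roughly (with μ_n the distribution over states after n steps and π the stationary distribution):

```latex
D(\mu_{n+1} \,\|\, \pi) \;\le\; D(\mu_n \,\|\, \pi),
\qquad\text{and, if } \pi \text{ is uniform, }\;
H(\mu_{n+1}) \;\ge\; H(\mu_n),
```

the second part following from the first because H(μ) = log|X| - D(μ ‖ uniform) on a finite state space X.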
Both are mathematical statements, consequences of pure mathematics, neither of which gives a prediction - they both give ensemble averages, and both are internally perfectly correct and consistent. See Cover & Thomas, 2nd Edition, Jul 2006, pg 81, section 4.4 - entropy rate of Markov processes. Unless, of course, you are referring to a different version of the uncertainty principle which I am not familiar with.
No, the uncertainty principle is a statement about the behavior of non-commuting operators in a Hilbert space. It is not a probabilistic statement. It doesn't even have anything to do with probability until you apply it to a probabilistic interpretation of quantum mechanics, where vectors in the Hilbert space have something to do with probability. The understanding you are referring to is more or less correct, but it is a specific application to certain interpretations of quantum mechanics. It's also useful to think about it this way experimentally (in terms of "uncertainty"), which is why most people learn it this way in early physics classes and where the name comes from.
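For concreteness, the general operator form (the Robertson inequality) for two observables A and B is

```latex
\sigma_A \,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle\, [A, B] \,\rangle\bigr|,
\qquad\text{e.g. } [x, p] = i\hbar \;\Rightarrow\; \sigma_x \sigma_p \ge \hbar/2,
```

where the expectation values only acquire a probabilistic reading once you attach the Born rule to the state vector.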
Perhaps I have made the same mistake as you, and was thinking about the practical but non-generalized way the second law of thermodynamics is usually taught, which includes concepts like "the system will be in this state". The only part of your statement that is still probabilistic is "entropy will not decrease". That's not really true; it probably won't decrease.
>neither of which gives a prediction
"entropy will not decrease" is a prediction. It is possible (albeit overwhelmingly unlikely over large time scales) that this prediction is sometimes false.
> The only part of your statement that is still probabilistic is "entropy will not decrease"
The markov/IT version of the 2nd law is a statement about macrostate entropy (taking the entire microstate ensemble for each macrostate), and in that sense it is not probabilistic. See the Cover&Thomas reference I gave earlier for the exact definition. It is indeed different than how it is usually taught in physics, in which the microstates are differentiated.
I guess we both need to be more careful about mathematical definitions in the future ...
> The second law of thermodynamics is as fundamental as the uncertainty principle: The former is a result from markov chains and information theory, the latter is a result from fourier analysis of conjugate variables.
I don't think enumerating the pedigree of a law gives it the seal of approval of validity. Newtonian physics also has pretty solid pedigree, and yet breaks down in relativistic conditions.
Please note I do not disagree with your conclusion, only with the way you present it.
Any law that's not based on a stochastic process, since stochastic processes can be gamed with a little bit of cleverness. If you don't believe me, just look at HFT.
You may want to read 'Thermal Physics' by Kittel and Kroemer. It is the BIBLE of statistical mechanics and should straighten you out on why a statistical law is just as good as all others. And yes, it goes into great detail about when and how the statistical laws 'emerge', so to speak. Physicists have been 'worried' about what you mention in your comment for about a century, and the science is very mature these days. Used copies start at ~$25.
This argument that the Second Law is a simple logical consequence of basic probability always seems glib to me. Why would probability demand entropy increase as one extrapolates forward in time, but probability not similarly demand entropy increase as one extrapolates backwards in time? In other words, what of Loschmidt's paradox (https://en.wikipedia.org/wiki/Loschmidt%27s_paradox)?
The fact that is not a consequence of statistics is that we observe the universe not to be in a state of maximal entropy. That is sufficient to explain the asymmetry of time, I think. The mystery is -- why wasn't the universe in a state of maximal entropy at time 0?
The anthropic principle requires any self-reflective universe to not be in a state of maximum entropy. If our universe were in a state of maximum entropy, there would be no one around to ask.
That's one of the fundamental questions of physics. Hawking's Cambridge lectures audiobook explicitly covers the topic of what's up with the second law being statistical, why the arrow of time is apparently directional, and possible explanations for how you can answer those questions and explain the existence of the universe without requiring a "cause" for the big bang.
Look at every possible two-particle collision. To simplify, take one of the particles as the initial frame of reference, so you now just have the impact angle, which is random.
Now play time backwards by taking that particle's post-collision path as the frame of reference. You will find a bias in the input angles for these collisions.
The universe started from a low-entropy state. From the big bang, there is nowhere to go but downhill.
At the microscopic level the physics can be time-symmetric (it seems not to be the case; there are T-symmetry violations), but macroscopically the universe had lower entropy in the past (the cosmological arrow of time).
Kaon decay and B meson decay break CP symmetry, and a CP violation is equivalent to a T-symmetry violation (CPT symmetry is preserved only if a CP violation is paired with a T-symmetry violation).
You are right in that we need to assume that the universe started in an extremely small entropy state. This is basically the past hypothesis. (Which can be justified to some extent using anthropic principle and eternal inflation, however these two ideas are very controversial among physicists.)
This is very true, and is the case with many 'laws'. Another example is with centres of mass: take a hollow sphere; clearly the c.o.m. should be in the very middle, yet an object placed inside feels no force whatsoever, so it isn't attracted to the c.o.m. and the idea breaks down (NB: objects outside are attracted predictably).
This is because the idea is purely a tool to make calculations for large groups of particles as easy as for a single one, but it does break down occasionally.
Which law are you thinking of? There isn't a physical law I'm aware of that claims gravitational attraction toward a center of mass; you might be conflating several laws, or thinking of an approximation rule that was never intended to cover one object inside of, or even near, another.
Newton's law of gravitation is stated in terms of a particle to particle relationship.
Center of Mass is the balance point of an object, and objects will always rotate around their center of mass unless constrained, but this doesn't relate to gravity.
You can approximate gravity at a distance from any object by using the object's center of mass, but that approximation breaks down when you're close to it.
I don't think that's true. An object would be attracted to the inner surface of the sphere because that's where the mass actually is. The shape you're describing doesn't have a center of mass the way we traditionally think of it.
An object inside the hollow sphere would in fact be attracted to each individual mass-ful particle on the surface of the hollow sphere. But (assuming uniform density on the sphere) the net effect is 0 (it feels no gravitational attraction whatsoever).
The best way to prove this is to compute the gravitational force between your object and any arbitrary particle on the surface, then do the integration over all the particles (across the 3 dimensions).
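If you don't want to do the integral by hand, here's a rough numerical version of that argument (a Monte Carlo sum over a thin uniform shell with G, the shell mass, and its radius all set to 1; an illustration rather than a proof):

```python
import numpy as np

rng = np.random.default_rng(0)

def shell_force(test_point, n_samples=200_000, radius=1.0):
    """Estimate the net gravitational force on a unit test mass at test_point
    from a thin uniform spherical shell (G = shell mass = 1)."""
    # Sample points uniformly on the sphere by normalizing Gaussian vectors.
    v = rng.normal(size=(n_samples, 3))
    surface = radius * v / np.linalg.norm(v, axis=1, keepdims=True)
    # Inverse-square attraction toward each surface sample (each carries mass 1/n_samples).
    d = surface - np.asarray(test_point)
    r = np.linalg.norm(d, axis=1, keepdims=True)
    return (d / r**3).mean(axis=0)

print(shell_force([0.5, 0.0, 0.0]))  # inside: ~[0, 0, 0]
print(shell_force([2.0, 0.0, 0.0]))  # outside: ~[-0.25, 0, 0], i.e. 1/r^2 toward the center
```

Inside the shell the contributions cancel to within the sampling noise; outside, the result matches treating the whole shell as a point at its center.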
Doesn't evaporative cooling work against the second law somehow? There is a system with some average temperature, and spontaneously some of the molecules end up with enough energy to escape and leave the system, which decreases the average temperature of the liquid that remains.
Countdown to a sci-fi movie where this idea is only half understood and the "proof of concept" device opens a portal to hell, where we meet the real Maxwell's demon...
Now that I've typed this, I want to see it happen.
Several Doom games, and also Half-Life. Ripping a hole into Hell for travel purposes has been done in Doom, the movie Event Horizon, and the tabletop gaming franchise Warhammer 40K.
Which goes to show what a terrible idea this all is ;-).
Also done in an excellent though unfinished series of books, The Salvation War[0]. Albeit travel was only a side effect of the real goal there.
[0] - http://www.tboverse.us/HPCAFORUM/phpBB3/viewforum.php?f=29 - full text of Armageddon and Pantheocide available there; the author decided to abandon the series because someone stole his work to destroy his ability to publish it.
There are some details scattered around the forum. For starters, one quote from the author[0]:
"The third part never got written. We had a contract signed to publish them and they'd been prepared, copy-edited and put into paperback novel format (I actually have the author's preprints) when a . . . . person . . . . stole the copy and published it himself as a torrent. As a result, the whole deal fell through and nobody will touch an already-published work. So, without any possibility of generating return, I ditched the project. There's no chance of going back to it now."
I once saw a more detailed story about the guy who stole the copy and alleged reasons for doing it, but I can't find it right now.
> Note that in the discussed example the reservoir acts as some quantum analogue of the classical Maxwell demon. Namely, having been prepared in a special state, the reservoir is able to decrease the entropy of the system without the energy exchange with it, and can be referred to as a ‘quantum Maxwell demon’ […] In what was discussed above, an electron interaction with the quantum spin does not induce any correlations between the electron and the spin and, therefore, no classical correlations are present. Hence an important distinction between how do quantum and classical Maxwell’s demons operate.
So, it sounds like the "refrigerator-at-a-distance" (and thus energy transmission, when combined with a heat engine) actually (1) is more of a battery, and (2) doesn't interact with classical systems.
I don't think so. There is no energy transmission involved at all. I suspect that there is some entropy transmission, but I didn't see an analysis of that, and the amount is negligible compared to what is already in the quantum mechanical system.
Of much greater surprise to me was the claim that an isolated quantum mechanical system neither gains nor loses entropy. I'm almost as astonished at this as I am at the fact that there are no chaotic quantum mechanical systems because quantum mechanical systems evolve linearly, while chaotic ones evolve exponentially.
The apparent contradiction here should not make you doubt the statement that an isolated quantum system can't gain entropy. It should make you doubt the statement that the universe, or at least the "universe" that appears to us to be gaining entropy, is an isolated quantum mechanical system. In other words, it should make you consider the possibility that we observe an apparent entropy gain because we can only observe a portion of the universe, and that portion is entangled with portions that we can't observe (and might never observe given that the expansion of the universe is accelerating). We never actually observe the pure state of the universe as a whole.
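A quick numerical illustration of that "no entropy change in isolation" claim (von Neumann entropy of a random mixed state before and after an arbitrary unitary; the dimension and random seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log(evals)).sum())

# Random mixed state: rho = A A^dagger / Tr(A A^dagger).
a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
rho = a @ a.conj().T
rho /= np.trace(rho).real

# Closed-system evolution: U = exp(-iH) for a random Hermitian H.
h = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
h = (h + h.conj().T) / 2
w, v = np.linalg.eigh(h)
u = v @ np.diag(np.exp(-1j * w)) @ v.conj().T

print(von_neumann_entropy(rho))                   # some value S
print(von_neumann_entropy(u @ rho @ u.conj().T))  # the same S: unitary evolution preserves it
```

The apparent entropy growth we actually observe then comes from tracing out the parts we don't (or can't) track, which is exactly the point above about only ever seeing a portion of the universe.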
Could have sworn that one of the tricky things about that law was that while it may appear that you are decreasing entropy locally, you are just increasing it somewhere else, thus entropy still increased in aggregate.
Ironically someone downvoted you for this comment. I guess that they didn't understand how exactly true your statement is.
In essence the entire biosphere is a giant heat engine, mainly fed off of the flow of energy from the Sun to Earth to outer space, and in some extreme environments fed by the radiation of heat out from the core of the Earth. Shut off those flows of energy, and the second law would shortly catch up to us and we'd all perish.
Becoming cold requires a flow of energy outwards. If you put perfectly reflective energy shields above our atmosphere and about 2 miles down, we would not get cold. We would just wind up all dead at a uniform temperature.
But as they're frozen, the ratio of possible microstates to each colder, drier macrostate increases tremendously. S = k log W. Entropy sets in well before algor mortis.
Yeah. The impression I get from this article (which is pitifully short on specifics) is that they're going to exploit some of the seemingly-nonlocal properties of QM, like quantum teleportation.
So, a normal refrigerator decreases entropy in one region and increases it in another, but they're directly adjacent regions and the entropy (heat) is moving from one to another along a simple, everyday path (like a heat exhaust tube). It sounds like the researchers have proposed using some quantum-teleportation-like trick to have the heat show up in some unconnected region of space.
Is it sad that my second thought reading that was of weaponization? Depending on how you project this energy, you could put it somewhere very much unwanted. Or the opposite: remove it from somewhere it's very much needed.
Imagine location A experiences an entropy decrease Y. To compensate, location B experiences an entropy increase >Y, and thus the 2nd law of thermodynamics is upheld. Under typical conditions, those two locations must be directly connected. The biosphere of the Earth, which is a local decrease in entropy, receives insolation directly from the sun, and radiates directly into space.
The impression I get from the article is that the newsworthy idea here is that those two locations might not need to be directly connected. This would permit what looks like a perpetual motion machine. But overall the 2nd law would still hold because entropy is increasing somewhere.
Note: I'm not a physicist so don't take my word for this. Just conveying what I think I read.
> if a small theoretical being sat at the door between the hot and cold rooms and only let through particles traveling at a certain speed. This theoretical imp is called "Maxwell's demon."
Maxwell's demon is considered a sleight of hand because it requires energy to perform its task. But if a passive, energy-free equivalent can be found with a novel quantum substrate, then any liquid could be separated into hot and cold pools. I am highly sceptical of such a claim. The difference in temperature between these hypothetical pools would likely be too small to create any useful energy.
> "Although the violation is only on the local scale, the implications are far-reaching," Vinokur said. "This provides us a platform for the practical realization of a quantum Maxwell's demon, which could make possible a local quantum perpetual motion machine."
> For example, he said, the principle could be designed into a "refrigerator" which could be cooled remotely — that is, the energy expended to cool it could take place anywhere.
Perhaps there's another definition, but as far as I'm aware, a PPM has net negative/zero energy input from anywhere, be it local or remote.
That said, apart from hyperbolic misapplication of terminology (not a PPM, no circumventing Second Law), this does sound like interesting research.
This article seems a little light on details, and the original paper (not behind a paywall!) is quite dense, but some of my favorite papers on Maxwell's demon are:
It seems as though the authors are confusing Boltzmann's H-theorem and its cousins expressed in more complicated formalisms with the second law of thermodynamics. That's a very common misconception, often resulting in announcements of apparently great discoveries, while the sober account would be more akin to "we derived a theorem where this special expression, similar to Boltzmann's H-function, does not behave as one would expect based on the H-theorem, which in turn is proven to be valid only for a simplified model of an ideal gas in special conditions." Not so interesting. The second law is an experimental law that concerns macroscopic systems. So far, this law has not been shown to be violated based on any broadly accepted theory.
So this article does Maxwell's demon some injustice. There is a key point about the demon: when he is 'sorting' the particles into the two bulbs or rooms, the gate he is operating is frictionless. In this way, you can see that there is no energy entering the system, yet the entropy is decreasing.
Now, this is where things get interesting to me (please correct me if I'm wrong here). What the demon is adding to the system is information, and it decreases entropy. Natural selection is a sort of Maxwell's demon in its selection process. I have a theory, though, that it all evens out: the more complex/evolved the organism, the more entropy the organism generates itself.
> There is a key point about the demon: when he is 'sorting' the particles into the two bulbs or rooms, the gate he is operating is frictionless.
Exactly. Maxwell's Demon is a magical construct. In reality, any active device that sorts molecules into high and low energy bins would take power to run, and would generate more heat (or other entropy) than it removed by doing the sorting.
While what you said is true, it sort of misses the point. The thought experiment demonstrates that there is some quantity being added to a system each time the demon opens or closes the gate, and that quantity is information. Now we have a relationship between information and entropy.
Edit: I think you might be speaking to my point that in the end it evens out, and that natural selection in turn does create entropy even as it's generating information.
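One way to make that relationship between information and entropy quantitative is Landauer's bound: resetting the demon's one-bit record of each measurement dissipates at least

```latex
E_{\min} \;=\; k_B T \ln 2 \quad \text{per bit erased,}
```

which, in the standard account, is what keeps the demon from beating the second law over a full cycle.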
The second law of thermodynamics just follows from statistics and a large number of interacting particles. It's not actually inviolable as the article says. There are whole books written about far-from-equilibrium thermodynamics, and the fluctuation theorem quantifies the probability of entropy increasing https://en.wikipedia.org/wiki/Fluctuation_theorem
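Roughly, the (Evans-Searles) fluctuation theorem, with Σ̄_t the entropy production averaged over a time t (in units of k_B), says

```latex
\frac{P(\bar{\Sigma}_t = A)}{P(\bar{\Sigma}_t = -A)} \;=\; e^{A t},
```

so entropy-consuming trajectories are allowed, just exponentially suppressed in both the entropy produced and the observation time.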
Right, the second law is simply a result of drawing from a probability distribution that overwhelmingly favors entropy increase; drawing from the tails of the distribution isn't impossible, only highly improbable.
Decreasing entropy locally is usually called a fridge, and I am currently betting my dinner that it is not bullshit. (I don't have the slightest idea what the article is actually talking about.)
Haven't read the paper, but it sounds like they just rediscovered the microcanonical ensemble (but for quantum mechanics). Probably works, probably hard to do anything interesting with it in the near to medium term.
A quick read of the paper shows it's mostly focused on systems with discrete levels; IIRC there's already plenty of thermodynamic weirdness in those, like "negative temperature" states in the Ising model and so forth. They touch on a continuous system in the final section, but that also includes a system with a finite upper bound to its energy (phonons). Wonder if that's related to the potential for 2nd Law violations.
There is a free PDF on the nature site, but the paper is also available on arxiv (https://arxiv.org/abs/1407.4437), the first version on the arxiv dates back to 2014. It's strange, but I find the arxiv version much more readable in terms of typesetting than the polished version in nature. Also there are additional appendices in the arxiv version.
You lost me at "locally". Of course, if you take a closed system A and make it a subsystem of some bigger closed system B, you are going to be able to diminish entropy in A by increasing entropy even further in the rest of B (i.e., B minus A).
My grandmother's freezer was doing this 50 years ago, and I am pretty sure the engineers who designed it did not think their work was fundamental research in any way or form.
The best thing about local and short-term effects is that eventually someone finds a way to extend the space and time boundaries a bit, then a bit more, and then we are talking about astronomical scales.
It could be a nice plot for a sci-fi book, in which science finds a way to defer the rise of entropy to some almost infinitely distant moment in the future (that end of time we've always been expecting) and to move the boundaries of locality out to the observable universe. What a world that would be.
By "locally" I think they mean something like "entropy will decrease within what looks like a closed system, and it might not be immediately obvious where else in the universe the corresponding increase in entropy is occurring".
So not actually "free energy", but perhaps it could be used to stage a convincing demo to potential investors.
I don't see what the big deal is. Let's try the ideal gas. For starters, temperature ∝ kinetic energy.
We have two rooms: in one, the molecules all travel at speed A; in the other, they travel at speed B. Then we open a small window between them.
Eventually, over a long period of time, the temperature in both rooms will settle at a value between that of A and that of B.
Let's try to formulate this mathematically. This is something like the mean value theorem in calculus: f'(C)(B-A) = f(B) - f(A) for some intermediate value C. Here our function f(C) is the equilibrium temperature.
In statistical mechanics we imagine we could count the number of particles -- 10^23 or 10^25, something very large. Some number M of them travel at speed A and the remaining N-M travel at speed B, and we count the probabilities of various mixtures occurring.
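A toy version of that bookkeeping (assuming an ideal gas where every particle contributes the same heat capacity, so the final temperature is just the particle-weighted average; the numbers are made up):

```python
def equilibrium_temperature(n_a, t_a, n_b, t_b):
    """Final temperature when two ideal-gas samples mix with no work done
    and equal per-particle heat capacity: the energy-weighted average."""
    return (n_a * t_a + n_b * t_b) / (n_a + n_b)

# Room A: 1e23 particles at 300 K; room B: 3e23 particles at 400 K.
print(equilibrium_temperature(1e23, 300.0, 3e23, 400.0))  # 375.0 K
```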
"The authors are planning to work closely with a team of experimentalists to design a proof-of-concept system, they said." - Pardon if I'm mistaken, but this seems like a fancy way of saying they have no evidence at all so far.
Quantum Mechanics is pretty damn solid. I expect they'll find their evidence, and figuring out that something is possible in QM is more than good enough reason to try it.
If that turns out to be false then QM will have been disproven, which is even bigger news!
The second law is not violated because the Earth is not a closed system.
Also to a very good approximation, we emit what we absorb. When averaged over a long period, that approximation gets better. (It has been somewhat worse over the last century though.)
We're currently absorbing very slightly more energy than we emit. We call it global warming. Other than that, output balances input.
The atmosphere is a heat reservoir, but just like a water reservoir behind a dam, that doesn't mean it's ever-rising. Energy flows in one side and out the other, but between the two there is room for life.
I recall having an idea at least a bit similar to this years ago when I was heavily studying evolutionary informatics.
Some particularly avant-garde types in that field have posited that the universe, rather than having two constituents -- matter and energy -- has three primary first-order constituents. The third is information. Information is not merely an epiphenomenon of matter and energy but a primary "thing."
If that is the case then there should be an E=mc^2 type equation that relates matter to information and energy to information and all three should be interconvertible. It would then further follow that energy can be converted into information and vice versa in the same way that matter can.
I then imagined a Dyson swarm of solar power satellites that produce a data stream encoding the energy they collect. This stream can be subscribed to and decoded to reconstitute this energy remotely. To globally conserve energy there would have to be a two-way aspect to this -- I imagined the receiver of energy transmitting "challenges" to the swarm that are then "solved" to yield energy stored in the form of the solution. The receiver then receives these solutions and executes them to generate what in effect would look like local perpetual motion. (But in reality energy is still being conserved.) It would look like a cryptographic hashcash-style challenge-response system with proof of work, but the energy input of the POW function can be reversibly extracted elsewhere.
If such a thing were possible and sufficiently efficient and could function in the presence of high latency, this could power a starship among many other things. If it were latency-tolerant it might also be a way to store energy. Save your laptop's power to its hard drive.
It reminds me a little bit of the "telematter stream" propulsion system from Peter Watts' Blindsight.
>Some particularly avant-garde types in that field have posited that the universe, rather than having two constituents -- matter and energy -- has three primary first-order constituents. The third is information. Information is not merely an epiphenomenon of matter and energy but a primary "thing."
Is that really so avant-garde? It seems a bit weird to claim that thermodynamics and the arrow of time are epiphenomena when we can measure the relevant quantities experimentally. The more avant-garde thing seems to be the kinds of papers where they claim space-time or gravity are somehow emergent from entropy/information.
>If that is the case then there should be an E=mc^2 type equation that relates matter to information and energy to information and all three should be interconvertible.
Uhh, why? Even the more exotic physics-of-information work seems to end up equating entropy with space-time or something like that: mass-energy would bend space-time-entropy, but wouldn't be convertible into it.
> Information is not merely an epiphenomenon of matter and energy but a primary "thing." If that is the case then there should be an E=mc^2 type equation that relates matter to information and energy to information and all three should be interconvertible.
I'm not sure I agree with this premise, but one thing I've been musing about is that there's probably an information-theoretic lower bound for the amount of energy it takes to transmit a given quantity of data a given distance. You can pick different points along a curve (higher bandwidth/higher frequency signals need higher power to produce a given SNR, and lower-bandwidth/lower-frequency signals can produce the same SNR at a lower power) but there is some asymptotic energy limit there that you cannot beat unless you have an infinitely sensitive receiver.
Anyway, what I'm going for here is that your unifying theorem there would probably be Shannon-Hartley, since that deals with the transfer (derivative) of information. The "noise" is whatever natural equilibrium opposes the transformation process, and your SNR dB is the equivalent of the rate constant in chemistry.
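For reference, the Shannon-Hartley capacity of a channel of bandwidth B with signal power S and noise power N is

```latex
C \;=\; B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits/s},
```

and in the wideband limit this works out to a hard floor of about N_0 ln 2 joules per bit (the -1.59 dB Shannon limit), which is the kind of asymptotic energy bound described above.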
It's all fairly useless without some idea of how we convert hashes back into energy of course. Without that, there is insufficient data for a meaningful answer.
This corresponds to a minimum amount of energy needed to transmit a given number of bits.
We are nowhere near this limit. However it is an upper bound that guarantees that Moore's Law can't possibly continue for classical computing to the end of this century.
(Part of the interest in quantum computing is that it has no theoretical upper limits at all. However this comes with some very weird restrictions.)
If you could figure out how to remotely power a space habitat with nothing but downloaded prawn streams and lolcats, you, good sir, will end up ruling the galaxy, whether you want to or not.
If useful quantum information-to-energy conversion requires an observer, you will also solve the planet's unemployment problem.