The universe engages in gravitational clustering and stellar fusion, which I believe incorporate information. A huge cloud of ionized protons and electrons has high disorder and entropy (less so if it’s an entire Big Bang universe, in that every neighborhood having its own unique expansion vector is a form of order). As matter clumps into stars and galaxies, it acquires order, kind of. Matter picks up kinetic energy as it clumps, heating up, raising its disorder, then gets even hotter as stars form and undergo fusion. However, that energy is radiated away, reducing disorder. People who know this stuff better than I do discuss it here and here (a discussion from Prof. John Baez, who came up with the Crackpot Index). But if the universe is a closed system, radiated energy should increase disorder somewhere. Except that radiated light loses energy over great stretches of time and distance - the cosmological redshift. I think this lost energy increases the definition and order of space - it works against entropy. So the formation of spatial relationships among particles in the universe increases order.
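To pin down the one standard piece of the claim above - the redshift energy loss itself is textbook cosmology, even if where the energy "goes" is not settled - here's a minimal sketch (the function name is mine):

```python
# Standard cosmological redshift relation: a photon emitted with energy
# E_emit and observed at redshift z arrives with E_obs = E_emit / (1 + z).
# Where the "lost" energy ends up is the speculative part of the comment
# above; this snippet only shows the uncontroversial bookkeeping.

def redshifted_energy(e_emit_ev, z):
    """Observed photon energy (eV) after cosmological redshift z."""
    return e_emit_ev / (1.0 + z)

# A 2.0 eV photon (red light) emitted at z = 1 arrives with half its energy:
print(redshifted_energy(2.0, 1.0))  # 1.0
```

So over cosmological distances the energy loss is substantial: at z = 1, fully half the photon's energy has gone somewhere.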
Furthermore, I think that this order and structure gives the universe the wherewithal to go from one set of information-processing tasks to the next, the way we go from one thought to the next. If the universe is spatially differentiated in a non-Big Bang way, with an active center containing proton-rich galaxies and collapsed outskirts with burned-out, neutron-rich structures, the active universe can re-light structures on the outskirts, perhaps via floods of neutrinos. Matter in the active, Big Bang-looking center of the universe is largely transparent to neutrinos, but matter in the collapsed outskirts, where there is less space, would be more concentrated and more likely to capture neutrinos. Plus, I’m guessing that neutrinos which have traveled long distances will have lost energy to space, just as long-traveling photons do. Unlike photons, neutrinos have mass, so they lose energy by losing velocity. Slower neutrinos should have bigger capture cross-sections. So, yeah, I think the universe can renew.
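The last two steps of that chain can at least be made concrete. Slow massive particles obey v = sqrt(2*E_k/m), and exothermic capture reactions in nuclear physics famously follow a "1/v law" at low energies (whether that law carries over to slow relic neutrinos is an assumption here, and the numbers are purely illustrative):

```python
import math

# Illustrative sketch: nonrelativistic neutrino speed from its kinetic
# energy and mass (both in eV, natural units), plus the textbook 1/v
# scaling of exothermic capture cross-sections. The mass and energy
# values below are made up for the example, not measured quantities.

def speed_fraction(kinetic_ev, mass_ev):
    """v/c for a nonrelativistic massive particle: v = sqrt(2*E_k/m)."""
    return math.sqrt(2.0 * kinetic_ev / mass_ev)

def relative_cross_section(v_over_c, v0_over_c):
    """Capture cross-section relative to reference speed v0, assuming sigma is proportional to 1/v."""
    return v0_over_c / v_over_c

v_fast = speed_fraction(0.0005, 0.1)    # 0.1 c, before losing energy
v_slow = speed_fraction(0.000125, 0.1)  # 0.05 c, after losing 3/4 of its kinetic energy
print(relative_cross_section(v_slow, v_fast))  # 2.0
```

Under the 1/v assumption, a neutrino that has lost three quarters of its kinetic energy moves at half its former speed and is twice as likely to be captured - which is the direction the comment above needs.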
So much wrong in one article. I’m disappointed to see this on Boing Boing.
Information processing is not the inevitable result of complexity. It’s the result of evolutionary pressure toward a very particular type of complexity. Our brains evolved to make predictions, and the utility of more complicated predictions (particularly social predictions) drove our brains to be complicated in particular ways. The fact that the universe has a large-scale structure does not imply that that structure is engaged in information processing.
Not saying that complexity implies information processing. Saying that if complex structures are found where Big Bang theory says it’s too early for complex structures, then Big Bang theory will at least have to be modified.
But I do believe that the universe has plenty of characteristics which are consistent with information processing. Quantum mechanics is heavily associated with the behavior of information. Weird quantum effects are often consequences of missing information, with the double-slit experiment being the most famous example. If there’s no information about which slit a photon traveled through, then it behaves as if it traveled through both slits at once. Quantum waves can be seen as the universe sharing information with itself. Quantum fuzziness can be seen as the universe having finite capacity to define itself.
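The which-path point can be shown in a toy calculation: without which-path information, amplitudes add (so you get interference fringes); with it, probabilities add (so the fringes vanish). This is a bare-bones illustration, not a full quantum treatment:

```python
import cmath
import math

# Toy two-slit model. Each slit contributes a unit-magnitude amplitude;
# "phase" is the path-length phase difference at a point on the screen.

def intensity(phase, which_path_known):
    a1 = cmath.exp(0j)           # amplitude via slit 1
    a2 = cmath.exp(1j * phase)   # amplitude via slit 2
    if which_path_known:
        # Which-path info available: probabilities add, no interference.
        return abs(a1) ** 2 + abs(a2) ** 2
    # No which-path info: amplitudes add coherently, fringes appear.
    return abs(a1 + a2) ** 2

print(intensity(0.0, False))      # 4.0  - bright fringe
print(intensity(math.pi, False))  # ~0.0 - dark fringe
print(intensity(math.pi, True))   # 2.0  - flat, fringes erased
```

The presence or absence of one bit of information (which slit) is the only thing that changes between the two cases, which is exactly why the double-slit experiment gets cited in information-flavored readings of quantum mechanics.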
This little article which disappointed you is only 600 words long, so there’s not much room to make extended arguments. I simply jumped from mentioning large, early structures to my opinion that the universe is an information processor. Didn’t mean to imply that those structures were the proof. You can read the long string of comments following the article for some clarification, or you can go here, where, along with a lot of frippery, I discuss at fairly great length how I think the universe might work. You will find plenty to further annoy you.
Been here before. Look up the history of Cepheid variables. The Big Bang model survived.
Frankly, I don’t know what “the universe is an information processor” even means. Doesn’t seem like the author does either. But it sounds nifty, like Deepak Chopra.
If you read some of the comments below the article, there are some semi-specifics and links about what information processing might mean with regard to the universe. Processes which release energy - fusion, the coalescing of celestial objects via gravity - can be thought of as adding information, especially when the freed radiation loses energy by traversing large distances. If long-lived particles such as protons acquire meaning through context - by linking with other particles - the same way you could build up meaning by linking Leibnizian monads (themselves as simple as can be conceived, but able to build up complexity through association), then these linkages contain information. That’s what “the universe is an information processor” might mean. Plus, points for dropping in “Leibnizian monads.”
I’m familiar with those arguments. This area is sort of my field. I don’t buy them. Information-theoretic approaches are useful for some sorts of problems, not for others, and non-information approaches work as well. To assert that everything can only be understood that way and that the universe is nothing other than information processing is a glib, far-reaching claim based on a small data set that is probably wrong. It also immediately raises the question of how anyone could understand anything about the universe prior to the invention of information processing, if that is truly all that is going on.
Most of the physics world hooted at Wolfram for floating this notion. You can certainly make an argument. I just don’t happen to think it is a good one. It strikes me as a glorified version of Mach’s Principle, which also never really went anywhere.
Not asserting that everything can be understood only via information processing. There are lots of frameworks - QM, GR, maybe string theory eventually. Don’t know what data set you’re talking about. Gas condenses into stars, stars undergo fusion and emit radiation, much of which escapes into space and eventually loses energy to the Hubble redshift. 10^22 stars aren’t a small data set.
There are aspects of the universe which couldn’t be understood until the 20th century brought general relativity and evidence of other galaxies and their redshifts. Not unreasonable that there are aspects which are best understood from an information-processing perspective. Theories of information could be a big deal, but they’re not everything. People could understand aspects of biology before the theory of evolution and the discovery of DNA.
Still a fan of Mach’s Principle, even if it’s never been fully formalized mathematically.
Don’t remember the specific criticisms of Wolfram’s arguments, but some quick Googling seems to indicate that part of the unfavorable reaction was his attempt to build the laws of physics from cellular automata (plus his Segway-like overhyping). Other universe-as-information theorizers such as Wheeler didn’t seem to experience as much hooting.