Some say that the universe itself IS information: that beyond our dependency on information as individuals and as a civilization, information is essential, is fundamental.
Certainly no information is very boring, very static. A basic yes/no, 0/1, opens up all of computer language, but no variation of 0s and 1s, just a universe of all 1s or all 0s, doesn’t get a computer program very far. No information, no this and that. Certainly information is the realm of the relative.
Some say entropy is the heart of any first-this-then-that, that entropy defines time itself. I don’t think that is true, that time is essentially just entropy, and neither do many philosophers, scientists, and other deep thinkers like Lanza and Dogen; still, it is just as true that entropy certainly plays a role in our quotidian experience of time.
While Humpty Dumpty falling off a wall and breaking is an easy and quick thing, it is hard to put Humpty Dumpty back together again. Broken Humpty doesn’t just scrunch back together and fly up that wall, sitting pretty and smiling widely again after the fall. It is possible for that to happen, but it is really, really, really very unlikely.
If you drop an egg and film it, you can pretty much tell whether the film is then played forward or backward. Very rarely you may be wrong. Simply put, broken eggs are more disordered; breaking them took little energy to make happen, just open your fingers and gravity does the rest. An intact egg is very ordered, and took a lot of energy (ask mother chicken) to get there.
Entropy captures that difference qualitatively and quantitatively. It evolved in thermodynamics (for steam engines, trying to understand what all that inefficiency and wasted heat was about) as a measure of the statistical likelihood of how a state will evolve, there being many more “disordered” than “ordered” states available to a system.
As a rule of thumb in physics and chemistry, entropy is really good at predicting what happens, and it implies first this, then that! No engineer can ignore entropy and keep a job.
Information theory also uses entropy as a quality and a quantity to be reckoned with. So how are information and entropy connected? Is it in some sense related to heat wasted, energy spent? Yes: computers heat up and energy is spent, and organization, efficiency, and predictability are all part of it. Information theory was first interested in the efficient transmission of signals in communications. What was important was whether the message, the information, the data, the signal that was sent was the one received.
To start, you could say high entropy is a measure of little information, great disorder, hence ignorance, at least of the details. You can see how such ignorance may impact a message and the information sent. To get an intuitive grasp of this is pretty easy, as is seeing why our perception of time is influenced by entropy, order and disorder, at least relative order and disorder, going back to Humpty Dumpty. When the egg breaks, the yolk is all mixed up with other stuff, shells and albumen, sticky and hard to tease apart, hard to say just where all the yolk is. There are, of course, many more ways for the molecules in a smashed egg to be mixed and splattered than in an intact egg where yolk is here, albumen there, all neatly wrapped in a shell. That’s a measure of the higher entropy of Humpty Dumpty having fallen. And it helps predict which chemical reactions will go which way and how much energy will be wasted in heating up the atmosphere by your engine (or your computer, or your brain, for that matter).
Information theory is at the heart of computer and communication science and needs a more quantitative understanding of ignorance and entropy. We need to know the probability of surprise, and so the extent of ignorance of what happens next.
Information theory, a great intellectual and technical insight by Shannon decades ago, says that the amount of surprise in a message relates to entropy. And in this, entropy relates to ignorance, because you are only surprised if you don’t know what’s coming. You don’t want surprises popping up in your message if you are in charge of communications! You want what is received to be the same as what was sent, no issues of garbled text.
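Shannon’s insight can be stated in a few lines of arithmetic. The surprise of an outcome with probability p is log2(1/p) bits, and entropy is the average surprise over all outcomes. Here is a minimal sketch in Python; the function names are my own, chosen for illustration:

```python
import math

def surprise(p):
    """Surprise (self-information) of an outcome with probability p, in bits.
    Rare outcomes (small p) carry big surprise; certain outcomes carry none."""
    return math.log2(1 / p)

def entropy(probs):
    """Shannon entropy: the average surprise over a probability distribution."""
    return sum(p * surprise(p) for p in probs if p > 0)

# A certain outcome carries no surprise at all...
print(entropy([1.0]))       # 0.0 bits
# ...while a fair coin flip averages one full bit of surprise per flip.
print(entropy([0.5, 0.5]))  # 1.0 bit
```

A message with no surprises (all outcomes certain) carries no new information; a message you cannot predict at all carries the most.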
All of this also suggests that it matters, to some extent, exactly how you ask the question of the system about how much entropy there is.
I am not surprised if a fair, six-sided die, with a different number on each face and those numbers being 1 through 6, comes up with a number 1 through 6 when I roll the die. So if the question I ask is: will I get any number, 1 or 2 or 3 or 4 or 5 or 6, on the next roll of the die, the answer is clearly yes, of course, and simply reflects that I asked a trivial question of a simple system I understood. Probability (p) = 1.0, 100% sure. No ignorance, no high entropy, and not very interesting, either. If I ask whether I will get an even number on the next roll of the die, there is a bit more uncertainty: I have a 50:50 chance of an even or odd number. Still, a “yes” there carries less surprise than if I ask whether I will get a 5 on the next roll, which happens only 1 time in 6. In every case the roll itself is random; I am not surprised whatever number comes up, as any number can come up. But my surprise at the answer differs with each question. Same system, same basic facts, just asked a different way.
Now if I ask whether I will get a number other than 5, then the answer is that, on average, 5 of every 6 times I roll the fair, unbiased die, I will get a number other than 5, a probability of 5/6 for each roll. Good odds, so little (but some) surprise when I get a 1 or 2 or 3 or 4 or 6. Lower entropy, less surprise, and less ignorance. If I don’t get a 5 I am not so surprised; if I do get a 5 I am surprised, and delighted if money was riding on the outcome. Same system each time, a six-sided die with each face having a different number 1 through 6, but different questions, and so different ignorance/surprise/entropy.
But notice that the 1/6 probability of a 5 on the next roll is just 1, the probability I will get some number 1 through 6, minus the 5/6 probability of getting a number other than 5. And vice versa! So these are related. We have constraints based on our question; our delusions don’t come into play, and we don’t get to set the odds however we like them. We could, of course, set the system up with an unfair die.
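The different questions asked of the same fair die, and the complementary probabilities that constrain them, can be checked directly. A small sketch in Python (the helper `prob` is my own, for illustration):

```python
import math

faces = [1, 2, 3, 4, 5, 6]  # a fair die, a different number on each face

def prob(question):
    """Probability that a roll of the fair die answers 'yes' to a question,
    expressed as a predicate over the faces."""
    return sum(1 for f in faces if question(f)) / len(faces)

p_any  = prob(lambda f: 1 <= f <= 6)  # "any number 1 through 6?" -> 1.0
p_even = prob(lambda f: f % 2 == 0)   # "an even number?"         -> 1/2
p_five = prob(lambda f: f == 5)       # "a 5?"                    -> 1/6
p_not5 = prob(lambda f: f != 5)       # "anything but a 5?"       -> 5/6

# The constraint: these questions are complementary. We don't set the odds.
assert math.isclose(p_five, 1 - p_not5)

# Surprise (in bits) of each "yes": the rarer the outcome, the bigger the surprise.
for name, p in [("any number", p_any), ("even", p_even), ("a five", p_five)]:
    print(f"{name}: p = {p:.3f}, surprise = {math.log2(1 / p):.3f} bits")
```

Same die every time; only the question changes, and with it the ignorance, the surprise, the entropy.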
An important idea scientifically is that we have to explain any deviation from random in a system. There is random, then everything else on a continuum of probabilities, of more or less ignorance of the outcome before we roll the die, before we send the message.
So, low entropy is good, right? Entropy is disorder, entropy is ignorance, and we don’t want that. While what we want does matter to engineers and programmers, and to us if we are pure of heart and mind, seeking a way out of suffering for ourselves and others, does it define good and bad to the universe, in terms of the unfolding of creation?
Just as entropy, as an answer to a question about our ignorance, about randomness versus certainty and everything in between, changes depending on the question you ask of the system, and just as the results of a quantum experiment reveal a particle or a wave depending on how you ask the question, so too does the answer to whether entropy is “good” or “bad.” What you are asking and how the answer comes back matters. And the universe may not be asking the same question as your monkey brain at any given moment! Maybe that’s our practice, aligning our questions with those of the unfolding universe!
Maybe you create entropy and ignorance with your expectations!
Anyway, I suppose low entropy is good if you don’t like surprises, but it is a kind of boring universe, that universe of all 5s: a die with a 5 on every face, the same outcome roll after roll.
We think we want certainty. Perhaps that is an illusion.
Now, say we go back to that fair six-sided die with a different number on each face. Large entropy if I ask whether a specific number will come up next. I have no idea which number from 1 to 6 will come up next, so large ignorance, room for surprise! I get more information each roll than with the all-5s die: I get the information that the die was rolled (because a different number comes up, on average, 5 of every 6 rolls), and I discover the resulting number. If I play a game where a 5 wins, every game is life anew. Not so in the universe of all 5s. No games, no surprises, little information.
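The contrast between the fair die and the all-5s die can be made exact with Shannon’s formula, entropy as average surprise. A sketch in Python, with names of my own choosing:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: average surprise, where the surprise of an
    outcome with probability p is log2(1/p)."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

fair_die  = [1 / 6] * 6  # six faces, each equally likely
all_fives = [1.0]        # a 5 on every face: one certain outcome

print(entropy_bits(fair_die))   # ~2.585 bits of fresh information per roll
print(entropy_bits(all_fives))  # 0.0 bits: no surprises, no news, no games
```

Every roll of the fair die delivers about 2.585 bits (log2 of 6); every roll of the all-5s die delivers nothing at all.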
As for the symmetry test, the one that says an object is symmetrical if you can’t detect a change when the state of the system is changed (say a circle rotated when you aren’t looking), I will know 5 of 6 times whether you rolled the die (on 1 of 6 rolls, on average, you will get the same number again), so again more information is obtained, and there is less symmetry, with the die with different numbers on each face than with the one with all 5s. Symmetry is associated with beauty and the absolute, the unending. But it is in the breaking of symmetry that we end a certain kind of ignorance, where things happen, where circles become waves and waves become all things.
Is symmetry beautiful and useful? Kind of depends on what we are looking for, what questions we ask; our state of mind, as it were.
Not so straightforward, this ignorance, symmetry and entropy thing. Those who study chaos theory and complexity theory say the best stuff happens at the edge of chaos. Too random, can’t sustain anything even close to life. Too static, no change, just same old, same old.
After all, a single tone has little entropy; it is very organized, with no surprises, we know exactly what comes next, but it isn’t the most fun music. Similarly, randomly generated static has endless variation and is not organized, but it also can be pretty annoying (though some people thrive on “white noise.” Go figure) and won’t hit the top 40 on the charts.
So entropy can be computed, and ignorance and surprise can be quantified. Computer science and communication science (where this came from) depend on it. But what does that really mean?
We seem to need variation without total randomness in our music and our stories. And what are we without our stories? Maybe liberated? Maybe awake? Maybe it is our state of mind that counts? There is the story about how, when researchers at UCLA studied his samadhi, they were shocked that Yasutani Roshi found each tick of a metronome to be unique; he didn’t adapt to the repetitions, didn’t experience them that way. He seems to have experienced each tick as existing in a universe that is never the same, always changing. No expectations, perhaps. Being truly awake, perhaps.
“What does it really mean?” is a trick question. There is no “really means.” The problem is that when we think in terms of values as determined by our evolved monkey brains, we are constrained by our perspective and scale of living, the way we like our stories. Something happens and we ask ourselves, gee, what will happen next? What does it imply for me, my sense of well-being? A reasonable question, of course, and that’s why we evolved brains to ask and answer it. Does that noise I hear in the bushes mean a lion will jump out and eat me? Does that rhythmic sound coming from my husband encode words that I should interpret to mean that he doesn’t love me anymore? Or is it just atmospheric conditions, sunlight heating the air above earth and sea unevenly, creating wind shaking the leaves that I hear; my husband clearing his throat, expelling bursts of air, nothing more? The meaning of information in that sense is what we project on the universe with our monkey brains. It can be very useful, it can be critical for our health, happiness and survival as smart primates, but just as the entropy of information depends on the question we ask of a system, the meaning of information also depends on our expectations, hopes and fears.
And on top of that, we are kind of lazy sometimes. Well, all things (composite things, Buddhists might say, deep impermanence) tend toward lower energy (oh, and higher entropy, as energy is released as unused heat and composite things come apart). Both the word “random” and the word “unchanging” take about the same time and effort for us to say, and both things will be boring to us. So in that sense, what we might think of as information when we address it with our usual language is about the same whether it is randomly generated white noise or a single sustained note; since both would be annoying over time and would have very little other import to us, we may think they are about the same. Since a random weather report and one that never changes regardless of the state of the local atmosphere would be equally useless (though both would be occasionally right, like the broken clock twice a day), we think they have a similar lack of information.
But both have a history, both contain information we may not perceive. How is the random noise generated? What is keeping that note going?
That’s how it often is. We project our day-to-day experience on the universe. We decide on what is true and useful based on our brain and body needs: our need for a weather report that tells us whether to wear a coat or plow our fields, our need to be entertained, our need to feel certain ways (loved, special, comfortable).
Information, the universe, is not sentimental or goal directed in the same way we are. Our self-perceived needs, our ego’s delights, are not primary, but rather a subset of the functioning of the universe. Our minds may be totally entangled with the universe of Mind, but the universe need not respect our biases.
The universe of Mind or consciousness, of Zen or Biocentrism, need not be designed in some dualistic fashion by a separate designing entity to live on the cusp of chaos and order. It doesn’t necessarily use information and entropy as we would ask it to, our questions born of our karma and desires, our craving and our fear.
Mind is not defined by some human definition of “intelligence,” consciousness need not be “smart” in human terms (intelligence is a dicey concept at best), though Mind contains and embraces human intelligence.
Equally, information is not inherently goal directed, it is simply question directed. Our egos have goals, our perceived needs, and these determine the questions we ask. That’s our problem, our need for interesting stories that make our lives “better” in some imagined way, that make sense to us in terms we dictate, often based on total delusion, though it is true that information does have value in our goal to be compassionate and live our lives with grace when compared to ignorance.
Of course, that idea of information is important. If we want to live with “no self-deception,” as Maezumi Roshi exhorted us to, and as I think is central to Zen practice as I understand it and as taught by Nyogen Roshi and others, we want good information with minimal static. Ignorance is one of the poisons in Buddhism. This is also why scientists and others worry about intelligent design (religious dogma in scientific garb, a Trojan horse of the religious fanatics) and superstitions that lead to grave errors and great pain. I do not mean to say these things don’t count. This willful ignorance has now reached the level of the functioning of some of the most powerful people in the world, and of many voters around the world, to our great peril. Hence we get climate change deniers, a president that eschews reality, racism run rampant, overpopulation, talk of a renewed arms race just as we got Iran to back down from nukes and hoped Korea some day just might, and you can fill in the rest; there are so many examples of willful ignorance enacted to serve greed and out of fear.
After all, Buddhism does concern itself with pain, suffering and compassion and being awake, an end to ignorance!
I’m just saying the universe isn’t sentimental about it. The earth would be fine without us. A supernova destroys worlds on end but creates many of the atoms we are made of. Information is found in random noise.
None of this excuses us from taking responsibility for knowing what it takes to decrease suffering and wake up.
In the realm of the relative, is entropy a “bad” thing? It is inherent in change; it is the manifestation of form; it is the world of the relative. It is where things happen. Bad vs. good isn’t too helpful a concept in this context. Like the Tao, it has no difficulty, no obstruction; just avoid picking and choosing, as the poem the “Xin Xin Ming” of the third patriarch says.
So take care with what questions you ask, your assumptions about good and bad, symmetry and beauty. Watch how you ask your questions, and what you do with the answers.