Change is life, and change is death.

No change is nihilism. No change is no information, no movement.

0’s and 1’s

000000 the nihilism early Buddhists (well, and later ones of course) argued against. Dismal.

111111 stagnation. Just as dismal.

But together, combining, we get information, creativity, Mind at play.

To dance is to move and change. Constantly falling, with grace and awareness, maintaining the center we change costumes and dance some more.

Cosmic dance, body dance.

Me, you, black holes and quasars.

Same thought, same Mind.

Never ending, never beginning; those are just conceits born from our distorted view of our lives.



From Scientific American June 2019:

“All The World’s Data Could Fit In An Egg”

A strand of DNA is about 2 nanometers thick, roughly two billionths of a meter. The DNA is so tightly packed in your cells that the DNA in each cell can be stretched to about 2 meters (about 6 feet). If all the DNA in all of your cells were placed end to end it would be on the order of 100 trillion meters long. That is 100 billion kilometers, over 60 billion miles. Your DNA would stretch hundreds of times the distance from the Earth to the sun, though the strand would be far too thin for you to see.
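Just to check those numbers, here is a back-of-envelope sketch. The cell count is an assumption (published estimates run from roughly 30 to 50 trillion cells); everything else follows from the 2 meters of DNA per cell stated above.

```python
# Back-of-envelope check of the DNA length figures.
# Assumption: ~50 trillion cells (estimates vary widely).
CELLS = 50e12            # rough human cell count
DNA_PER_CELL_M = 2.0     # meters of DNA packed into one cell

total_m = CELLS * DNA_PER_CELL_M       # total length in meters
total_km = total_m / 1000
earth_sun_km = 149.6e6                 # one astronomical unit, in km

print(f"total length: {total_m:.1e} m = {total_km:.1e} km")
print(f"about {total_km / earth_sun_km:.0f} Earth-sun distances")
```

Under those assumptions the strand comes out around 10^14 meters, which is hundreds of round trips' worth of Earth-sun distance, not a fraction of one.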

An ounce of DNA has the storage density of almost thirty million hard drives.

How lucky that life stumbled on that little trick. Useful for evolution.

There isn’t a selfish gene. There is nothing selfish about it. Information, always changing, always morphing into contingent forms.

The buddha turns the dharma wheel and reality is shown in all of its many forms, we chant in a Zen service where I meditate.

Kushan Buddha

Information made manifest.

Mind made manifest.


Information and Entropy: Be Careful What You Ask!


Some say that the universe itself IS information. That way beyond our dependency on information as individuals and as a civilization, information is essential, is fundamental.

Certainly no information is very boring, very static. A basic yes/no 0/1 opens up all of computer language, but no variation of 0’s and 1’s, just a universe of all 1s or all 0s, doesn’t get a computer program very far. No information, no this and that. Certainly information is the realm of the relative.

Some say entropy is at the heart of any first-this-then-that, that entropy defines time itself. I, along with many philosophers, scientists, and other deep thinkers like Lanza and Dogen, don’t think that is true, that time is essentially just entropy. But it is just as true that entropy certainly plays a role in our quotidian experience of time.

While Humpty Dumpty falling off a wall and breaking is an easy and quick thing, it is hard to put Humpty Dumpty back together again. Broken Humpty doesn’t just scrunch back together and fly up that wall, sitting pretty and smiling widely again, after the fall. It is possible for that to happen, but it is really, really, really very unlikely.

If you drop an egg and film it, you can pretty much tell whether the film is being played forward or backward. Very rarely you may be wrong. Simply put, a broken egg is more disordered, and breaking it took little energy to make happen: just open your fingers and gravity does the rest. But an intact egg is very ordered, and took a lot of energy (ask mother chicken) to get there.

Entropy captures that difference qualitatively and quantitatively. It evolved in thermodynamics (for steam engines, trying to understand what all that inefficiency and wasted heat was about) as the statistical likelihood of a state evolving because there are more “disordered” than “ordered” states in a system.
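That “more disordered than ordered states” idea can be sketched with coin flips standing in for molecules (a toy model, not a thermodynamics calculation): a tidy macrostate like “all heads” has exactly one microstate, while a messy one like “half heads” has astronomically many, so left to chance a system wanders toward the messy ones.

```python
import math

# Flip N coins. A "macrostate" is how many heads came up;
# its "microstates" are the particular coins that did.
N = 100
for k in (0, 1, 50):
    W = math.comb(N, k)                 # microstates for this macrostate
    S = math.log(W) if W > 1 else 0.0   # entropy grows with log(microstates)
    print(f"{k:3d} heads: {W:.3e} microstates, entropy ~ {S:.1f}")
```

With 100 coins, “50 heads” has about 10^29 times as many microstates as “all heads,” which is why Humpty doesn’t reassemble.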

As a rule of thumb in physics and chemistry, entropy is really useful: it implies first this, then that! No engineer can ignore entropy and keep a job.

Information theory also uses entropy as a quality and a quantity to be reckoned with. So how are information and entropy connected? Is it in some sense related to heat wasted, energy spent? Yes, computers heat up and energy is spent, and organization, efficiency and predictability are all part of it. Information theory was first interested in the efficient transmission of signals in communications. What was important was whether the message, the information, the data, the signal that was sent was the one received.

To start, you could say high entropy is a measure of little information, great disorder, hence ignorance, at least of the details. You can see how such ignorance may impact a message and the information sent. Getting an intuitive grasp of this is pretty easy, as is seeing why our perception of time is influenced by entropy, order and disorder, at least relative order and disorder, going back to Humpty Dumpty. When the egg breaks, the yolk is all mixed up with other stuff, shells and albumin, sticky and hard to tease apart, hard to say just where all the yolk is. There are, of course, many more ways for the molecules in a smashed egg to be mixed and splattered than in an intact egg where yolk is here, albumin there, all neatly wrapped in a shell. That’s a measure of the higher entropy of Humpty Dumpty having fallen. And it helps predict which chemical reactions will go which way and how much energy will be wasted in heating up the atmosphere by your engine (or your computer, or your brain, for that matter).

Information theory is at the heart of computer and communication science and needs a more quantitative understanding of ignorance and entropy. We need to know the probability of surprise, and so the extent of ignorance of what happens next.

Information theory, a great intellectual and technical insight by Shannon decades ago, says that the amount of surprise in a message relates to entropy. And in this, entropy relates to ignorance, because you are only surprised if you don’t know what’s coming. You don’t want surprises popping up in your message if you are in charge of communications! You want what is received to be the same as what was sent, no issues of garbled text.

All of this also suggests that it matters, to some extent, exactly how you ask the question of the system about how much entropy there is.

I am not surprised if a fair, six-sided die, with a different number on each face running 1 through 6, comes up with a number 1 through 6 when I roll it. So if the question I ask is, will I get a 1, or 2, or 3, or 4, or 5, or 6 on the next roll of the die, the answer is clearly yes, of course, and simply reflects that I asked a trivial question of a simple system I understood. Probability (p) = 1.0, 100% sure. No ignorance, no high entropy, and not very interesting, either. If I ask whether I will get an even number on the next roll, there is a bit more uncertainty: I have a 50:50 chance of an even or odd number. That is less ignorance and lower entropy than if I ask whether I will get a 5 on the next roll, which is only 1 chance in 6. In each case the outcome is random. I am not surprised whatever number I get here either, as any number can come up. But my surprise with this question is different from when I asked whether I would get any number at all, though not very different. Same basic conclusion, just asked a different way.

Now if I ask whether I will get a number other than 5, then on average, 5 of every 6 rolls of the fair, unbiased die, I will get a number other than 5, with a probability of 5/6 on each roll. Good odds, so little (but some) surprise when I get a 1, 2, 3, 4 or 6. Lower entropy, less surprise, and less ignorance. If I don’t get a 5 I am not so surprised; if I do get a 5 I am surprised, and delighted if money was riding on the outcome. Same system each time, a six-sided die with each face showing a different number 1 through 6, but different questions, and so different ignorance/surprise/entropy.
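The surprise in these die questions can actually be put in numbers. A minimal sketch using Shannon’s surprisal, -log2(p), where p is the probability of the outcome you asked about:

```python
import math

def surprisal_bits(p):
    """Shannon surprisal of an event with probability p, in bits."""
    return -math.log2(p)

# Four questions asked of the same fair six-sided die:
questions = {
    "any number 1-6 comes up": 6/6,
    "an even number comes up": 3/6,
    "a number other than 5":   5/6,
    "exactly a 5 comes up":    1/6,
}
for q, p in questions.items():
    print(f"{q}: p = {p:.3f}, surprisal = {surprisal_bits(p):.2f} bits")
```

The certain question carries zero bits of surprise, the 50:50 question one bit, “other than 5” about a quarter bit, and the 5 itself the most, about 2.6 bits. Note too that p(a 5) and p(other than 5) sum to exactly 1; the two questions constrain each other.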

But notice that the 1/6 probability of rolling a 5 is just 1 (the probability that some number 1 through 6 comes up) minus the 5/6 probability of getting a number other than 5. And vice versa! So these are related. We have constraints based on our question; our delusions don’t come into play, and we don’t get to set the odds however we like them. We can, of course, set the system up differently with an unfair die.

An important idea scientifically is that we have to explain any deviation from random in a system. There is random, then everything else on a continuum of probabilities, of more or less ignorance of the outcome before we roll the die, before we send the message.

So, low entropy is good, right? Entropy is disorder, entropy is ignorance, and we don’t want that. While what we want does matter to engineers and programmers, and to us if we are pure of heart and mind, seeking a way out of suffering for ourselves and others, does it define good and bad to the universe, in terms of the unfolding of creation?

Not necessarily.

Just as entropy, as an answer to a question about our ignorance, about randomness versus certainty and everything in between, changes depending on the question you ask of the system, and just as the result of a quantum experiment reveals a particle or a wave depending on how you ask the question, so too the answer to whether entropy is “good” or “bad.” What you are asking, and how the answer comes back, matters. And the universe may not be asking the same question as your monkey brain at any given moment! Maybe that’s our practice: aligning our questions with those of the unfolding universe!

Maybe you create entropy and ignorance with your expectations!

Anyway, I suppose low entropy is good if you don’t like surprises, but it is a kind of boring universe, that universe of all 5s.

We think we want certainty. Perhaps that is an illusion.

Now, say we go back to that fair six-sided die with a different number on each face. Large entropy if I ask whether a specific number will come up next. I have no idea which number from 1 to 6 will come up next, so large ignorance, room for surprise! I get more information each roll than with a die with 5 on every face: I learn that the die was rolled (because a new number comes up 5 of 6 rolls, on average) and I discover the resulting number. If I play a game where a 5 wins, every game is life anew. Not so in the universe of all 5s. No games, no surprises, little information.
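The two dice can be compared directly with Shannon entropy, H = -Σ p·log2(p), a sketch of the “information per roll” idea:

```python
import math

def entropy_bits(probs):
    """Shannon entropy of a distribution, in bits (terms with p = 0 contribute nothing)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_die = [1/6] * 6              # a different number on each face
all_fives = [0, 0, 0, 0, 1, 0]    # every face shows a 5

print(f"fair die: {entropy_bits(fair_die):.3f} bits per roll")
print(f"all 5s:   {entropy_bits(all_fives):.3f} bits per roll")
```

The fair die yields log2(6), about 2.585 bits per roll; the all-5s die yields exactly zero. No games, no surprises, no information.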

As for the symmetry test, the one that says an object is symmetric if you can’t detect a change when the state of the system is changed (say, a circle rotated while you aren’t looking): with the die with a different number on each face, 5 of 6 times I will know you rolled it (on average, only 1 roll in 6 repeats the same number), so again more information is obtained, and there is less symmetry, than with the die of all 5s. Symmetry is associated with beauty and the absolute, the unending. But it is in the breaking of symmetry that we end a certain kind of ignorance, where things happen, where circles become waves and waves become all things.

Is symmetry beautiful and useful? Kind of depends on what we are looking for, what questions we ask; our state of mind, as it were.

Not so straightforward, this ignorance, symmetry and entropy thing. Those who study chaos theory and complexity theory say the best stuff happens at the edge of chaos. Too random, can’t sustain anything even close to life. Too static, no change, just same old, same old.

After all, a single tone has little entropy; it is very organized with no surprises, we know exactly what comes next, but it isn’t the most fun music. Similarly, randomly generated static has endless variation and no organization, but it also can be pretty annoying (though some people thrive on “white noise.” Go figure) and won’t hit the top 40 on the charts.

So entropy can be computed, and ignorance and surprise can be quantified. Computer science and communication science (where this came from) depend on it. But what does that really mean?

We seem to need variation without total randomness in our music and our stories. And what are we without our stories? Maybe liberated? Maybe awake? Maybe it is our state of mind that counts? There is the story about how, when researchers at UCLA studied his samadhi, they were shocked to find that Yasutani Roshi experienced each tick of a metronome as unique; he didn’t adapt to the repetitions, didn’t experience them that way. He seems to have experienced each tick as existing in a universe that is never the same, always changing. No expectations, perhaps. Being truly awake, perhaps.

“What does it really mean?” is a trick question. There is no “really mean.” The problem is that when we think in terms of values as determined by our evolved monkey brains, we are constrained by our perspective and scale of living, the way we like our stories. Something happens and we ask ourselves, gee, what will happen next? What does it imply for me, my sense of well-being? A reasonable question of course, and that’s why we evolved brains to ask and answer it. Does that noise I hear in the bushes mean a lion will jump out and eat me? Does that rhythmic sound coming from my husband encode words that I should interpret to mean that he doesn’t love me anymore? Or is it just atmospheric conditions, sunlight heating the air above earth and sea unevenly, creating wind shaking the leaves that I hear; my husband clearing his throat, expelling bursts of air, nothing more? The meaning of information in that sense is what we project on the universe with our monkey brains. It can be very useful, it can be critical for our health, happiness and survival as smart primates. But again, just as the entropy of information depends on the question we ask of a system, the meaning of information depends on our expectations, hopes and fears.

And on top of that, we are kind of lazy sometimes. Well, all things (composite things, Buddhists might say; deep impermanence) tend toward lower energy (oh, and higher entropy, as energy is released as unused heat and composite things come apart). Both the word “random” and the word “unchanging” take about the same time and effort for us to say, and both things will be boring to us. So in that sense, what we might think of as information when we address it with our usual language is about the same whether it is randomly generated white noise or a single sustained note; since both would be annoying over time and would have very little other import to us, we may think they are about the same. Since a random weather report and one that never changes regardless of the state of the local atmosphere would be equally useless (though both would occasionally be right, like the broken clock twice a day), we think they have a similar lack of information.

But both have a history, both contain information we may not perceive. How is the random noise generated? What is keeping that note going?

That’s how it often is. We project our day-to day experience on the universe. We decide on what is true and useful based on our brain and body needs, our need for a weather report that tells us whether to wear a coat or plow our fields, or our need to be entertained, our need to feel certain ways (loved, special, comfortable).

Information, the universe, is not sentimental or goal directed in the same way we are. Our self-perceived needs, our ego’s delights, are not primary, but rather a subset of the functioning of the universe. Our minds may be totally entangled with the universe of Mind, but the universe need not respect our biases.

The universe of Mind or consciousness, of Zen or Biocentrism, need not be designed in some dualistic fashion by a separate designing entity to live on the cusp of chaos and order. It doesn’t necessarily use information and entropy as we would ask it to, our questions born of our karma and desires, our craving and our fear.

Mind is not defined by some human definition of “intelligence,” consciousness need not be “smart” in human terms (intelligence is a dicey concept at best), though Mind contains and embraces human intelligence.

Equally, information is not inherently goal directed; it is simply question directed. Our egos have goals, our perceived needs, and these determine the questions we ask. That’s our problem: our need for interesting stories that make our lives “better” in some imagined way, that make sense to us in terms we dictate, often based on total delusion. Though it is true that information, compared to ignorance, does have value in our goal to be compassionate and live our lives with grace.

Of course, that idea of information is important. If we want to live with “no self-deception,” as Maezumi Roshi exhorted us to, which I think is central to Zen practice as I understand it and as taught by Nyogen Roshi and others, we want good information with minimal static. Ignorance is one of the poisons in Buddhism. This is also why scientists and others worry about intelligent design (religious dogma in scientific garb, a Trojan horse of the religious fanatics) and superstitions that lead to grave errors and great pain. I do not mean to say these things don’t count. This has now reached the level of functioning of some of the most powerful people in the world, and of many voters around the world, to our great peril. Hence we get climate change deniers, a president who eschews reality, racism run rampant, overpopulation, talk of a renewed arms race just as we got Iran to back down from nukes and hoped North Korea some day just might; you can fill in the rest, there are so many examples of willful ignorance to serve greed and enacted out of fear.

After all, Buddhism does concern itself with pain, suffering and compassion and being awake, an end to ignorance!

I’m just saying the universe isn’t sentimental about it. The earth would be fine without us. A supernova destroys worlds on end but creates many of the atoms we are made of. Information is found in random noise.

None of this excuses us from taking responsibility for knowing what it takes to decrease suffering and wake up.

In the realm of the relative, is entropy a “bad” thing? It is inherent in change; it is the manifestation of form; it is the world of the relative. It is where things happen. Bad vs. good isn’t too helpful a concept in this context. Like the Tao, it has no difficulty, no obstruction; just avoid picking and choosing, as the poem “Xin Xin Ming,” attributed to the Third Patriarch, says.

So take care with what questions you ask, your assumptions about good and bad, symmetry and beauty. Watch how you ask your questions, and what you do with the answers.



Entropy, Ego, What’s the Point?


Rather than launch into a technical description of entropy and the relationship of energy and entropy, let’s try this first.

More entropy means more disorganization and more ignorance. Low signal-to-noise. Less information. Like static preventing the faithful transmission of data. Think of loud static when you are trying to listen to music on your car radio.


If I tell you I mixed up the numbers one through ten and put them in a bag, and then picked out two, say a 3 and a 7, all you know about the next one I will pick is that it is not a 3 or a 7. The numbers are mixed up, disorganized, and we have a bit of ignorance about some aspect of that system. Relatively high entropy. If I throw some letters or blanks into the bag along with the numbers, i.e., static, you are even less able to predict the next thing to come out of the bag!

Now I tell you I ordered the numbers from ten down to one. There are no blanks or letters. I picked out a ten. Next picked will be… nine! Very good. You had little to no ignorance. But I had to put extra energy into ordering the numbers compared to throwing them in the bag. I had to have some way to assure they stayed in order as well. Low entropy, but it took more energy.
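The bag game can be put in numbers too. A sketch, assuming every item remaining in the bag is equally likely to come out next:

```python
import math

def draw_entropy_bits(n_remaining):
    """Ignorance about the next draw, in bits, for equally likely items."""
    return math.log2(n_remaining) if n_remaining > 1 else 0.0

shuffled = draw_entropy_bits(10 - 2)          # ten numbers, a 3 and a 7 already drawn
with_static = draw_entropy_bits(10 - 2 + 6)   # toss in six letters as "static"
ordered = draw_entropy_bits(1)                # numbers kept in order: next one is certain

print(f"shuffled bag: {shuffled:.2f} bits of ignorance per draw")
print(f"plus static:  {with_static:.2f} bits")
print(f"ordered pile: {ordered:.2f} bits")
```

Eight equally likely leftovers is 3 bits of ignorance; adding six static symbols pushes it near 3.8 bits; the ordered pile is exactly 0 bits, but, as above, that order cost energy to create and maintain.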

Meditation can be seen as aiming for high energy, low entropy. But I am not sure that’s quite true for zazen. You’d have to ask a Zen teacher. Certainly “mindfulness” is like that.

A circle is low entropy. You know everything about it and it took energy to create it (minimally mental energy, in addition perhaps energy to move the pencil or program and run the computer).

Symmetry is not ignorance. True, by definition symmetry is present when you can’t tell something has changed, like someone else spinning a circle while your eyes are closed, so that seems like ignorance. But to do that experiment, you need to know that the experiment was planned and then do it! That’s a lot of knowing, organization and energy!

Information is low entropy. It takes energy to put 0’s and 1’s in some order and that is one aspect of what information is. Ordered dualism.

Meaning is how we interpret and experience information. It is our perspective on it. It is contingent to the max. It is easily colored by our wishes and desires, by our egos.

I just read that the Nobel Prize winning physicist Steven Weinberg, who unified the electromagnetic and weak nuclear forces (along with others, of course; anyway, a major physics achievement), wrote: “The more the universe seems comprehensible, the more it also seems pointless.”

That seems very nihilistic and depressing. Perhaps that’s how he meant it. If so, somehow he has dealt with it, because some four decades after writing that he is still writing books!

On the contrary, that seems very Zen to me. And liberating. It relieves us of arbitrary values and goals. The kind the ego sets up to measure ourselves by, so we can achieve them and reassure ourselves. Except when we don’t.

What ultimate, objective, cosmic, universal, non-dualistic “point” could there be? Any point we could articulate would be a human construct, limited and contingent, a dualistic notion of use in only a very small corner of time and space.

Matthieu Ricard writes in his book “Altruism” that the ego is the crystallization of our identity. He writes that we try to protect it. That’s pretty good, but I am not sure that it is quite right. There is no single anatomic brain space that houses the ego. I think the ego is the process by which we protect our identity. The identity is our sense of who we are based on our conditioning (biologic and psychological, contingent on where and when we are). It is how we organize our sense perceptions and react to them. It is our karma, if you will. It is how we try to make the world comprehensible, to find a point. The ego is the process of having and wanting there to be a point. A point is like a location, a beacon, a polar star that the ego can refer to on the horizon to measure itself and its position by so it can better protect us as we cruise through the world of time and space, the world of the six senses.

So as the universe becomes comprehensible, what we comprehend may not be to our ego’s liking. It may not put our bodies (brains included) at the top of the heap. It may remind us that our limited sensory experience is a pretty pale reflection of the vastness of the universe. Of course comprehensible in this context means the forces of nature. The things physics studies. That which can be measured. It does not mean the whole shebang.

To be clear: I am not suggesting a lack of values. I hope you value compassion. I hope you don’t value your suffering and especially not the suffering of others. I am only suggesting not being seduced into thinking that is the “point.”

Or is it? We can choose to embody compassion; we can aspire to the low entropy, high energy state. Is that the “point” of our lives, our minds, the dream, the whole show? Some think so. I admit to liking that view. But maybe that’s the point! It is a goal to like, admirable to be sure, but do I like it because it makes me feel better about myself? Is that my ego protecting me?

No “point”? Perhaps that’s kind of like “ordinary mind is the way.” Or the miracle is chopping wood and carrying water. You don’t need a “point” writ large to the universe to eat when hungry, or to be compassionate. That is the functioning of the universe. What needs to be added? What would be the point?



Information Is The Dreams Stuff Is Made Of


Recently Stephen Hawking announced a new theory about what happens to information at the event horizon of a black hole.

Some scientists took him to task. They said in effect: isn’t it a bit of grandstanding to announce such a thing without showing your work?

I like that. Hold authority’s feet to the fire! That is the scientific way!

The question is: why do scientists care?

It turns out to be a question that is basic to the scientific view of how the universe is put together. Leonard Susskind wrote a great book about it called “The Black Hole War.”

You see, information is conserved.

Like energy, which overall is conserved, the same at the beginning of a process as at the end, though not the specific forms of energy (e.g. chemical energy becomes heat).

And most definitely not like entropy. Entropy is not inherently conserved!

Information that is conserved is not exactly the same as “meaning.” It is the possibility of different states. You know, like 0 or 1 in the binary code that the computer uses.

Or the letters of the alphabet. If you see:

LOL

in an email you think “laugh out loud.” Heck, you can program a robot to recognize it and make “ha, ha, ha” sounds.

Does the robot know mirth? Joy? What it is to laugh?

Do you?

Is that information?

No, the idea that LOL means “laugh out loud” is meaning gleaned from information. It is not inherent in the information. We supply the meaning. Conscious, sentient beings do. If someone finds the letters LOL in a message many years from now, odds are it will not have meaning to them. Maybe not even in very many years; I understand LOL is going out of fashion already. But it will have information.

LOL could have been randomly generated (a complex thing to do) or it could have come from a program that says: insert consonant-vowel-consonant.

Take a circle. Little information is needed to generate the circle:

1. a definition (all points equidistant from one given central point)

2. a variable (the distance).

The circle is symmetric. If you shut your eyes and I spin the circle around the central point, when you open your eyes the circle looks the same. No change. Symmetry.
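That symmetry test can be sketched in a few lines. Here twelve equally spaced points stand in for the circle (an assumption to make it computable; a true circle has infinitely many points and is symmetric under any rotation):

```python
import cmath
import math

N = 12
pts = [cmath.exp(2j * math.pi * k / N) for k in range(N)]    # points on the unit circle
spun = [p * cmath.exp(2j * math.pi / N) for p in pts]        # spin by one step while your eyes are shut

# Every rotated point lands (within float error) on an original point:
symmetric = all(min(abs(s - q) for q in pts) < 1e-9 for s in spun)
print(symmetric)  # True: you can't tell the circle was spun
```

Spin by any angle that is not a multiple of one step and the points no longer line up; the change becomes detectable and the symmetry is broken.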

But the universe has changed. The energy I used to move my muscles, say to spin a cut-out circle, or to tap circle-moving instructions on a computer keyboard to spin a software-generated circle, comes from energy stored in my muscle cells. The cells take glucose and break it down to CO2 and H2O molecules, and use the energy released from the chemical bonds to create high-energy bonds in ATP (adenosine triphosphate) molecules. The muscles then use the ATP molecules for energy, breaking the ATP phosphate bonds (creating ADP, adenosine diphosphate, and passing on the phosphate released from the ATP; don’t worry if this doesn’t mean much to you, the details aren’t critical), thereby changing the energy state in the ATP/ADP/actin/myosin structure, changing the molecular structure of actin/myosin in muscle, and creating movement.

This chemical/mechanical process resulted in more molecules with less energy in their chemical bonds than the original glucose molecule, and released excess energy as heat. Heat is also generated by my fingers moving the circle or pressing the keys (friction, the energy of my fingers interacting with the molecules in the paper as I move the paper circle, or the computer keys as they crash into each other). The changes in the molecules and the cells, and the infrared photons (the heat released) pinging around, create a less organized, higher entropy situation.

So the circle is unchanged; it is symmetric, in the same state after we spun it as before we spun it, but the entropy of the universe has increased. We can re-create the glucose molecules, but that takes CO2, H2O, cellular organization and energy, for example in the complex biological process of photosynthesis. And there will still be the same or more entropy each time we go about making any change.

So even a symmetric situation in the “real world” is not totally symmetric. Even doing the circle spinning as a thought experiment, where you don’t actually move the circle (as you pretty much did when reading this), takes energy! The chemical reactions and electrons moving about in your brain when you think generate heat and entropy.

Which leads us to thermodynamics and Maxwell’s demon.

But I digress; let’s hold off on doing more thermodynamics and Maxwell’s demon in this post. I will say more on that later.

For now, let’s get back to the idea that in the world of change and movement, the world of the senses (themselves of course information processors) information is conserved. Not meaning, just information.

Meaning is contingent. It is not conserved. It is relational, and generates entropy or uses energy to decrease entropy. Either way, energy and entropy are involved in meaning, playing off each other, perhaps. Energy is conserved. Information is conserved. Entropy is not. Meaning is not.

I find that very hard to get my head around. Why should that be? For that matter, why should it be that energy and information are conserved?

Perhaps it is because those conserved elements of reality were never created and can’t be destroyed, no beginning no end, so how can they fundamentally change?

Meaning is dualistic. It is not conserved. It is contingent on context.

Perhaps the universe at its core IS information. Some physicists think so. Every aspect of the universe that is, well, an aspect, is an aspect because it could have been otherwise (not necessarily just any old otherwise, perhaps a specific set of otherwise consistent with the laws of string theory, quantum field theory, whatever). Otherwise it isn’t an aspect.

0 and 1. Yin and Yang. Duality. That is what physics studies, after all. That is the core of our experienced universe of the senses.


Remember: information is not meaning. It is not essence, not noumena. It is phenomena. It is occurrence.

Information is the dreams stuff is made of.

Meaning is determined by sentience.  Consciousness. Is there silicon sentience? If so, that robot will know mirth. And why not? Why should we be carbon chauvinists? Perhaps the very quantum fields can coalesce in many ways to find mirth.

When communication scientists developed the idea of information, it was to quantify the fidelity of communications. Does a phone message get through ungarbled? Not: was the speaker or her message coherent? Do the 0’s and 1’s that make up your email message stay the same, or are some lost in the “tubes” of the internet? It doesn’t matter whether the email says LOL or a randomly generated consonant-vowel-consonant, whether or not it has linguistic or human intellectual or emotional meaning; if at each slice of time and space there was an either/or, a 1/0, there was information.
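That fidelity question can be sketched as a plain bit-by-bit comparison. Nothing in it cares what the message means, only whether what was received matches what was sent:

```python
def hamming(sent, received):
    """Count the bit positions where the received message differs from the sent one."""
    assert len(sent) == len(received)
    return sum(a != b for a, b in zip(sent, received))

def to_bits(text):
    """Encode text as a string of 0's and 1's (8 bits per character)."""
    return "".join(f"{ord(c):08b}" for c in text)

sent = to_bits("LOL")
garbled = sent[:5] + ("1" if sent[5] == "0" else "0") + sent[6:]  # flip one bit in transit

print(hamming(sent, sent))     # 0: the message got through ungarbled
print(hamming(sent, garbled))  # 1: one bit was lost to the "tubes"
```

Whether the payload is LOL, a random consonant-vowel-consonant, or noise, the fidelity check is identical; meaning never enters into it.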

I will elaborate later. But I don’t know that I will get past the following no matter how hard I try:

Everything that happens, according to scientists, does not change the total information in the universe (though you can rob Peter to pay Paul, more here, less there; shuffle it around, which takes energy). Information cannot be irretrievably lost.

This relates to symmetry (lack of change in some element of a system even if something somewhere else changes).

This relates to energy.

This relates to thermodynamics: what is likely to happen, and the role of entropy.

There is a relationship between entropy and ignorance (another post; this is one of the technical definitions of entropy: how well we can know the state of the components of a system). But as implied here, there is also a connection between entropy and a contingent sense of meaning: meaning as we ascribe it to the stuff of daily life, meaning as motivation in our world of the senses, of our karmic experience.

There is an Akashic record. Information in the universe is never totally lost. If you could lose information, modern physics collapses. Ask Dr. Susskind (or read his book!). This Akashic record is not some mystical new age vision of a grey-haired old guy writing in a large parchment book with a quill pen somewhere, or Santa Claus remembering whether you were naughty or nice. In theory you could trace back all of the energy and information transitions and reclaim the original. Sure, it may take time and energy without beginning and without end, so our technology may not be up to it.

Perhaps this “Akashic record” is the manifest mind of the universe. It doesn’t have to track information back, put it back together. Perhaps it is the process, the functioning. It is not dualistic. It isn’t stuck in meaning, in things like “LOL.” Maybe that is our dualistic perspective.

The process of oneness, of unfolding, of compassion, that is the flavor I suspect of this akashic record.

It is kind of fundamental and I find it kind of interesting!