When Claude Shannon, confusingly known as ‘the father of information theory’, wrote a paper in 1948 titled ‘A Mathematical Theory of Communication’, he wondered what he should call the formula he had devised for the ‘degree of uncertainty’ of messages. Shannon first thought of calling it information, but he felt that this word was overly used. John von Neumann, the polymath who designed the stored-program computer, had a better idea. Von Neumann suggested:
You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.
What Shannon meant by ‘uncertainty’ in a communications channel is that the more improbable a message is, judged against expectations, the more binary digits (bits) are needed to encode it.
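For reference, Shannon’s measure of the average uncertainty, or entropy, of a source whose messages occur with probabilities p_i can be written as follows (a standard textbook formulation, not quoted in the passage above):

\[ H = -\sum_i p_i \log_2 p_i \quad \text{bits per message} \]

A fair coin toss, for example, yields exactly one bit, whereas a heavily biased coin yields less, because its outcome is less surprising.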
However, as Shannon admitted in an article he wrote for the fourteenth edition of the Encyclopædia Britannica, communications theory is not concerned with the meaning of the information in messages, but solely with signs, codes, and the quantitative measurement of these entities in a mechanistic, stochastic sense.
But this is quite different from the concept of information that information systems architects use in business, for they are much concerned with meaning, recognizing that information is data with meaning. Furthermore, meaning plays a central role in David Bohm’s theory of the Implicate Order, which enables us to use the Cosmic Equation to unify the incompatibilities between quantum and relativity theories.
Yet, the second law of thermodynamics still holds a sacrosanct position in physics, as we see from this Arthur Eddington quote from 1928:
The law that entropy always increases—the second law of thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
Even though these words were written nearly a century ago, the celebrity physicist Brian Cox is still holding on to this nineteenth-century notion of entropy, asserting in the ‘Destiny’ episode of his 2011 BBC documentary series Wonders of the Universe that the laws of physics, often called the laws of nature, are absolute, especially the second law of thermodynamics, which ensures that the arrow of time runs in one direction only: ‘from order to disorder, from low to high entropy’. As he put it, ‘Entropy always increases, because it’s overwhelmingly likely that it will.’ He thus believes in the ‘heat death of the universe’, a one-sided vision of the Universe that had a profoundly negative effect on the optimism of the late nineteenth and early twentieth centuries, as the historian of science Stephen Brush has pointed out.
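Cox’s phrase ‘overwhelmingly likely’ reflects Ludwig Boltzmann’s statistical reading of the second law, in which the entropy of a macroscopic state is proportional to the logarithm of the number W of microscopic arrangements that realize it (a standard textbook formulation, not taken from the programme itself):

\[ S = k_B \ln W \]

Because disordered macrostates correspond to vastly more microstates than ordered ones, an isolated system drifts towards them as a matter of overwhelming probability rather than strict necessity.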
The upshot of all this is that materialistic, mechanistic science cannot explain human creativity or any vitalistic growth process, like evolution, where order manifestly increases and entropy, as a measure of disorder, decreases.
We can bring sanity back to science by algebraically mapping the Cosmic Psyche, through which the entire world of form emerges directly from the Absolute through the creative power of Life.
entropy: 1868, from German Entropie ‘measure of the disorder of a system’, from Greek entropia ‘a turning toward’, from en ‘in’ (from PIE root *en-) and tropḗ ‘a turning, a transformation’ (from PIE root *trep- ‘to turn’).