12 February 2006

Not So Intelligent Design

"Why do the heathen rage so furiously together, and why do the people imagine a vain thing?" Rather than read that line in context in the Bible, try listening to Handel's version of it in his oratorio, The Messiah." Now we are ready to talk about the controversy over Intelligent Design (ID) as an alternate theory to Evolution.

My grad school buddy Adrian Melott once wrote that "Intelligent Design is Creationism in a Cheap Tuxedo," which I fear is true of most of its advocates. Now, don't get me wrong. I would love to see a real theory of Intelligent Design - a mathematical/logical framework that would enable me to methodically estimate the probability that a given object is natural or artificial, that is, made by artifice, according to some design. A way to tell whether a stone is just a stone, or the earliest handmade tool. Today we use various arguments to convince ourselves one way or the other, but we have no formal theory.

One of ID's brightest advocates, William Dembski, seems to start toward such a theory, in his online article, Intelligent Design as a Theory of Information. In it, he sketches the concept of Complex Specified Information (CSI) - which involves Actualization, Exclusion, and Specification. Actualization means that a possibility has been realized. Exclusion means that possibility excludes many other possibilities. Specification means that the possibility matches a pattern that is given independently of and prior to the possibility itself.

Since exclusion is measured in terms of probability, i.e., the likelihood that the possibility could have occurred by chance, Dembski adopts the standard convention in information theory that the amount of information in an event is proportional to the negative of the logarithm (to the base 2) of the probability of the event's happening by chance. The base 2 is a nod to the idea that information can be reckoned in terms of binary digits or bits. But the logarithm has a significance in physics that Dembski ignores in his short article - the formula relating the information in a system to the probability of finding the system in its actual configuration is, up to a sign change, the formula for the Entropy of the system. More about this in a moment.
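To make that convention concrete, here is a quick sketch in Python (the probabilities are just made-up illustrative numbers, not anything taken from Dembski's article):

    import math

    def information_bits(p):
        """Shannon self-information, in bits, of an event with probability p."""
        return -math.log2(p)

    # A fair coin toss carries 1 bit; a one-in-a-million event carries about 20 bits.
    print(information_bits(0.5))   # 1.0
    print(information_bits(1e-6))  # ~19.93

The rarer the event, the more bits it takes to single it out - which is all the convention says.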

Dembski goes on to say:
Natural causes are therefore incapable of generating CSI. This broad conclusion I call the Law of Conservation of Information, or LCI for short. LCI has profound implications for science. Among its corollaries are the following: (1) The CSI in a closed system of natural causes remains constant or decreases. (2) CSI cannot be generated spontaneously, originate endogenously, or organize itself (as these terms are used in origins-of-life research). (3) The CSI in a closed system of natural causes either has been in the system eternally or was at some point added exogenously (implying that the system though now closed was not always closed). (4) In particular, any closed system of natural causes that is also of finite duration received whatever CSI it contains before it became a closed system.


Not so fast, Dr. Dembski. There is a related law in physics, the Second Law of Thermodynamics, which says that the Entropy of a closed system never decreases, and which the Universe as a whole seems to obey. But there are localized regions in the Universe in which Entropy can decrease for a time due to a large input of Energy. One of these is the surface of the earth, on which the biosphere decreases its Entropy by means of the input of a huge amount of sunlight. Now the Entropy of a system is just an expression of the disorder in the system, measured (up to a constant) by the logarithm of the probability that the system would be found in its actualized state by chance. In other words, Entropy is the negative of Information. And, since Entropy is physical, so is Information, and any theory of information must be constrained by what we already know about physics. (Actually, it goes much deeper than this: the scientifically inclined who have Adobe Reader can read Quantum Information is Physical by Di Vincenzo, et al.)
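Here is a rough back-of-the-envelope version of that sunlight argument, using standard textbook temperatures and an arbitrary kilojoule of energy purely for illustration:

    # Rough entropy bookkeeping for energy flowing through the biosphere.
    # Illustrative values: 1 kJ of energy, sunlight at ~5800 K, Earth re-radiating at ~288 K.
    Q = 1000.0        # joules absorbed from sunlight and later re-radiated
    T_sun = 5800.0    # kelvin, effective temperature of the incoming sunlight
    T_earth = 288.0   # kelvin, temperature at which Earth radiates the energy back to space

    entropy_in = Q / T_sun      # entropy delivered with the sunlight (J/K)
    entropy_out = Q / T_earth   # entropy carried off by Earth's thermal radiation (J/K)

    print(entropy_in)    # ~0.17 J/K
    print(entropy_out)   # ~3.47 J/K
    # The outflow dwarfs the inflow, so the total entropy of sun + earth + space still
    # rises even while some local pocket (the biosphere) becomes more ordered.

The biosphere can pay for its local order because it exports far more entropy than it imports.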

Since we know that Entropy can decrease locally for a time, we know that Information can increase locally for a time. That is to say, there is nothing in the article cited above that necessitates a Designer for CSI to originate and grow on earth, at least for a while, say as long as the sun keeps shining, and the globe remains habitable. To put it simplistically, if the earth is getting more intelligent, it is because the sun is getting dumber.

In fact, this may be true of the entire Universe, which seems to have begun in an extremely low-Entropy, high-symmetry (high Information) state. The generation of what appears to us to be CSI in the Universe is merely the creation of temporary, relatively low-entropy regions of the Universe, such as earth's biosphere, such as ourselves. Or more accurately, what appears to be Complex Specified Information in our DNA is actually not specified, at least not in advance by a Designer.

To see this, consider that once you have a self-replicating molecule (DNA, RNA, or some prior molecule or molecules), you have a computer. By replicating or failing to replicate under different circumstances, it processes information about what allows it to replicate and what doesn't. If variations can occur in the molecule, it can essentially learn to replicate better in its environment, and it can learn to adapt to changes in its environment. Since learning involves structural change in the molecule, it keeps a kind of record of its past configurations. Now, sometimes parts of the record might get lost, and extraneous stuff might get in, but generally speaking the molecule (or more precisely, the current generation of the molecule) constitutes a record of its path of changes through time.
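A toy simulation makes the point, though it is only a cartoon of the idea - the bit-string "genome", the fitness rule, and the mutation rate are all invented for illustration:

    import random

    random.seed(1)
    TARGET = [1, 0, 1, 1, 0, 0, 1, 0]   # stands in for "whatever the environment rewards"

    def fitness(genome):
        """How well this replicator matches its environment."""
        return sum(1 for g, t in zip(genome, TARGET) if g == t)

    def replicate(genome, mutation_rate=0.05):
        """Copy the genome, occasionally flipping a bit."""
        return [1 - g if random.random() < mutation_rate else g for g in genome]

    # Start from random genomes and let differential replication do the rest.
    population = [[random.randint(0, 1) for _ in range(8)] for _ in range(20)]
    for generation in range(50):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]                          # selection
        population = [replicate(g) for g in survivors for _ in range(2)]

    print(max(fitness(g) for g in population))   # typically 8: the population now
                                                 # encodes information about TARGET

Nothing in the loop is ever told the target; the match accumulates simply because copies that happen to fit replicate more often.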

Because the molecule is subject to the laws of physics, its stereochemistry (its 3D structure and how it interacts with other molecules because of its 3D structure and their 3D structures) is constrained. That is to say, not all variations are possible, and in particular, once it has gone down one path of changes, many other paths become less accessible, because its structure now no longer permits those variations.

The foregoing two paragraphs imply that this process of changes cannot be strictly random, because the laws of physics and chemistry make it impossible for the variations in the molecule's structure to be completely random. Moreover, because the past changes in structure limit the possible (and the likely) future variations in the molecule's structure, the temporal process of successive variations is not even Markov. (In a Markov process, the current state of a system embodies all the relevant information about it: given the present state, the system's future is independent of its states at earlier times.)
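To illustrate the distinction (with invented transition rules, not real chemistry), compare a step that looks only at the current state with one whose options are pruned by the whole path taken so far:

    import random

    def moves_from(state):
        # Invented rule, purely for illustration: "structures" reachable from this one.
        return [state - 1, state + 1, state + 2]

    # Markov: the next step depends only on the current state.
    def markov_step(state):
        return random.choice(moves_from(state))

    # Non-Markov: options already used are closed off, the way earlier structural
    # changes in a molecule make later variations inaccessible.
    def constrained_step(state, history):
        allowed = [m for m in moves_from(state) if m not in history]
        return random.choice(allowed) if allowed else state

    state, history = 0, []
    for _ in range(10):
        state = constrained_step(state, history)
        history.append(state)
    print(history)   # the options available at each step depended on the whole path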

Now this sequence of variations of molecular structure is the evolution of the molecule, which, if the molecule is DNA, means that the evolution (variation of form through time) of life is neither random nor Markov. In other words, strict Darwinism is false - but what did you expect? Darwin knew nothing of DNA. On the other hand, evolution is not a theory but a process, one that has been observed under controlled conditions, such as dog-breeding. What is theoretical about it is the details of how it happens.

So far, we see that Evolution is not equivalent to Darwin's idea of natural selection of the fittest random variations of living forms. Rather, it is equivalent to natural and/or artificial (in the case of breeding) selection of non-random and non-Markovian variations of living forms (and the biomolecular computers which generate and encode them). Essentially the biomolecular computers are mechanisms for turning inputs of energy into temporary and local decreases of entropy (increases of information). In other words, it is not necessarily the case that the information in our human DNA is specified by a Designer. Physics and chemistry may have brought about the first very simple biomolecular computers with some low but not infinitesimal probability, and then the biomolecular computers and their environment did the rest. There may be no CSI in humans as biological entities. It would not be the first time that humanity has had to dethrone itself.

But what about that low-entropy, high-information, privileged state in which the Universe appears to have begun? Who specified that? Well, the state was a quantum vacuum state. Such states are characterized by so-called vacuum oscillations, in which virtual particle-antiparticle pairs spontaneously pop out of the vacuum and recombine into their original nothingness within the time allowed by the Time-Energy Uncertainty Principle. In layman's terms, this means that the original quantum vacuum state contained the virtual possibility of everything and its opposite. What changed this state was the spontaneous breaking of certain symmetries as the Universe expanded and cooled. The broken symmetries guaranteed that not all of the opposites would arise to cancel out all of everything. The universe began to differentiate into what we see today.

This rather dramatic over-simplification of current cosmological theory is, I hope, still reasonably faithful to the original. If so, it tends to support the Hindu/Buddhist idea of the "Ten Thousand Things" falling out of the original perfect unity of All in All. On the other hand, the emergence of the Universe itself, and the cosmic background radiation from the Big Bang, tend to add support to the Judeo-Christian "Let there be Light." Or maybe not. The point is that I still see no logical/physical necessity for an Intelligent Designer. As a Christian, I happen to believe there is One, and that I experience His Presence, but here I stand to say that so far, I don't think Dembski has proved His/Her/Its existence.

Of course, the concept of Complex Specified Information is not all there is to Intelligent Design theory. But then, I'm not done with ID yet. It's just that this article is long enough for tonight.
