I’m starting off this blog with a post about an interesting discussion I’ve been having* on the Uncommon Descent blog about the claim, frequently made by Intelligent Design proponents, that Chance and Necessity cannot generate information; information can only be generated by a mind.
Clearly, to either support or refute this claim, we need clear conceptual definitions of “Chance and Necessity” and “information”.
William Dembski uses Monod’s terms, “Chance and Necessity”, to characterise natural processes, and indeed devised an Explanatory Filter for candidate exemplars of information-bearing patterns, whereby, if Chance and Necessity could be serially eliminated, Design could be inferred as the only remaining explanation. There are various ways of defining Chance and Necessity, but for convenience it may be reasonable to regard “Chance” events as unpredictable events (e.g. quantum events) and “Necessity” as reliable physical or chemical laws. In a deterministic universe, of course, once you have a set of starting conditions, all that follows is Necessity, and the opportunities for a Designer lie in specifying the starting conditions in such a way that the willed outcome is inevitable, and/or giving things a poke with a celestial snooker cue to keep them on the willed track. So in a deterministic universe, the ID question would be easy: were the starting conditions willed or a Chance first throw of the dice, and/or are the workings-out of those starting conditions left to Necessity or tweaked to suit? In a non-deterministic universe, which it seems we have, Chance has a potentially more interesting and active role. So the ID question becomes: can the events we observe be explained solely by a combination of Chance quantum events and Necessary consequences, or can they be better explained by positing an Intelligent Designer who could affect the way things unfold by nudging quantum Chance and/or the otherwise Necessary consequences?
But what does “information” mean, in the context of the ID claim? On Uncommon Descent, I made the counter-claim that I could demonstrate that Chance and Necessity could indeed generate information, for any regular English usage of the word information.
One of the regular posters there, Upright BiPed, took me up on my claim, and my response was to ask him (or any ID proponent) to provide me with a conceptual definition of information for which he believed the ID claim was true. My plan was then to operationalise the definition to our mutual satisfaction, and then to attempt to make good on my claim.
So what are candidate definitions?
Clearly, nobody is making the claim for Shannon entropy, as that would be easily falsified. Dembski’s concept of “specification” is all about narrowing down the set of Shannon entropy-rich patterns to those for which he considers “Design” a reasonable inference, by insisting not merely on a large amount of Shannon information (“event-complexity”) as measured in bits, but also a large degree of compressibility, or “pattern simplicity” (“specification”). When I first made my claim I was anticipating that the definition I’d be getting was something like Dembski’s Complex Specified Information (CSI). Dembski’s claim is that Chance and Necessity cannot generate CSI (or could only do so with such remote probability that the possibility is not worth entertaining).
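To make the “event-complexity” half of that concrete: a minimal sketch of Shannon information in bits, under the simplifying assumption that each symbol of a pattern is drawn uniformly and independently (this is only the bit-counting part, not Dembski’s full specification calculation):

```python
import math

def shannon_bits(pattern, alphabet_size):
    """Shannon information (surprisal) of a pattern in bits,
    assuming each symbol is uniform and independent."""
    return len(pattern) * math.log2(alphabet_size)

# A 100-nucleotide sequence over the 4-letter DNA alphabet
# carries 100 * log2(4) = 200 bits under this model:
print(shannon_bits("ACGT" * 25, 4))  # 200.0
```

On this measure alone, a string of coin flips scores just as highly as a gene of the same length, which is exactly why nobody defends the ID claim for raw Shannon entropy.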
However, Upright BiPed suggested something that in my view is much more interesting, in which “information” is defined not as a property of a pattern, as with CSI, but as a property of a process. One such definition is cited by Stephen Meyer in his book Signature in the Cell, and is one of the Merriam-Webster definitions, namely:
the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects
This makes a lot more sense to me, as I’ve said, and would mean that the ID claim, which I set out to refute, is:
Chance and Necessity cannot create information, where information is arrangements of things that have specific effects.
This definition invokes not only a pattern but some form of transmission protocol – information is not just a pattern but a pattern that has effects. And not just any effects – effects specific to a pattern. In other words there is a mapping between pattern and effect.
However, Upright BiPed also made an additional caveat, which is that to be information, the mapping has to be achieved via some kind of inert arbitrary intermediary (as is done by tRNA in mapping an RNA codon to an amino acid).
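A toy sketch may make the structure of that caveat clearer. The assumptions here are mine: a plain lookup table stands in for the inert intermediary (the role tRNA plays in the cell), and the handful of codon assignments shown are from the standard genetic code, abbreviated. The point is only that the pattern (a codon sequence) maps to its specific effect (a peptide) via an intermediary whose assignments are arbitrary — any other consistent table would do the same job:

```python
# The intermediary: an arbitrary but fixed mapping, playing the role
# tRNA plays in translating codons to amino acids. Entries are from
# the standard genetic code, abbreviated to a few examples.
CODON_TABLE = {
    "AUG": "Met",
    "UUU": "Phe",
    "GGC": "Gly",
    "UAA": None,  # stop codon: no amino acid
}

def translate(mrna):
    """Map a pattern (codon sequence) to its specific effect (a peptide),
    via the intermediary lookup table."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE.get(mrna[i:i + 3])
        if residue is None:  # stop (or unlisted) codon ends translation
            break
        peptide.append(residue)
    return peptide

print(translate("AUGUUUGGCUAA"))  # ['Met', 'Phe', 'Gly']
```

Nothing in the chemistry of the codon itself dictates which amino acid it yields; the mapping lives entirely in the intermediary, which is what makes it “arbitrary” in Upright BiPed’s sense.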
And in addition, I made the caveat that the specific effects should probably be functional in some way – e.g. promote faithful self-replication.
And so the ID claim I aim to refute becomes:
Chance and Necessity cannot generate information, where information consists of arrangements of something that produce specific functional effects by means of inert intermediary patterns.
And this is the claim I am willing to attempt to refute, provided some ID proponent is willing to stand by the claim!
Alternatively, if you would like to supply an ID claim that you are willing to stand by, I’d be delighted to hear it.
Otherwise, welcome to The Skeptical Zone, have a free virtual beer in celebration of its first post, and post any comments, objections, suggestions, and criticisms you may have.
All are welcome. The only rule is: park your priors by the door.
*and I’ll take this opportunity to thank the UD community for the welcome they extended to me, and to extend my invitation to them, here, in return.