…Gap Highlighter, Design Conjecture
Though I’ve continued to endear myself to the YEC community, I’ve certainly made myself odious in certain ID circles. I’ve often been the lone ID proponent to vociferously protest cumbersome, ill-conceived, ill-advised, confusing, and downright wrong claims by some ID proponents. Some of the material put forward by ID proponents is of such poor quality that it is practically a gift to Charles Darwin. I teach ID to university science students in extracurricular classes, and some of the material floating around in ID internet circles I’d never touch, because it would cause my students to impale themselves intellectually.
I have yet another opportunity to make myself still more odious to the ID community by suggesting that information theory should be dropped, or at least de-emphasized, as a means of making the design argument, especially the use of CSI to implement the Explanatory Filter (EF). I claim the EF is adequately, and more effectively, implemented with CSI-free methods for most of the biological designs currently under debate.
TSZ’s very own Patrick/MathGrrl successfully demonstrated the inability of ID proponents to make basic ID inferences in ways that would stand up to university-level scrutiny. I tried to agree with him without having to offend too many IDists in my essay:
Siding with Mathgrrl.
I probably got away with my oblique defiance of the CSI culture at the time because my alternate approach to rejecting the chance hypothesis actually works in demonstrating that certain chance hypotheses can be rejected! In fact, those who tried to assail my alternative CSI-free approach became the butt of jokes at UD. I now informally introduce my version of the CSI-free Explanatory Filter.
But first I should mention that I use the phrase “Design Conjecture” rather than “Design Inference”. The word “conjecture” captures the notion that an ID claim might never be formally proven or falsified, yet there might be intuitions or beliefs to suppose design could be true. FWIW, at the time of this writing, even in mathematics there are conjectures which many accept as true but which have not been, and may never be, proven: List of Conjectures. [My favorite is Goldbach’s conjecture, probably because it is easy to understand.] The phrase “Design Inference” connotes an assertion that lies beyond the limits of what can actually be formally demonstrated. “Conjecture” better captures the spirit of what ID can formally claim based on what can be formally demonstrated.
The Explanatory Filter can be argued to be a tool to support the design conjecture. And if “design conjecture” seems too ambitious, the EF can be said to be a tool for demonstrating gaps in our knowledge. Rather than saying the EF proves design (as some of my ID comrades argue), I make the more modest claim that the EF demonstrates that the origins of certain systems are not consistent with certain chance and law hypotheses. The modified CSI-free EF does not eliminate all conceivable non-ID hypotheses.
The CSI-free EF does not argue from ignorance, but rather proves via contradiction. For example, if a proposed mechanism predicts outcomes conforming to the binomial distribution, then to the extent a system convincingly contradicts that distribution, the proposed mechanism is falsified.
Perhaps the simplest (albeit somewhat weak) example of a conjectured design in biology is homochirality. Amino acids in life are left-handed homochiral; DNA is right-handed homochiral. Though there are a few successful chiral amplification chemistry experiments for amino acids, they fail to maintain homochirality through a polymerization process of diverse amino acids in a supposed pre-biotic soup. Furthermore, even in the unlikely event that amino acids spontaneously polymerized, and even if the resulting polymers were homochiral, they would not stay that way for long without an active policing mechanism, since thermal agitation at temperatures above freezing will dissipate homochirality in short order relative to geological time.
The homochiral feature of life is statistically analogous to the problem posed by 500 fair coins all heads, and is at variance with binomial expectation. The CSI-free EF can be said to reject the binomial distribution as a model for the homochiral features of life. If the distribution for a set of coins, or of amino acids in a polymer, is assumed binomial, then even allowing for a modest bias, there are conditions under which the chance hypothesis can be operationally rejected. We don’t have to resort to CSI or ambiguous CSI Phi_ST calculations to do this! We can resort to textbook freshman statistical analysis.
On the presumption that a binomial distribution accurately models a chance hypothesis for a system, I outlined one method to reject the chance hypothesis using the law of large numbers: LLN vs Keiths and Eigenstate and my other TSZ critics. This was so easy! And there are other comparably easy approaches to rejecting the chance hypothesis, such as a chi-squared test.
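To illustrate just how freshman-level the statistics are, here is a sketch (my own numbers and helper function, not taken from any of the linked posts) of a Pearson chi-squared test applied to 500 coins observed 100% heads under a fair-coin hypothesis:

```python
import math

def chi2_sf_1df(chi2: float) -> float:
    # Survival function of the chi-squared distribution with 1 degree of
    # freedom: if X = Z^2 for standard normal Z, P(X > x) = erfc(sqrt(x/2)).
    return math.erfc(math.sqrt(chi2 / 2.0))

n = 500
observed = {"heads": 500, "tails": 0}        # the observed configuration
expected = {"heads": n / 2, "tails": n / 2}  # fair-coin (binomial) expectation

# Pearson statistic: sum of (observed - expected)^2 / expected over categories
chi2 = sum((observed[k] - expected[k]) ** 2 / expected[k] for k in observed)
p_value = chi2_sf_1df(chi2)

print(chi2)              # 500.0
print(p_value < 1e-100)  # True: the fair-coin hypothesis is rejected at any
                         # conventional significance level
```

No CSI anywhere in sight: the chance hypothesis falls to an ordinary goodness-of-fit test from a first-semester statistics course.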
I then came to a heretical viewpoint while writing about alternate approaches to the EF. Why use CSI to implement the Explanatory Filter (EF) at all? It seemed CSI was mostly superfluous, an unnecessary added layer of confusion. The superfluousness is demonstrated by this formula:
I = -log2(P)
where P is the probability of an event and I is the measure of information. Restating the probability as a negative log base 2 adds no insight. That sort of logarithmic transformation is clearly useful for communication engineers trying to figure out how to push more bits through a wire, but it certainly never helped me make a more convincing anti-chance argument!
In the case of 500 coins 100% heads, P is 1/2^500 and I is 500 bits. But this tells us nothing about rejecting the chance hypothesis for the 100% heads. Calculating CSI to reject the chance hypothesis required complicated procedures which practically no IDist at UD could follow, but which they swore by nonetheless; certainly few if any were able to execute them to Patrick’s satisfaction.
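That the log transform adds nothing is easy to demonstrate directly: the 500-bit figure and the probability 1/2^500 are the same number in different clothes. A one-off sketch (my own illustration, not a CSI calculation from any ID text):

```python
import math

p_all_heads = 0.5 ** 500        # probability of 500 fair coins all heads
bits = -math.log2(p_all_heads)  # Shannon self-information of that event

print(bits)  # 500.0
# Round-tripping recovers the probability exactly: the transform is
# invertible, so it carries no information the probability lacked.
print(math.isclose(2.0 ** -bits, p_all_heads))  # True
```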
Rather than the CSI method, I resorted to simple statistics. The widely accepted model for the statistics of coins and/or amino acids was the binomial distribution (biased or unbiased). There are 501 possible macrostates for 500 fair coins under the binomial distribution, and 100% heads is the macrostate of lowest possible multiplicity, the configuration farthest from expectation, and one that would be rejected by chi-square tests for randomness. No need for CSI! The tools for making an unassailable inference with respect to the most widely accepted statistical model of fair coins were available and had been used in practice for decades, maybe even centuries.
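The multiplicity comparison is elementary combinatorics. A sketch using Python’s math.comb (my own illustration of the macrostate counts mentioned above):

```python
import math

n = 500

print(n + 1)  # 501 macrostates: k = 0, 1, ..., 500 heads

# Multiplicity of a macrostate with k heads = number of microstates = C(n, k)
all_heads  = math.comb(n, n)       # the 100% heads macrostate: exactly 1 microstate
half_heads = math.comb(n, n // 2)  # the modal macrostate at the binomial expectation

print(all_heads)               # 1
print(half_heads > 10 ** 100)  # True: on the order of 1e149 microstates
```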
Why not just use basic arguments? For systems as complicated as the algorithmically controlled metabolisms found in life, maybe we just have to be a little creative and thus find much easier ways to implement the EF without doing those blasted Phi_ST calculations!
Can the CSI-free EF be applied to other features of life? I think so, and here is an example in outline (not elaborated) form. By convention we view a conceptual divide between hardware and software. The same software can be run on computers made of different materials: silicon, hafnium, or DNA-RNA-PROTEINS. Hardware cannot, as a matter of principle, determine the essential details of the software, but must instead allow the necessary degrees of configurational freedom. We sometimes measure those degrees of freedom in bits, but let’s not go there for now. 🙂
One particular set of algorithms of interest to ID is Quines. In addition to Quines, another set of algorithms of interest would be those that drive von Neumann Universal Constructors. A few scientists view life as implementing such wonderful devices as Quines and von Neumann Constructors.
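A Quine is a program whose output is exactly its own source code, the software analogue of a self-replicator. A minimal Python sketch (one standard construction among many, offered purely as an illustration, with no biological claim attached):

```python
# A quine: running this snippet prints its own three lines of source verbatim.
s = '# A quine: running this snippet prints its own three lines of source verbatim.\ns = %r\nprint(s %% s)'
print(s % s)
```

Running the three lines prints those same three lines: the program contains a description of itself and the machinery to unfold that description, which is the structure von Neumann identified in self-reproducing automata.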
If life indeed implements these devices, then as a matter of principle life’s software cannot be reduced to law, any more than the essential features of software can be explained in terms of hardware. Furthermore, the chance hypothesis cannot explain such software constructs even in principle. Additionally, as far as OOL is concerned, Darwinian selection (even assuming it works, which it really doesn’t) is inapplicable to OOL, or should I say, to the Origin of Quines and the Origin of von Neumann Constructors.
It is no surprise, then, that individuals like Don Johnson, who holds two PhDs (one in chemistry, the other in computer science) and who researched recombinant DNA, accept ID. And it is no surprise there are a few closet ID sympathizers in the engineering community. They understand the Origin of Life problem is not one of chemistry but of software, and not just any kind of software: software associated with Quines and von Neumann Constructors.
Many years ago I had a discussion with Tom English at ARN over how improbable a Turing complete system might be as a matter of principle. I recall the numbers would be astronomical for Hofstadter’s example of a Turing system for DNA as stated in Gödel, Escher, Bach, but I could not get a generalized answer from anyone thereafter. I never had a chance to finish the discussion….
What I eventually realized, however, was that it’s not a matter of how simple an algorithm can be made, but that extravagant algorithms of extreme Rube Goldberg complexity exist in life that defy expectation from ordinary processes. Sure, replicators can spontaneously form, but when a replicator forms that is astronomically far from expectation, thoughts of a Designer, a Creator God, begin to come into the minds of some. We don’t have to use theologically loaded words like “miracle” or “supernatural”; we can use euphemisms like “astronomically far from scientific expectation.” 🙂
So what is the CSI-free Explanatory Filter? It is a filter that does not use CSI but rather uses basic science, logic, mathematics, and cybernetics to reject, or make less believable, known or claimed law and chance mechanisms as explanations for certain features of the universe.
[Title shortened by Lizzie]