Philosophy and Complexity of Rube Goldberg Machines

Michael Behe is best known for coining the phrase Irreducible Complexity, but I think his likening of biological systems to Rube Goldberg machines is a better way to frame the problem of evolving the black boxes and the other extravagances of the biological world.

But even before going to the question of ID in biology, I’d like to explore philosophical and complexity questions of man-made Rube Goldberg machines. I will, however, occasionally attempt to show the relevance of the questions I raise regarding man-made Rube Goldberg machines to God-made Rube Goldberg machines in biology.

Analysis of man-made Rube Goldberg machines raises philosophical questions as to what may or may not constitute good or bad design and also how we make statements about the complexity of man-made systems. Consider this Rube Goldberg machine, one of my favourites:

Is the above Rube Goldberg machine a good or bad man-made design? How do we judge what is good? Does our philosophical valuation of the goodness or badness of a Rube Goldberg machine have much to say about the exceptional physical properties of the system relative to a random pile of parts?

Does it make sense to value the goodness or badness of the Rube Goldberg machine’s structure based on the “needs” and survivability of the Rube Goldberg machine? Does the question even make sense?

If living systems are God-made Rube Goldberg machines, then it would seem to be an inappropriate argument against the design of the system to say “its poor design for survivability, its fragility and almost self-destructive properties imply there is no designer of the system.”

The believability of biological design is subjective to some extent, inasmuch as some would insist that in order to believe design, they must see the designer in action. I respect that, but for some of us, when a system is far from physical expectation, design is quite believable.

But since we cannot agree on the question of ID in biology, can we find any agreement about the level of specificity and complexity in man-made Rube Goldberg machines? I would hope so.

What can be said in certain circumstances, in terms of physics and mathematics, as far as man-made systems, is that certain systems are far from what would be expected of ordinary non-specific processes like random placement of parts. That is, the placement of parts to effect a given activity or structure is highly specific in such circumstances — it evidences high specificity.

My two favourite illustrations of high specificity situations are:

1. a domino standing on its edge on a table.

2. cards connected together to form a house of cards. The orientation and positioning of the cards is highly specific.

We have high specificity in certain engineering realms where the required tolerances of the parts are extremely narrow. In biology, there are high specificity parts (i.e. you can’t use a hemoglobin protein when an insulin protein is required to effect a chemical transaction). I think the specificity of individual interacting parts can occasionally be estimated, but one has to be blessed enough to be dealing with a system that is tractable.
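
To make this concrete, here is a minimal sketch of how the specificity of a single part might be estimated in a tractable toy case. This is purely my own illustration rather than anything from the literature: a domino is dropped with a random tilt, and we ask how rarely it ends up within a narrow tolerance of standing upright. The uniform-tilt model and the tolerance value are both assumptions made up for the example.

```python
import math
import random

def standing_domino_specificity(trials=1_000_000, tolerance_deg=2.0):
    """Toy Monte Carlo estimate: how often does a randomly tilted domino
    end up within `tolerance_deg` of perfectly upright?  The tilt is drawn
    uniformly between 0 and 90 degrees from vertical; both the model and
    the tolerance are illustrative assumptions, not physics."""
    hits = sum(random.uniform(0.0, 90.0) <= tolerance_deg
               for _ in range(trials))
    p = hits / trials
    bits = -math.log2(p) if p > 0 else float("inf")
    return p, bits

p, bits = standing_domino_specificity()
print(f"estimated probability ~{p:.4f}, i.e. ~{bits:.1f} bits of specificity")
```

If per-part estimates like this could be made credibly, and the placements were roughly independent, the bits would simply add across parts; that is also why intractable systems defeat the whole exercise.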

In addition to specificity of parts we have the issue of the complexity of the system made of such high specificity parts. I don’t think there is any general procedure, and in many cases it may not be possible to make a credible estimate of complexity.

A very superficial first-pass estimate of complexity would be simply tallying the number of parts that have the possibility of being or not being in the system. This is akin to the way the complexity of some software systems is estimated by counting the number of conditional decisions (if statements, while statements, for statements, case statements, etc.), as in the sketch below.
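
For the software analogy, a minimal sketch of that kind of first-pass tally, using Python's standard ast module, might look like the following (my own illustration; established metrics such as cyclomatic complexity are more involved than this):

```python
import ast

# Node types counted as "conditional decisions" for this rough tally.
# match/case (ast.Match, Python 3.10+) could be added to the tuple.
DECISION_NODES = (ast.If, ast.IfExp, ast.While, ast.For)

def count_decisions(source: str) -> int:
    """First-pass complexity estimate: tally decision-bearing nodes
    in a piece of Python source code."""
    tree = ast.parse(source)
    return sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

example = '''
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        while i > 10:
            i -= 1
    return "done"
'''
print(count_decisions(example))  # prints 3: one if, one for, one while
```

The same spirit applies to a Rube Goldberg machine: tally the parts whose presence or absence changes what the contraption does.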

In light of these considerations we might possibly then make statements about our estimate for how exceptional a system is in purely mathematical and physical and/or chemical terms — that is providing the system is tractable.

I must add, if one is able to make credible estimates of specificity and complexity, why would one need to do CSI calculations at all? CSI doesn’t deal with the most important issues anyway! CSI just makes an incomprehensible mess of trying to analyze the system. CSI is superfluous, unnecessary, and confusing. This confusion has led some to relate the CSI of a cake to the CSI of a recipe, like Joe G over at “Intelligent Reasoning”.

Finally, I’m not asserting there are necessarily right or wrong answers to the questions I raised. The questions I raise are intended to highlight something of the subjectivity of how we value good or bad in design as well as how we estimate specificity and complexity.

If people come to the table with differing measures of what constitutes good, bad, specified, complex and improbable, they will not agree about man-made designs, much less about God-made designs.

I’ve agreed with many of the TSZ regulars about dumping the idea of CSI. My position has ruffled many of my ID associates since I so enthusiastically agreed with Lizzie, Patrick (Mathgrrl), and probably others here. My negative view of CSI (among my other heresies) probably contributed to my expulsion from Arrington’s echo chamber.

On the other hand, with purely man-made designs, particularly Rube Goldberg machines, I think there is a legitimate place for questions about the specificity of system parts and the overall complexity of engineered systems. Whether such metrics are applicable to God-made designs in biology is a separate question.

230 thoughts on “Philosophy and Complexity of Rube Goldberg Machines”

  1. As I understand it, in many cases knockout experiments are a poor indicator of functionality because of the redundancy built in to genetic networks.

    Exactly. One lecturer at the NIH said in passing: knock out one microRNA of the 9 that bind to a gene product (mRNA), no problem. Knock out all 9 simultaneously — death.

    Assumption of 80%-100% functionality is a reasonable working hypothesis. I see no damage from accepting it provisionally.

    FWIW, even as a creationist, I once believed the 2% functionality claim. I changed my mind after reviewing NIH ENCODE and Roadmap papers; I think the 80% figure is a good provisional working hypothesis. So what if that number is wrong and the real figure is 15%? I don’t see the harm in making the assumption for now.

    I do, however, see some damage being done if we insist that 10% functionality is God’s truth when in fact the genome could be 80% functional.

    By function, I mean some sort of causal connection, as illustrated by the HOTAIR Rube Goldbergesque design that differentiates the kind of skin at the sole of the foot vs. the eye. It’s no wonder Rinn’s discovery of HOTAIR stunned the biological community. HOTAIR was mentioned several times at ENCODE 2015 as if it were symbolic of something; now I’m seeing why.

  2. stcordova: even as a creationist, I once believed the 2% functionality claim.

    Sal, I have done a bit of googling, and can’t find an academic source for the two percent claim. The lowest number that comes up from any authoritative sites is 8.2 percent.

    When ENCODE results came out, there were informal discussions of 5-10 percent. I can’t find anything from the last 40 years that is lower than that.

  3. stcordova: I take it bothers you that ENCODE proponents are inspiring the idea that the human genome is more than 10% functional.

    It bothers me that scientists are making declarations not based on facts. This is about truth and scientific rigour.

    stcordova: If people go around believing it is 50%, 60%… 99.5% functional, that would really bother you.

    If having false beliefs had zero real-world consequences, then no, it wouldn’t bother me. I do not have any sort of personal investment in what fraction of the genome is functional; I couldn’t care less whether it’s 10, 20, 50 or 99%. I care about the process of science, I care about what is actually true, and I care about the relationship between science and the media. That relationship is actually in many ways quite unhealthy, because scientists feel pressure to make grandiose statements and declarations to advertise the science they do, so they can continue to get funding, job security, attention for the institutions they work for, and so on. It isn’t supposed to be like that, because it affects the quality of the work they do; it introduces biases and eventually leads to bad science when mistaken interpretations of previous work are taken as the basis for further research.

    stcordova: Let’s assume for the sake of argument the functionality numbers aren’t that high in reality, say 15% functionality instead of 2% or 10%

    I have never ever heard anyone say they think the proportion of the genome that is functional is as low as 2%. The lowest I have seen is 8%. Why would that matter to me on a personal level? It doesn’t. It matters to me because the FACTS matter to me. By using a 2% number one might get the provably mistaken impression that scientists once thought all non-coding DNA was junk. But this view has never ever been held by anyone who bothered to educate themselves on the case for junk-DNA. The 2% number roughly corresponds to the fraction of the genome that codes for proteins (it’s actually more like 3%, but anyway…). That doesn’t mean scientists ever thought at some point in time that the genome only contained protein-coding sequence and nothing else. Such a view has never been held by anyone.
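
    To see roughly where that 2-3% figure comes from, here is a quick back-of-envelope calculation (every input below is a round approximation, offered only for illustration):

    ```python
    # Rough back-of-envelope for the protein-coding fraction of the
    # human genome; all numbers are round approximations.
    genome_size_bp = 3.1e9          # ~3.1 billion base pairs (haploid)
    protein_coding_genes = 20_000   # commonly cited approximate count
    avg_coding_seq_bp = 1_500       # very rough average coding length per gene

    coding_bp = protein_coding_genes * avg_coding_seq_bp
    fraction = coding_bp / genome_size_bp
    print(f"~{coding_bp / 1e6:.0f} Mb coding, ~{fraction:.1%} of the genome")
    # prints roughly "~30 Mb coding, ~1.0% of the genome"; counting UTRs and
    # other gene-associated sequence nudges the figure toward the 2-3% quoted.
    ```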

    stcordova: would a false belief of 100% functionality be such an unacceptable cultural change?

    If it was entirely without medical, social and political consequences, I couldn’t give any less of a shit. But people’s beliefs inform their actions, and when people believe in false things their actions are more likely to have bad consequences.

    stcordova: How badly would such a belief, that the human genome is packed with 99.5% Rube Goldbergesques designs — how badly would such an idea affect the progress of science?

    I can’t predict the future, so here I can only offer you speculations. But if scientists were to continue to believe, in spite of large amounts of evidence to the contrary, that the entire genome was functional, then I can totally see how very large dollar sums and an awful lot of time could be wasted chasing ghosts in the genome. That would have important real-world consequences, if not directly then indirectly, because a lot of money and time could have been better spent on something fruitful and TRUE.

    stcordova: I don’t see why it should; a creationist like me got so excited about the idea that I wanted to learn more about science, not less. I wanted to teach more about the details of how these machines connect (as I have done in this discussion), not less.

    Surely, whether the genome is entirely functional or only slightly functional, finding out which parts do what in particular is what is important. But you don’t find that out by continuing to believe things your evidence doesn’t support. Nobody is preventing ENCODE researchers from doing their work; they’re being asked to report the results of their work correctly and responsibly, instead of sensationalistically and grandiosely.

    stcordova: Is the real problem the fact that such a change in a cultural mindset might make people more inclined to worship an invisible God because of the great genius and humor and unnecessary wasteful extravagance they perceive in biological systems? It’s not really because it will hinder the progress of medical discovery is it?

    I don’t believe people believe in god due to what percentage of the genome is thought to be functional. I suspect even most atheists, and even many biologists who are atheists, believe the whole genome to be functional.

    I don’t believe that if most religious people were to come to understand that most of the genome is junk, they’d suddenly deconvert. Simply put, real-world facts have very little to do with why people believe in god. Most people believe in god because they were raised to believe in god and because those beliefs are held and reinforced by their social groups. Whether the genome is mostly junk or not, how old the universe is, or how many years it took for whales to evolve is all beside the point, and when people become informed of such sterile empirical facts, they simply include and then rationalize them into their beliefs in some way.

    stcordova: I take time to find and discuss the details of biological systems and rather than you guys marveling at the intricacy of these contraptions

    I spend all day every day marveling at them. I just don’t let my sense of awe colour my understanding of how these systems came to exist.

    By all means, marvel at it. It is marvelous after all. Life is no less wonderful in its details whether the genome is mostly junk or mostly functional. If I want to marvel at a fully functional genome I can marvel at E. coli. Or, IIRC, the human mitochondrial genome.

    stcordova: I’d think you guys would have enjoyed actually discussing science and facts, just like the HOTAIR non-coding DNA and its ability to connect to a writer and an eraser, almost like a pencil.

    What’s there to discuss? Nobody is disputing that this system exists or how it works. What the discussion is about is what fraction of the genome is functional (at least, for me that is what I’m here to discuss, not what label, “Rube Goldbergesque machines” or whatever, you want to stick onto complex, many-part interacting molecular entities).

    stcordova: What’s there to fear if ENCODE changes the culture?

    I think I have already explained that above. In the simplest terms, people’s beliefs inform their actions. If people believe in falsehoods they are at greater risk of harming themselves and others.

  4. stcordova: FWIW, even as a creationist, I once believed the 2% functionality claim.

    That’s strange because I’ve never run into a claim of only 2% functionality anywhere.

    Quite possibly you’ve been misled by (probably well-meaning but misinformed) popular press articles that also made the mistake of thinking that any DNA that isn’t protein-coding was once thought to be junk.

    But again, for the 4th time now: scientists never thought that the fraction of the genome that codes for proteins was the only functional part. Again, it was the other way around. They started out believing the genome was ENTIRELY functional, based on arguments from natural selection + time.
    The idea was that carrying unneeded stuff around carried a cost in energy, both when new DNA had to be synthesized during cell division and when it had to be maintained during the lifetime of the cell, so excess unneeded DNA would eventually be weeded out, because deletions of it would reduce the organism’s energy requirement. Therefore, given the fact of geological time, all DNA would have to be functional, because excess stuff would have been weeded out long ago.
    Or so the rationalization went. Until they started actually sequencing the stuff, doing the genetic load calculations, and discovering transposons, retrotransposons, pseudogenes, retroviruses, interspecies genome size variations and so on. Gradually they had to contend with and explain these facts. A simpler hypothesis that explains more facts is preferable to a more elaborate, detailed and (heh) Rube Goldbergesque one, bloated with details and caveats, that has to be retrofitted, readjusted or even discarded in favor of a different one over and over again as new species and mechanisms are discovered.

  5. I wrote a rather large post that was “awaiting moderation” but now it seems to be entirely gone?

  6. Alan Fox:Rumraket,
    Apologies, Rumraket. Your comment is restored.

    I assumed you had flagged your comment as a duplicate. My mistake.

    Oh? I must have clicked on it by mistake, so the error is probably mine. Thx for restoring it.

  7. stcordova,
    J Biol Chem. 2015 Apr 24;290(17):11093-107. doi: 10.1074/jbc.M115.648394. Epub 2015 Mar 15.
    Selective Distal Enhancer Control of the Mmp13 Gene Identified through Clustered Regularly Interspaced Short Palindromic Repeat (CRISPR) Genomic Deletions.
    Meyer MB, Benkusky NA, Pike JW.

    This paper is from Pike’s group, the presenter whose youtube talk on vitamin D you provided. The experiment described in the abstract (sorry, paywall) shows some interesting knockout experiments. In addition to the promoter region, there are two other protein-binding regions, 10kb and 30kb away from the promoter, that when knocked out down-regulate or abolish transcription. Thoughts?

  8. stcordova,
    J Biol Chem. 2015 Apr 24;290(17):11093-107. doi: 10.1074/jbc.M115.648394. Epub 2015 Mar 15.
    Selective Distal Enhancer Control of the Mmp13 Gene Identified through Clustered Regularly Interspaced Short Palindromic Repeat (CRISPR) Genomic Deletions.
    Meyer MB, Benkusky NA, Pike JW.

    This paper is from Pike’s group, the presenter whose youtube talk on vitamin D you provided. The experiment described in the abstract (sorry, paywall) shows some interesting knockout experiments. In addition to the promoter region, there are two other protein-binding regions, 10kb and 30kb away from the promoter, that when knocked out down-regulate or abolish transcription. Thoughts?

    Hi,

    Thanks for pointing out the paper. I was able to get behind the paywall because I have institutional access.

    It was an agonizing read: 15 pages that went into the experimental details of how Pike’s group was able to reach its conclusions.

    His ENCODE 2015 talk was basically a summary of that paper, so what’s in the paper is actually in the public domain via youtube.

    That paper wasn’t directly aimed at the role of vitamin D in health, but rather how the Matrix metalloproteinase 13 (Mmp13, collagenase-3) gene/protein is regulated by 3 stretches of DNA that are far from the actual Mmp13 gene.

    Now the reason Pike’s discovery was obviously celebrated at the ENCODE 2015 conference is that it illustrated how a gene is regulated by distant DNA stretches. Pike’s team deleted a 427 base pair stretch of DNA that was 10,000 bases away from the actual Mmp13 gene’s promoter, and then another 582 base pair stretch of DNA that was 30,000 bases away. This is really amazing in that it may show how non-coding regions are recruited in regulation — these DNA regions are used as parking lots for the Vitamin D receptor (VDR) and Runx2.

    This is a description of the VDR.

    https://en.wikipedia.org/wiki/Calcitriol_receptor

    One end of the VDR attaches to the DNA directly or via some complex, and then vitamin D attaches to another part of the VDR to make the VDR work. So Pike’s work illustrates a specific way vitamin D influences gene regulation through a very intricate cascade of events.

    Btw, this was about vitamin D3 which is one of the 5 forms of vitamin D:
    https://en.wikipedia.org/wiki/Cholecalciferol

    Now, Pike’s focus was only a small sample of the effect of vitamin D, because he looked at vitamin D’s effect on only one gene (Mmp13).

    But the above video of the HAT complex (Histone Acetylase complex) shows how vitamin D could affect thousands of genes:

    This is because the Histone Acetylase complex often has a Vitamin D receptor attached to it.

    Pike’s work was cited in this book about vitamin D, by the way (which retails for $168):

    https://books.google.com/books?id=-OSECgAAQBAJ&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false

    Your question motivates me to make sure I have adequate levels of vitamin D. I didn’t realize how vital it was to health!

    Unfortunately, because of the intricacy of life, vitamin D can’t be a cure-all, since there are so many other things that can go wrong when one is dealing with cancer. But work like Pike’s and other researchers’ at least shows why the importance of nutrition and maintaining vitamin D levels isn’t voodoo medicine. It has a rational basis in the way our cells work at the molecular level.

    Agonizing, tedious and detailed experiments such as Pike’s help drive home the point of how important vitamin D is.

    Btw, this also suggests what might happen if there is too much vitamin D. That may be bad too.

  9. Below is a screen capture from about 25 minutes into Wesley Pike’s 2015 ENCODE presentation (youtube link below) which highlights reasons junk DNA isn’t necessarily junk:

    The -10k, -20k, -30k bubbles are the enhancer regions that basically serve as parking lots for the regulatory machinery that acts on the Mmp13 gene.

    The Mmp13 gene is represented by the black squares of Mmp13 exons and the GTA (general transcription assembly) bubble. The diagram is also in the paywalled article that colewd mentioned.

    This diagram is really a 2-dimensional conceptual picture of how 3-dimensional DNA positions the parking lots (enhancers) for regulatory machines like the Vitamin D receptor (VDR) and RUNX2 assembly. The enhancer parking lots are marked “-10k”, “-20k”, “-30k” and are right there near the promoter region (marked “Pro”) of the Mmp13 gene.

    So this shows how junk DNA can be used as a parking lot/scaffold for the complex regulatory machines to park themselves near genes. The parking lots have specific signs (junk DNA sequences) on them that help regulatory machinery like the vitamin D receptor find the right parking lot. We might call those junk DNA sequences “receptor binding sites”.

    As mentioned earlier the -10k parking lot is 427 bases and the -30k parking lot is 482 bases. I did not find the size of the -20k parking lot in Pike’s paper.

    But if ENCODE is right that 80-100% of the DNA is transcribed, there is a chance these parking lots may also create functional RNA transcripts which would be eRNA transcripts:

    https://en.wikipedia.org/wiki/Enhancer_RNAs

    Whether this happens in this gene remains to be seen, but eRNA transcription does happen in other gene enhancers.

    If junk dna enhancer regions exist which act as scaffold/parking lots/receptor binding regions but also provide functional eRNAs when transcribed, then this would be an amazing dual-use Rube Goldbergesque design that should just boggle the mind!

  10. stcordova,

    Unfortunately, because of the intricacy of life, vitamin D can’t be a cure-all, since there are so many other things that can go wrong when one is dealing with cancer. But work like Pike’s and other researchers’ at least shows why the importance of nutrition and maintaining vitamin D levels isn’t voodoo medicine. It has a rational basis in the way our cells work at the molecular level.

    I think if we had everyone measure blood levels of vitamin D and then get supplements, we could reduce cancer by as much as 75%. Since cells have many redundant ways to avoid cancer, there are only a few critical problems to get on top of. First, to have cancer you need to produce stem cells. Differentiated tumors are not cancerous because they only grow in specific tissue and cannot metastasize. You also need to have DNA repair and apoptosis fail during the cell cycle. Evidence is strong that low vitamin D levels can create failures in all of these areas, and very few other mutations or mechanisms can. One example is if the APC gene’s binding capability is lost through mutation and it can no longer down-regulate beta catenin, which can create cancer through a group of protein chain reactions. The area of Pike’s presentation that was very interesting was the data showing that at early cell formation vitamin D was not regulating basal cell expression. This has got me thinking about the problem very differently and I am hopeful it will lead to new discovery. BTW, the best source of vitamin D is the sun, and unlike oral consumption, vitamin D from the sun is regulated so you cannot OD.

    colewd,

    I agree with you that Pike’s data is a case for additional function to the genome even if it is “parking lot” function vs. critical sequence function.
    Sal, thanks so much for your thoughts 🙂

  11. stcordova: Below is a screen capture from about 25 minutes into Wesley Pike’s 2015 ENCODE presentation (youtube link below) which highlights reasons junk DNA isn’t necessarily junk:

    ok. but no one is claiming junk DNA is necessarily junk.

  12. Sal, thanks so much for your thoughts 🙂

    Well, I’ve been blessed talking to you because a few years ago I was feeling really sick and my blood tests showed low levels of vitamin D. The doc prescribed huge doses since my vitamin D levels were so low.

    Unfortunately, I stopped taking the supplements, and I think I need them and/or better nutrition. My skin is very sensitive to sunlight so I need to find other ways to maintain my vitamin D levels. So thanks for bringing up the subject. It motivates me to take care of that aspect of my life.

  13. stcordova,

    It’s hard to get vitamin D through food. Next best to food is supplements. Just get a little sun every day. If your skin is sensitive, that means you probably convert really efficiently. Get your vitamin D level tested regularly until you have created a process to keep it around 50 ng/ml. Amazing that we need sunlight just like plants :-) I can’t tell you how useful the youtube video was. My head is spinning with ideas right now.

  14. I can’t tell you how useful the youtube video was. My head is spinning with ideas right now.

    Wow. Well your enthusiasm made me look at the video some more. I had to double check the astonishing figure (8 minutes into the video) that vitamin D affected 7007 genes/sites in pre-osteoblasts. The graph in question is pictured below.

    “Veh” apparently means the cell without vitamin D and 1,25D3 means the cell with vitamin D.

    Now pre-osteoblasts are only 1 of maybe 213 cell types, so vitamin D might be affecting even more of the 20,000 possible coding genes/sites than just 7,007.

    Amazing that he said that most of the “parking lot” sites were in introns (widely viewed by some as pure junk) or intergenic regions. Those introns could easily be in genes other than the one being regulated. Amazing!

    Dr. Pike referred to ChIP-seq several times. I hope to learn more about ChIP-seq technology in a few weeks.

    http://www.chipseq.com/

  15. stcordova,

    What has my attention are the graphs that show vitamin D inactive in promoting early cell development prior to cell differentiation. That says that if your cells are low on vitamin D, then they “think” they are developmental cells again. Cancer requires rapid cell division and stem cell creation. This says the answer to understanding cancer may be understanding embryology better. The relevant graphs start at about 18:53 and show little vitamin D related basal gene expression at early stages, ramping up quickly during differentiation (stem cells becoming tissue specific).

  16. colewd,

    I sent you a private message regarding vitamin D. I have a link on the topic I’d prefer not to share publicly. Thanks.

    Sal

  17. stcordova,

    I provided good evidence we could suspect introns (which are 25% of the genome) are functional.

    The presence of an intron, and the necessity of its entire contribution to genome length for that function, are two different things, as I have said. You can’t corral 25% for the ‘functional’ case like that. It’s a variant of the “here’s a functional LINE, that’s 17%” argument. Oh, here it is …

    I provided evidence transposable elements (17-21%) are functional for 2 reasons (somatic transposition to make unique neuron transcriptomes and genomes, yes genomes), and histones on transposable elements.

    […]

    Granted, the lines of evidence I offered must be extrapolated to give high levels of function, but at least I cited experiments and lab measurements, which is more than what the critics here have done.

    What experimental protocol would you suggest to counter the kinds of extrapolation that you are performing? If it turns out that these percentages really are overwhelmingly nonfunctional, would that have been money well spent? I find it hard to envisage an experiment that would persuade you. Because your counter would simply be “they didn’t look hard enough”.

    You seem mighty certain that ENCODE is right and all critics wrong. That seems based on sand.

    Look to the fugu and consider her ways. Look to the lungfish and despair.

  18. Look to the fugu and consider her ways. Look to the lungfish and despair.

    You seem to be going on the false assumption that the lungfish utilizes its DNA in about the same way as other creatures.

    It turns out the more we look the less “conserved” or similar the actual functioning of the combinatorial regulation schema is between species.

    28-30 minutes in, Tjian alludes to it:

    Amazing that medical researchers are the ones discovering how genomes really work vs. the chasers of imaginary phylogenies who only provide imaginary stories of why genomes shouldn’t work.

    And Jerry Coyne knows who is at the bottom of science’s pecking order.

  19. stcordova,

    You seem to be going on the false assumption that the lungfish utilizes its DNA in about the same way as other creatures.

    It is an assumption, sure. In what way is it deemed false? On the one hand, a handful of functional transposons is enough for you to ascribe function to every single one; on the other, every species has to be investigated one by painful one before you’d accept any generalisation regarding nonfunctional DNA. Inconsistent, much?

    One could indeed construct an elaborate ad hoc world where every single organism operates differently, genomically, from every other, but this would conflict with what is known. And why would one do that, other than to prop up one’s argument on one particular species? This seems a particularly ‘maybe’-infested attempt to preserve your speculations about the function of junk in humans.

    Amazing that medical researchers are the ones discovering how genomes really work vs. the chasers of imaginary phylogenies who only provide imaginary stories of why genomes shouldn’t work.

    Who has said the genome shouldn’t work? That is a ridiculous misrepresentation of the junk argument. Yet another.

    Evolutionary biology is not genomic analysis and it is not medical research, so of course it is not going to uncover the things it is not set up to uncover. It does make a vital contribution to understanding, nonetheless. Amazing, meanwhile, how little contribution Creationism has made so far. It blinds people to objective analysis, as far as I can see. You simply decide X is true – the genome is functional, say – and ignore or discard all contrary evidence. Meanwhile, all the evidence you adduce comes from the part no-one disputes: the conserved part. Fortunately not everyone does science like that.

  20. One could indeed construct an elaborate ad hoc world where every single organism operates differently, genomically, from every other, but this would conflict with what is known.

    We hardly know anything relative to what could be known in terms of how everything connects. It’s premature to pronounce stuff non-functional; that’s a blind-faith argument from willful ignorance. It’s a science-stopping attitude.

    If things work differently between organisms, to the extent they do, that is medically important since we base so much human medicine on study of other creatures mostly for practical and ethical reasons.

    If lungfish and humans have different cell types, tissue types, and developmental mechanisms, stuff could work very differently. This would solve the C-value paradoxes, but, well, evolutionary biologists don’t seem that interested in seeing whether organisms actually implement molecular machines differently; they’ve decided they are extremely similar.

    Even with what little exposure I’ve had to biochemistry, I can already see that the histones in a variety of yeasts, plants and humans have different histone codes and machines to implement them. Given it may take decades just to look at a few histone modifications in species and the combinatorial regulatory complexes etc., I think insistence that “stuff works so similarly at the molecular level that lungfish and humans therefore must be constructed with junk DNA” is premature. It’s a science-stopping attitude.

  21. stcordova,

    We hardly know anything relative to what could be known in terms of how everything connects. It’s premature to pronounce stuff non-functional; that’s a blind-faith argument from willful ignorance. It’s a science-stopping attitude.

    Bollocks. I’m not taking that from a Creationist! No-one is prevented from doing any research they wish. But at what point will it stop being ‘premature’? Answer, never. That is my point. You have decided it is all functional, and nothing, but nothing will dissuade you from that position.

    If things work differently […]

    If lungfish and humans have different […]

    If, if, if. What reason have you given for these suppositions, besides the fact that it helps insulate your expectation vis a vis the human genome from inconvenient facts? It seems a perfectly reasonable default position that DNA at the cellular level works in much the same way throughout Animalia, and probably well beyond. Of course there will be differences in detail. But transposons are transposons. And histones are one of the most highly conserved of all proteins. For obvious reasons.

    Even with what little exposure I’ve had to biochemistry, I can already see that the histones in a variety of yeasts, plants and humans have different histone codes and machines to implement them.

    Histone regulation relates to conserved genetic regions. In untranscribed regions, histones act to prevent transcription. See, even the genome knows it’s junk!

    Given it may take decades just to look at a few histone modifications in species and the combinatorial regulatory complexes etc., I think insistence that “stuff works so similarly at the molecular level that lungfish and humans therefore must be constructed with junk DNA” is premature.

    Who on earth has said anything remotely resembling that garbled paraphrase?

  22. A further point is that, initially, the idea that the genome was mostly junk was resisted. It was only through investigation that Ohno’s load argument gained support. I myself was perfectly happy to accept 100% when at uni. I’d never heard of an intron or a transposon. I’m still happy to accept ~100% for many organisms. But the evidence for junk as the explanation of the C-value paradox, and the role of constraints that lead to its reduction in certain organisms, is considerable.

    Bulk function for any putative junk class is completely absent, despite all the ‘maybes’. If it turns up, I’ll change my mind. But at the moment, the ball is in the ‘perfectionists’ court.
