On the ‘Evolving Complex Adaptations’ thread, a side-discussion arose with CharlieM over Behe’s ‘CCC’ argument. In summary, Behe places an event of probability 10^-40 as the upper bound or ‘Edge of Evolution’. If a specific single mutation has a probability of 10^-10 of arising in any one replication event, a specific double mutation has a probability of 10^-20, a triple 10^-30, and a quadruple 10^-40 – that is, if four independent changes must happen simultaneously before a particular step is achievable, then that step cannot realistically have occurred in the history of life on earth. Behe thinks he has found a case with a probability of 10^-20 in the resistance to chloroquine in the malaria parasite Plasmodium falciparum.
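The arithmetic behind those figures is just the product rule for independent events; a minimal sketch in Python (the 10^-10 per-mutation probability is Behe’s assumption, not an established constant):

```python
# Probability of k specific mutations co-occurring in one replication
# event, assuming independence (the multiplication step in the argument).
P_SINGLE = 1e-10  # assumed probability of one specific mutation per replication

def simultaneous(k: int, p: float = P_SINGLE) -> float:
    """Probability that k independent specific mutations arise together."""
    return p ** k

for k in range(1, 5):
    print(k, simultaneous(k))
```

The whole dispute is over whether the changes really must be simultaneous; if they can accumulate one at a time, the probabilities add across generations rather than multiply within one.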
A little bit of this and a little bit of that. A new and cheap way to implement genetic engineering, and a way to bypass the usual rules of population genetics.
The stakes, however, have changed. Everyone at the Napa meeting had access to a gene-editing technique called Crispr-Cas9. The first term is an acronym for “clustered regularly interspaced short palindromic repeats,” a description of the genetic basis of the method; Cas9 is the name of a protein that makes it work. Technical details aside, Crispr-Cas9 makes it easy, cheap, and fast to move genes around—any genes, in any living thing, from bacteria to people. “These are monumental moments in the history of biomedical research,” Baltimore says. “They don’t happen every day.”
ANY GENE TYPICALLY has just a 50–50 chance of getting passed on. Either the offspring gets a copy from Mom or a copy from Dad. But in 1957 biologists found exceptions to that rule, genes that literally manipulated cell division and forced themselves into a larger number of offspring than chance alone would have allowed.
A decade ago, an evolutionary geneticist named Austin Burt proposed a sneaky way to use these “selfish genes.” He suggested tethering one to a separate gene—one that you wanted to propagate through an entire population. If it worked, you’d be able to drive the gene into every individual in a given area. Your gene of interest graduates from public transit to a limousine in a motorcade, speeding through a population in flagrant disregard of heredity’s traffic laws. Burt suggested using this “gene drive” to alter mosquitoes that spread malaria, which kills around a million people every year. It’s a good idea. In fact, other researchers are already using other methods to modify mosquitoes to resist the Plasmodium parasite that causes malaria and to be less fertile, reducing their numbers in the wild. But engineered mosquitoes are expensive. If researchers don’t keep topping up the mutants, the normals soon recapture control of the ecosystem.
Push those modifications through with a gene drive and the normal mosquitoes wouldn’t stand a chance. The problem is, inserting the gene drive into the mosquitoes was impossible. Until Crispr-Cas9 came along.
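The difference between ordinary Mendelian transmission and a drive can be seen in a toy allele-frequency model (a minimal sketch assuming random mating, no fitness cost, and perfect conversion in heterozygotes – idealizations, not measured parameters):

```python
# Frequency of a driven allele under random mating. An ideal gene drive
# makes heterozygotes transmit the drive to all offspring instead of
# the Mendelian 50%, so p' = p^2 + 2*p*(1-p)*transmit.
def next_freq(p: float, drive: bool) -> float:
    transmit = 1.0 if drive else 0.5  # heterozygote transmission rate
    return p * p + 2 * p * (1 - p) * transmit

p_mendel = p_drive = 0.01  # release drive carriers at 1% allele frequency
for gen in range(10):
    p_mendel = next_freq(p_mendel, drive=False)
    p_drive = next_freq(p_drive, drive=True)
print(round(p_mendel, 4), round(p_drive, 4))
```

Under Mendelian transmission the frequency stays at 1%; under the ideal drive it satisfies 1 − p′ = (1 − p)², so it approaches fixation within about ten generations even from a tiny release.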
Emmanuelle Charpentier did early work on Crispr.

Today, behind a set of four locked and sealed doors in a lab at the Harvard School of Public Health, a special set of mosquito larvae of the African species Anopheles gambiae wriggle near the surface of shallow tubs of water. These aren’t normal Anopheles, though. The lab is working on using Crispr to insert malaria-resistant gene drives into their genomes. It hasn’t worked yet, but if it does … well, consider this from the mosquitoes’ point of view. This project isn’t about reengineering one of them. It’s about reengineering them all.
The Principle of Sufficient Reason is a powerful and controversial philosophical principle stipulating that everything must have a reason or cause. This simple demand for thoroughgoing intelligibility yields some of the boldest and most challenging theses in the history of metaphysics and epistemology. In this entry we begin with explaining the Principle, and then turn to the history of the debates around it.
I am convinced that a broadly materialist view of the world must possess three essential features.
First, for a worldview to be materialistic, there must be a mechanistic base level.
Second, the level of basic physics must be causally closed.
Third, whatever is not physical, at least if it is in space and time, must supervene on the physical.
This understanding of a broadly materialistic worldview is not a tendentiously defined form of reductionism; it is what most people who would regard themselves as being in the broadly materialist camp would agree with, a sort of “minimal materialism.”
To the atheists:
Some of you know you’re materialists, some of you suspect it, others try to deny it or don’t like to be identified as such. But if you’re an atheist what else do you have?
Just out of interest … this word gets bandied about a lot, mainly by evolution opponents hereabouts. They seem to use it when a word with multiple meanings is used. The accusation tends not to be withdrawn even when the intended meaning is unequivocally clarified – a bizarre situation where someone commits to a meaning and is still equivocating!
A typical definition is “The use of ambiguous language to conceal the truth or to avoid committing oneself”. There is a veiled hint of dishonesty – making an honest mistake with alternative definitions of a word is not strictly equivocation as defined there. That is, it is not merely ‘using ambiguous language’, still less ‘confusing two definitions of one word’, but purposefully being vague or misleading. But the use of the word rarely seems appropriate to me in the contexts in which it is used – generally, even the charge of ambiguity is unjustified, let alone nefarious motive. Numerous derails are provoked when one party says ‘you are equivocating’ and the other says ‘no I’m not’. I almost invariably find myself siding with (or being) the ‘no I’m not’ party (or, for self-referential funzies, “maybe I am, maybe I’m not”!).
Is this a quirk of American English (Americans forming the majority of opponents in these discussions)? Or is it a meme that has been unconsciously passed from one to another among the evolution-skeptical fraternity? Or something else?
Here’s a simple thought-experiment. There’s a fire at a fertility clinic, and there is precious little time before the entire building is engulfed in flames. Down one hallway, there’s the soft purring sound of an incubator with a thousand frozen embryos; down the other hallway, the cries of a newborn baby. Which do you choose to save?
Usually, people answer “the baby” and the interesting debate then concerns why.
A long-standing challenge to evolutionary theory has been whether it can explain the origin of complex organismal features. We examined this issue using digital organisms—computer programs that self-replicate, mutate, compete and evolve. Populations of digital organisms often evolved the ability to perform complex logic functions requiring the coordinated execution of many genomic instructions. Complex functions evolved by building on simpler functions that had evolved earlier, provided that these were also selectively favoured. However, no particular intermediate stage was essential for evolving complex functions. The first genotypes able to perform complex functions differed from their non-performing parents by only one or two mutations, but differed from the ancestor by many mutations that were also crucial to the new functions. In some cases, mutations that were deleterious when they appeared served as stepping-stones in the evolution of complex features. These findings show how complex functions can originate by random mutation and natural selection.
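The stepping-stone result – mutations that were deleterious when they appeared later proving crucial – can be illustrated with a toy hill-climb on a made-up fitness landscape (a hypothetical sketch, not Avida itself; the landscape, acceptance rate, and step counts are invented for illustration):

```python
import random

# Toy one-dimensional fitness landscape with a valley between a local
# peak (x=2, fitness 3) and a higher peak (x=5, fitness 6).
FITNESS = {0: 1, 1: 2, 2: 3, 3: 1, 4: 2, 5: 6}

def evolve(accept_deleterious: bool, steps: int = 2000, seed: int = 1) -> int:
    """Random walk of point mutations; return the best fitness reached."""
    rng = random.Random(seed)
    x, best = 0, FITNESS[0]
    for _ in range(steps):
        y = min(5, max(0, x + rng.choice([-1, 1])))  # a point mutation
        # Accept the mutation if it is not worse, or occasionally even
        # if it is deleterious (the stepping-stone case).
        if FITNESS[y] >= FITNESS[x] or (accept_deleterious and rng.random() < 0.1):
            x = y
            best = max(best, FITNESS[x])
    return best

print(evolve(False), evolve(True))
```

The strictly-uphill walker stalls on the local peak (fitness 3), while the walker that sometimes accepts deleterious moves can cross the valley to the higher peak (fitness 6) – a cartoon of the paper’s finding that deleterious intermediates served as stepping-stones.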
The thing about a computer instantiation of evolution like AVIDA is that you can check back every lineage and examine the fitness of all precursors. Not only that, but you can choose your own environment, and how much selecting it does. There are some really key findings:
There is already a thread here dedicated to the book, but I decided to separate the thesis of the book from the actual natural theological arguments themselves. The evidence that the premises upon which these natural theological arguments rest are natural and intuitive is the subject of that thread.
In this thread I’d like to explore how the cosmological argument for the existence of God is presented in the book and provide a place where these cosmological arguments can be examined and criticized.
…This brings us back to the UC Berkeley “Understanding Evolution” website. It abuses science in its utterly unfounded claim that “natural selection can produce amazing adaptations.”
In fact natural selection, even at its best, does not “produce” anything. Natural selection does not and cannot influence the construction of any adaptations, amazing or not. If a mutation occurs which improves differential reproduction, then it propagates into future generations. Natural selection is simply the name given to that process. It selects for survival that which already exists. Natural selection has no role in the mutation event. It does not induce mutations, helpful or otherwise, to occur. According to evolutionary theory every single mutation, leading to every single species, is a random event with respect to need.
He has forgotten what “adaptation” means. Of course he is correct that “Natural selection is simply the name given to [differential reproduction]”. And that (as far as we know), “every single mutation …is a random event with respect to need”.
And “adaptation” is the name we give to variants that are preferentially reproduced. So while he would be correct to say that “natural selection” is NOT the name we give to “mutation” (duh); it IS the name we give to the very process that SELECTS those mutations that promote reproduction. i.e. the process that produces adaptation.
Cornelius should spend more time at the Understanding Evolution website.
Questions about the existence and attributes of God form the subject matter of natural theology, which seeks to gain knowledge of the divine by relying on reason and experience of the world. Arguments in natural theology rely largely on intuitions and inferences that seem natural to us, occurring spontaneously — at the sight of a beautiful landscape, perhaps, or in wonderment at the complexity of the cosmos — even to a non-philosopher.
Gathering my thoughts on moderation at TSZ, I found that I really have two OPs to write: one discussing the effects of rules and moderation at TSZ, and another exploring why the moderation — particularly the Guano-related stuff — has those effects. The second topic is by far the more interesting, but it’s the first topic that has the most practical import, so I’ll address it now.
In a nutshell: We’ve already experimented with different levels of moderation at TSZ, and the results are in. Less moderation works better.
Having taken a brief break from commenting, and still taking a break from proactive moderating, I still find myself sucked into reading OPs and comments. Not wanting to stir the hornets’ nest of currently active discussion, and not having enough time to get up to speed on all the deep issues of the day that commenters are discussing, I wondered about opening a thread titled something like “Suggestion Box” to gather people’s thinking on any improvements Lizzie could consider that might ameliorate concerns over site policy, aims, aspirations, etc.
[Here is something I just sent Casey Luskin and friends regarding the ENCODE 2015 conference. Some editorial changes to protect the guilty…]
One thing the ENCODE consortium drove home is that DNA acts like a dynamic random-access memory for methylation marks. That is, even though the DNA sequence itself isn’t changed, its regulatory state can be modified – just as the contents of computer RAM can be rewritten without the chip being physically replaced. The repetitive DNA acts like physical hardware: even though the repetitive sequences aren’t changed, they can still serve as memory storage for regulatory information. ENCODE collects huge amounts of data on methylation marks at various stages of the cell cycle. This is like trying to take a few snapshots of a computer’s memory to figure out how Windows 8 works. The complexity of the task is beyond description.
Simply put, liberals/progressives are the ones who, IMO, are going to utilize these services the most. So, yeah, the fewer babies they get to raise, and the earlier we can stop them from voting, the better. On the conservative side we have the Duggars and highly religious people breeding like crazy and clinging to life for every breath they can take – which puts and keeps more conservatives in the voting pool longer.
So, as a pragmatic political matter, I say let ’em abort their young and kill themselves off to their heart’s content.
[Thank you to Elizabeth Liddle, the admins and the mods for hosting this discussion.]
I’ve long suspected that the 3.1 to 3.5 gigabases of human DNA (roughly 775 to 875 megabytes at two bits per base) is woefully insufficient to create something as complex as a human being. The problem is that there is only limited transgenerational epigenetic inheritance, so it’s hard to assert that large amounts of information are stored outside the DNA.
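The megabyte conversion is just two bits per base (four possible letters); a quick check of the arithmetic, using decimal megabytes:

```python
# Each DNA base is one of four letters (A, C, G, T), so it carries
# 2 bits of information; 8 bits per byte means 4 bases per byte.
def gigabases_to_megabytes(gb: float) -> float:
    bases = gb * 1e9
    bits = bases * 2
    return bits / 8 / 1e6  # bytes -> decimal megabytes

print(gigabases_to_megabytes(3.1), gigabases_to_megabytes(3.5))
```

At two bits per base, 3.1–3.5 gigabases corresponds to roughly 775–875 megabytes, i.e. less than a single DVD.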
Further, the question arises: how is this non-DNA information stored? It is not easy to localize; in fact, if there is a large amount of information outside the DNA, it is in a form that is NOT localizable but distributed, and so deeply redundant that it provides the ability to self-heal and self-correct for injury and error. If so, then in a sense damage and changes to this information-bearing system are not very heritable, since bad variation in the non-DNA information source can get repaired and reset; otherwise the organism just dies. In that sense the organism is fundamentally immutable as a form, suggestive of a created kind rather than something that can evolve in the macro-evolutionary sense.