I just checked out Seth Lloyd’s paper, Computational Capacity of the Universe, and found, interestingly (although I now remember that this has been mentioned before), that his upper limit on the number of possible operations is 10^120 (roughly 400 bits) rather than 10^150 (the more generous 500 bits usually proposed by Dembski). I also found that his calculation is based on the volume of the universe within the particle horizon, which he defines as:
…the boundary between the part of the universe about which we could have obtained information over the course of the history of the universe and the part about which we could not.
In other words, that 400 bit limit applies only to the region of the universe observable by us, which is almost certainly a small fraction of the whole. A conservative lower limit on the ratio of the entire universe’s volume to the volume within the particle horizon seems to be about 250, and it could be as large as 10^23. But since Lloyd’s figure counts operations, which scale with volume, the correction multiplies the 10^120 rather than the 400 bits: a factor of 250 adds only about 8 bits (10^120 × 250 ≈ 10^122), while a factor of 10^23 takes it to roughly 10^143, or about 475 bits. And if the universe is spatially infinite, as many models allow, there is no finite bound at all.
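For concreteness, the bit arithmetic can be sketched as follows. This is just a back-of-the-envelope check, and it assumes (my reading, not anything in Lloyd’s paper) that the 250 and 10^23 figures are ratios of total volume to observable volume:

```python
import math

def ops_to_bits(ops_exponent: float) -> float:
    """Bit-equivalent of 10**ops_exponent operations: log2(10**ops_exponent)."""
    return ops_exponent * math.log2(10)

# Lloyd's 10^120 operations and Dembski's 10^150 bound, in bits:
lloyd_bits = ops_to_bits(120)    # ~398.6 bits (the "400 bits")
dembski_bits = ops_to_bits(150)  # ~498.3 bits (the "500 bits")

# Assumption: 250 and 10^23 are volume ratios of the whole universe to the
# observable part. Multiplying the operation count by a volume ratio ADDS
# log2(ratio) bits; it does not multiply the bit count.
bits_times_250 = lloyd_bits + math.log2(250)  # ~406.6 bits
bits_times_1e23 = ops_to_bits(120 + 23)       # ~475.0 bits
```

So even the generous 10^23 factor raises the bound to only about 475 bits, still shy of Dembski’s 500; it is the possibility of a much larger, or infinite, universe that does the real damage.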
Which leaves Dembski’s 500-bit threshold hanging on how much larger than the observable region the universe actually is, even if we assume that P(T|H) really does represent the entire independent-random-draw configuration space and is the “relevant chance hypothesis” for life.
But I’m no cosmologist – any physicist like to weigh in?