Algorithmic Specified Complexity and the Game of Life

Ewert, Dembski, and Marks have a forthcoming paper, “Algorithmic Specified Complexity and the Game of Life”. It appears to be behind a paywall, though. Can anyone provide a copy?

Please note: any comments not directly addressing the math or mechanics of this post will be moved to Guano (thanks Neil and Alan).

My earlier post:

1. In Conway’s Life: http://en.wikipedia.org/wiki/Conway%27s_Game_of_Life
2. There is the Glider-Producing Switch Engine http://conwaylife.com/wiki/Glider-producing_switch_engine
3. It is coded by 123 “On Cells” but requires a space of 67×60 in a specific configuration.
4. That’s 4,020 bits (one bit per cell of the 67×60 box), well above the Universal Probability Bound (UPB).
5. It contains well-matched parts: 4 blinkers, 3 blocks, 2 beehives, 1 boat, 1 loaf, 1 ship, 1 glider http://wwwhomes.uni-bielefeld.de/achim/moving.html
6. It occurs naturally out of randomly configured dust : http://wwwhomes.uni-bielefeld.de/achim/moving.html
7. It can evolve from a much smaller entity (“time bomb” – 17 active cells): http://conwaylife.appspot.com/pattern/timebomb
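The bit count in point 4 is easy to check. A minimal sketch, assuming (as the post does) one bit per cell of the 67×60 bounding box and the commonly cited ~500-bit UPB:

```python
# Bounding box of the glider-producing switch engine (point 3 above)
width, height = 67, 60
cells = width * height   # each cell is either on or off: one bit apiece

# Dembski's Universal Probability Bound of ~1 in 10^150 corresponds
# to roughly 500 bits.
upb_bits = 500

print(cells, cells > upb_bits)  # 4020 True
```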

Possible criticisms:

Information is hidden somewhere
This is under “standard” Life rules (B3/S23), which means there is precious little exogenous information:

1. Any live cell with fewer than two live neighbours dies, as if caused by under-population.
2. Any live cell with two or three live neighbours lives on to the next generation.
3. Any live cell with more than three live neighbours dies, as if by overcrowding.
4. Any dead cell with exactly three live neighbours becomes a live cell, as if by reproduction.

(Source: http://en.wikipedia.org/wiki/Conway%27s_Game_of_Life#Rules)
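The four rules compress to the B3/S23 shorthand: a dead cell is Born with exactly 3 live neighbours, and a live cell Survives with 2 or 3. A minimal sketch of one generation step (my own set-of-live-cells representation, not anything from the paper):

```python
from collections import Counter

def step(live):
    """Advance a set of (x, y) live cells one generation under B3/S23."""
    # Count how many live neighbours each candidate cell has.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Born on 3 neighbours; survive on 2 or 3.
    return {
        cell
        for cell, n in counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A blinker (three cells in a row) oscillates with period 2.
blinker = {(0, 0), (1, 0), (2, 0)}
print(step(step(blinker)) == blinker)  # True
```

Representing the universe as a set of live coordinates sidesteps any fixed grid size, which matters below when patterns sit far apart.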

These are not self-replicating
Self-replication is not actually a requirement of Specified Complexity, and in any case the switch engine does send some of its parts (gliders) off into the Life universe.

Also interesting – some musings on how big a life universe might have to be to support self-replicating life: http://ieet.org/index.php/IEET/more/prisco20140915

106 thoughts on “Algorithmic Specified Complexity and the Game of Life”

  1. I think their actual position is not that rules don’t affect the probability of particular patterns, but that they can’t calculate how they affect them. So they decided to approximate the probabilities as being those of random patterns, and they hope that this is a good enough approximation.

    That seems to me to be a tornado-in-junkyard approximation.

  2. The claim in the appendix by EDM (Ewert, Dembski, and Marks) is false.

    In the appendix, EDM write:

    However, because a cell is only affected by its immediate neighbors, cells cannot affect the state of other cells which are sufficiently far away. We can thus ignore any pattern containing cells too far away to interact with one another. How far away is sufficient? We can inspect the equations we are testing against to see the number of ⊕ operations, after taking repetition into account. This gives us the number of iterations that could be checked, and thus the size of the observable universe for any given cell. We are not interested in any pattern where there is a gap larger than the size of the observable universe. Let U = L + T + 1 where L is the number of living cells in a pattern, and T is the number of ⊕ operations. Given a bounding-box larger than U ×U, there must exist a gap larger than the size of the observable universe.

    A counterexample is easy to provide. Consider a 100×100 square grid, empty except for two blinkers in opposite corners. There are six live cells, so L = 6. The blinker has a period of two, so T = 2. L + T + 1 = 9, so even a mere 10×10 bounding box should be more than sufficient, by EDM’s logic — but of course a 10×10 box isn’t even large enough to capture the pattern.

    Even our 100×100 box isn’t sufficiently large, though it satisfies EDM’s constraint. There could be live cells immediately outside the bounding box, next to the blinkers, that could interfere with their operation.

    The source of EDM’s error is easy to spot. They assumed that patterns are always formed out of contiguous live cells, which is false.
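The counterexample can be run directly. A hedged sketch using a simple set-based Life step of my own (not EDM’s code): two blinkers in opposite corners evolve exactly as each would alone, yet the pattern’s bounding box is 100×100, vastly larger than the U × U = 9 × 9 that EDM’s bound permits.

```python
from collections import Counter

def step(live):
    """One generation of B3/S23 on a set of (x, y) live cells."""
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# Two blinkers in opposite corners of a 100x100 region: L = 6 live cells,
# so EDM's U = L + T + 1 = 9.
blinker_a = {(0, 0), (1, 0), (2, 0)}
blinker_b = {(97, 99), (98, 99), (99, 99)}
pattern = blinker_a | blinker_b

# The gap between the blinkers is far too wide for them to interact,
# so the combined pattern still has period 2...
print(step(step(pattern)) == pattern)  # True
# ...even though its bounding box is 100x100 >> 9x9.
```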
