652 thoughts on “Evolving Wind Turbine Blades”

  1. Frankie: Of course the GA produces the design- that is what it is doing-> producing designs that can either be kept or eliminated

    Alan Fox now:

    Alan Fox: This is obviously wrong.

    Alan Fox then:

    Alan Fox: The GA (as in the example in the OP) is acting like a filter or sieve, selecting the best of the randomly generated candidates. Obviously, the only parameter in this case is theoretical power output generated by the candidate designs.

    LoL.

  2. Mung: Because the initial starting population was randomly generated, unlike evolution.

    Wanna bet what happens if you start a good GA with a bunch of designs for the starting population?

  3. Alan Fox: I’m merely speculating how your argument might go, if you end up making one.

    Why don’t you speculate on your future responses and go ahead and send them to Guano now and save us all the drama in between?

  4. Mung:

    Frankie: Of course the GA produces the design- that is what it is doing-> producing designs that can either be kept or eliminated

    Alan Fox now:

    Alan Fox: This is obviously wrong.

    The candidates, as in the OP example, are generated stochastically.

    Alan Fox then:

    Alan Fox: The GA (as in the example in the OP) is acting like a filter or sieve, selecting the best of the randomly generated candidates. Obviously, the only parameter in this case is theoretical power output generated by the candidate designs.

    LoL.

    I’m on a crusade to take back words that have been misappropriated. Just because I use “designs” in the ordinary sense doesn’t make me an ID sympathizer. There’s no conflict in what you quote from me above. You seem to be fond of semantic wordplay.

    Edited to correct out-of-sequence text.

  5. dazz: It can’t be both “contingent” and directed.

    LoL. Is any more evidence needed, any at all? Really? Evolution simply cannot be true because it invokes directed contingency! Evo-dice indeed.

  6. Mung: That depends on what you mean by fitness. Can you give an example?

    Biological fitness is measured in terms of breeding success. Fitter (in a particular niche) individuals in a population produce more offspring.

  7. Mung: That depends on what you mean by fitness. Can you give an example?

    Do you seriously need clarification, at this point, on what fitness is in a GA?

  8. Mung: That depends on what you mean by fitness. Can you give an example?

    Whereas in a GA, fitness is the specification that the GA is set to find. The OP uses the power output of the candidate blades in the wind tunnel simulation. The antenna example I linked to upthread uses the ability to receive radio signals.

  9. Mung: That depends on what you mean by fitness. Can you give an example?

    We could use an objective function, or the rank of an objective function, for fitness, for example.
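For concreteness, here is a minimal sketch of the two options mentioned (my illustration, not Richardthughes’ code; the function names are mine): using an objective function’s value directly as fitness, versus using its rank within the population.

```python
def raw_fitness(population, objective):
    """Fitness is the objective value itself."""
    return [objective(ind) for ind in population]

def rank_fitness(population, objective):
    """Fitness is the rank of the objective value (1 = worst)."""
    scores = [objective(ind) for ind in population]
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0] * len(scores)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return ranks

pop = [3.0, 10.0, 7.0]        # toy "individuals": just numbers
obj = lambda x: x * x         # toy objective function
print(raw_fitness(pop, obj))  # [9.0, 100.0, 49.0]
print(rank_fitness(pop, obj)) # [1, 3, 2]
```

Rank-based fitness keeps selection pressure stable even when raw objective values differ by orders of magnitude.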

  10. Mung: BLUFF. BLUSTER.

    GA’s can start with fixed populations. Wes’ initial implementation of evodice gave the entire population a genome of [1,1,1,1,1,1]. It still worked. It is important that generations can change, but the initial population can be fixed – we’ve even bootstrapped in good solutions from other solution classes before.

    Arguing without understanding, Mung. Must be an emotional thing.
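The point about fixed starting populations can be shown in a few lines. This is a toy stand-in in the spirit of evodice, not Wes’ actual implementation: every individual begins with the identical genome, and mutation plus selection still drives fitness well above the starting value.

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

def fitness(genome):
    return sum(genome)  # toy objective: maximize the gene sum (max is 36)

def mutate(genome):
    child = genome[:]
    child[random.randrange(len(child))] = random.randint(1, 6)  # dice-like genes
    return child

# Fixed, identical starting population -- no randomness in the initial genomes.
pop = [[1, 1, 1, 1, 1, 1] for _ in range(20)]
for _ in range(100):
    children = [mutate(p) for p in pop]
    # Keep the best 20 of parents plus children (elitist selection).
    pop = sorted(pop + children, key=fitness, reverse=True)[:20]

print(max(fitness(p) for p in pop))  # far above the starting fitness of 6
```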

  11. dazz: You’re partially right on that one, the specific paths in that algo were contingent on random variation, but it was directed towards a solution because the fitness function was defined in the same terms as the solution.

    But the weasel algo ALWAYS converges to the same solution because it’s directed, while GA’s don’t.

    In the weasel algo, if you repeat over and over again, it will always be going in the same direction: towards the target. True GA’s don’t do that.

    But, but, but … THEY TAKE DIFFERENT PATHS! That’s what happens when you have moving goalposts.

  12. Frankie, let me lay this out for you, because I think that perhaps where others have tried and failed I can try and succeed.

    GA’s use directed contingency.
    Evolution uses directed contingency.
    Therefore, GA’s demonstrate that Evolution is true.

    It follows that:
    GA’s are not designed to solve problems.
    GA’s are not searches.
    GA’s are not teleological.
    GA’s do not require a designer.
    And, ID is false.

  13. So GAs just miraculously find solutions to the problems they were designed to solve? They just stumble along randomly and badda-bing, a solution!

    The more likely explanation is they find solutions because they are directed to that end. Directed by the intelligent design of the program.

  14. Alan Fox,

    The antenna example is more of the same: directed evolution, i.e. evolution by design. It uses a pre-specification of the antenna requirements to guide all internal solutions towards the final solution.

    Do you think the proper antenna would have been found if the pre-specification wasn’t part of the program?

  15. Mung: But, but, but … THEY TAKE DIFFERENT PATHS! That’s what happens when you have moving goalposts.

    Of course you have no clue what moving the goalposts means. What a surprise.
    And of course I present arguments against your claim that local maxima are “solutions” and all you have is hand-waving.

  16. Alan Fox: Again, debates don’t answer scientific issues. Testing, experimenting, observing and analysing is what works.

    How do we test your claims?

  17. It’s amazing that, with all of this knowledge about GAs, neither Frankie nor Mung can build one. You can build one in Excel if you’re motivated enough.

  18. Richardthughes: GA’s can start with fixed populations.

    Who said that a GA cannot start with a set of identical candidate solutions? Me? Where?

    You and Alan Fox. Let’s make up an imaginary Mung and attribute to him imaginary thoughts. Who needs actual evidence?

  19. Richardthughes: It’s amazing that, with all of this knowledge about GAs, neither Frankie nor Mung can build one.

    Just how is it that you know that I cannot code a GA?

  20. Because in good faith I must believe all the things you’re getting wrong and your shitty previous attempt are not a cunning ruse of a GA mastermind.

  21. Maybe I’ll write a Pizza GA.

    The “best” pizza will be the one with the most randomly selected set of ingredients, because in this case our customer requirement is “surprise me!”

  22. Richardthughes: Because in good faith I must believe all the things you’re getting wrong and your shitty previous attempt are not a cunning ruse of a GA mastermind.

    So we have an actual starting point.

    Quote something that I have said about genetic algorithms that you know to be false. Be prepared to defend your claim.

  23. Richardthughes: How is “Find Pizza” different from “Find the best Pizza”?

    You were supposed to give examples from two different GA’s so that we could see how fitness had a different meaning in the two different GA’s.

    Find the best Pizza
    Find the best Ice Cream
    Find the best [fill in the blank]

    IOW, there is a specific instance of how fitness is to be judged and there is the abstraction of fitness. You’re conflating the two.

  24. Mung: You were supposed to give examples from two different GA’s so that we could see how fitness had a different meaning in the two different GA’s.

    Which I did:

    Richardthughes: We could use an objective function, or the rank of an objective function, for fitness, for example.

  25. Mung: Quote something that I have said about genetic algorithms that you know to be false. Be prepared to defend your claim.

    Easy:

    “Frankie: Why are the final blades more efficient than the starting blades?

    Mung:Because the initial starting population was randomly generated, unlike evolution.”

  26. Mung, write that GA.
    And don’t forget to add the logic that prints out a message every time it finds a solution.

    Do you think you can do that?

  27. Mung,

    Multiple criteria / constraints are not a problem. We can do it with any solution class. Sometimes we minimize a composite penalty function, which is a neat trick for LPs that can’t actually hit the target due to constraints.
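A composite penalty function of the kind described typically looks like the sketch below (my illustration, not Richardthughes’ actual code): each constraint’s violation is weighted and added to the objective, so an unconstrained minimizer, GA or otherwise, can work on the constrained problem.

```python
def composite_penalty(x, objective, constraints, weight=1000.0):
    """objective(x) plus a large penalty per unit of constraint violation.

    Each constraint g is satisfied when g(x) <= 0; positive values
    measure how badly it is violated.
    """
    violation = sum(max(0.0, g(x)) for g in constraints)
    return objective(x) + weight * violation

# Toy problem: minimize (x - 3)^2 subject to x <= 2.
obj = lambda x: (x - 3.0) ** 2
cons = [lambda x: x - 2.0]                # x - 2 <= 0, i.e. x <= 2

print(composite_penalty(2.0, obj, cons))  # 1.0    (feasible: no penalty)
print(composite_penalty(3.0, obj, cons))  # 1000.0 (violates x <= 2)
```

The weight trades off constraint satisfaction against objective quality; large weights make infeasible candidates lose to any feasible one.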

  28. Allan Miller: But that isn’t the case. Sometimes the fittest organisms get hit by an asteroid, eaten anyway, crushed by a log… Fitness is not survival.

    You have no criteria by which to say the fittest didn’t survive, unless you change the definition of fitness.

    If you can claim that the fittest didn’t survive, because they were unlucky, then I can claim that the fittest ALWAYS didn’t survive, because they have always been unlucky. The fittest never survive. How can you refute that, if you can claim sometimes they don’t?

    Unless we have a definition for fitness other than the one that survives best, you are painted into a corner.

  29. You take the effort to try and explain things like ‘expected value’ to Phoodoo but he just can’t move forward, you get the same nonsense again and again.

  30. phoodoo: You have no criteria by which to say the fittest didn’t survive, unless you change the definition of fitness.

    Phoodoo, you keep ignoring that “fittest” ≠ “that which survives”. Here:

    Allan Miller: What is ‘fitter’ is that which tends to produce more offspring given a particular environmental circumstance. Clearly, not every environmental circumstance is the same, therefore that which succeeds is bound to vary.

    Why?

    If you can claim that the fittest didn’t survive, because they were unlucky, then I can claim that the fittest ALWAYS didn’t survive, because they have always been unlucky. The fittest never survive. How can you refute that, if you can claim sometimes they don’t?

    Because “the fittest” has nothing to do with “actual survival”, but rather with “expected survival” relative to a given environment. You keep ignoring this, thus your characterization is erroneous.

    Unless we have a definition for fitness other than the one that survives best, you are painted into a corner.

    Oh the irony…
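The expected-versus-actual distinction can be made concrete with a toy simulation (my example, with made-up survival probabilities): the type with higher expected reproductive success still loses some individual runs through bad luck, which is exactly why “fittest” cannot mean “that which actually survived”.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def offspring(p_survive, trials=10):
    """How many of 10 would-be offspring survive, each with probability p."""
    return sum(random.random() < p_survive for _ in range(trials))

fit, unfit = 0.6, 0.4  # hypothetical survival probabilities: A is fitter than B
upsets = sum(offspring(unfit) > offspring(fit) for _ in range(10_000))
print(upsets)  # B out-reproduces A in some runs, but only a minority of them
```

Fitness describes the distribution, not any single draw from it.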

  31. phoodoo,

    Hey phoodoo, I built a sim for you that has the property you think is impossible.
    In this environment, selecting for fecundity and simultaneously selecting for shortness yields beasties that have greater fecundity than selecting for fecundity alone:
    There’s a fitness surface, let’s call it “fecundity”. Our student is trying to optimize fecundity by adjusting two parameters, “height” and “muscle”.
    We assure him that the fitness surface is continuous, and let him know that height can take values between 0 and 100, and “muscle” can take any value (but there’s no point in looking outside of +/- 500).
    He builds a hill-climber that, each iteration, picks a spot close (within a 0.1 by 0.1 square) to its current location (in terms of height and muscle), and moves to that spot if and only if the new location yields a higher fecundity.
    He releases sixty hill-climbers into this environment (at randomly chosen [muscle, height] points) and lets them walk until they reach a local maximum, at which point they will of course stop moving.
    He reports back that he has achieved a high score of fecundity = 183 (at muscle = 1.6, height = 100) with a runner-up of fecundity = 144 (at muscle = -3.1, height = 100), and notices that all his climbers ended up at maximum height (100). Height seemed to help, so he runs the sim again, adding a simultaneous selection in favor of tallness (add half the height to the fecundity when deciding whether to move).
    He gets the same result as before.

    His lab partner is suspicious, and she re-runs the sim, except selecting AGAINST height. She subtracts 85% of the height from the fecundity, thereby penalizing taller beasties when deciding whether to move.
    All sixty of her beasties achieve a fecundity of 200 (at muscle = 0, height = 100). She tells him “you were getting stuck in local maxima.”
    In this environment, you are able to find a better solution if you simultaneously select against an attribute which is maximal at all fitness maxima.
    What we know and they didn’t is that
    fecundity = height * Exp(-(muscle ^ 2) / 16) * Cos(muscle * 4) + 100 – muscle ^ 2

    P.S. to the numerate members of the audience: I am making a point about fitness surfaces, not about GA’s, so hush, not in front of the children.
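The surface can be checked directly in a few lines (note the “+” before the 100, which is needed for the stated maximum of 200 to exist). The reported coordinates are rounded, so the values at muscle = 1.6 and muscle = -3.1 come out within a point of the scores in the story:

```python
import math

def fecundity(muscle, height):
    # fecundity = height * exp(-muscle^2 / 16) * cos(4 * muscle) + 100 - muscle^2
    return (height * math.exp(-muscle ** 2 / 16.0) * math.cos(4.0 * muscle)
            + 100.0 - muscle ** 2)

print(fecundity(0.0, 100.0))   # 200.0 exactly: the global maximum
print(fecundity(1.6, 100.0))   # ~182: near the local peak reported as 183
print(fecundity(-3.1, 100.0))  # ~144: the reported runner-up
```

Since exp(-muscle²/16) · cos(4·muscle) ≤ 1 with equality only at muscle = 0, and -muscle² ≤ 0, no point can beat (muscle = 0, height = 100), confirming that the other peaks are local maxima.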

  32. phoodoo: You have no criteria by which to say the fittest didn’t survive, unless you change the definition of fitness.

    You and Frankie should talk. I’d be interested in a joint OP. Do you think you could collaborate and produce something? You both have some special ideas relating to fitness and it’d be interesting to watch you either agree or disagree on those ideas.

  33. OMagain,

    You and Frankie should talk. I’d be interested in a joint OP. Do you think you could collaborate and produce something? You both have some special ideas relating to fitness and it’d be interesting to watch you either agree or disagree on those ideas.

    If they could somehow incorporate the cardinality of infinite sets into the OP it would be even more illuminating.

  34. Yes, I definitely would want to sticky a thread having combined authorship of Frankie, phoodoo, and mung. Provided they can come to agreement on what ID is and what the best arguments are against undirected evolution.

  35. Richardthughes: “Mung: Because the initial starting population was randomly generated.”

    How is this a false claim about genetic algorithms? Are you saying that GA’s never use a starting population of randomly generated candidate solutions?

  36. Mung,

    Really?

    Frankie asks: “Why are the final blades more efficient than the starting blades?”

    And you answer:

    Mung: “Because the initial starting population was randomly generated, unlike evolution.”

    So you have told him that “the final blades more efficient than the starting blades” because “the initial starting population was randomly generated, unlike evolution.”

    This is untrue, idiotic and shows your lack of knowledge. Moreover, it can be and has been falsified by the efficacy of GAs that start with a fixed population:

    http://www.antievolution.org/cgi-bin/ikonboard/ikonboard.cgi?act=SP;f=14;t=7297;p=192347

    “I started all dice with [1,1,1,1,1,1] as their chromosome.”

    Do you still think that “the final blades more efficient than the starting blades” because “the initial starting population was randomly generated, unlike evolution.”?

  37. petrushka:
    Yes, I definitely would want to sticky a thread having combined authorship of Frankie, phoodoo, and mung. Provided they can come to agreement on what ID is and what the best arguments are against undirected evolution.

    There aren’t any arguments for undirected evolution. So perhaps you should start with that.

  38. Patrick:
    OMagain,

    If they could somehow incorporate the cardinality of infinite sets into the OP it would be even more illuminating.

    Cardinality refers to a number whereas infinity does not.

  39. Frankie: There aren’t any arguments for undirected evolution. So perhaps you should start with that.

    Woo Hoo a positive case for ID! Oh, wait…

  40. Biological fitness is an after-the-fact assessment. In this GA fitness is measured against an artificial and arbitrary “fitness function”.

    In real life whatever is good enough survives and has the chance at reproduction.

    Natural selection is blind and mindless whereas GAs are anything but.
