You have thirteen characters: the digits 1-9 and the operators for Plus, Minus, Divide and Multiply. Arrange them in a string so that it has the highest possible value, or write a program to do this for you. If a string cannot be evaluated as a mathematical expression, it scores zero.
Full list: [1,2,3,4,5,6,7,8,9,+,-,*,/]
There are 13! possibilities. How many score more than 0 (are mathematically viable)? How many steps did your program take / how many candidates did it evaluate? How did you know when to stop?
A population of one. You just don’t get it. But hey, you can call me Mung-Dazz!
I could call you so many things, but you do a great job at portraying yourself
Mung,
The (1+1)EA may be very simple, but it is an often used (and examined) algorithm. Dembski and Marks used it, but called it something like “Optimization by Mutation With Elitism”.
I was pleasantly surprised how well it worked in this setting!
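For readers who haven't met it: the (1+1)EA keeps a single parent, produces one mutated child per generation, and replaces the parent only if the child is at least as fit. A minimal R sketch of that loop (illustrative only; mutate and fitness are placeholders for whatever problem-specific functions you plug in):

## (1+1) EA sketch: one parent, one child per generation, keep the better of the two (elitism)
one.plus.one.EA <- function(parent, fitness, mutate, generations = 1000){
  best <- fitness(parent)
  for(g in 1:generations){
    child <- mutate(parent)   # exactly one offspring per generation
    f <- fitness(child)
    if(f >= best){            # elitism: the parent is never replaced by something worse
      parent <- child
      best <- f
    }
  }
  list(genome = parent, fitness = best)
}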
Mung, how do you think a bigger population would affect results?
I would think it would probably depend on other factors, which is why we say GA’s are tuned. There is no such thing as a general purpose GA that solves all problems.
For example, if you start with an initial population of 1000, kill off 999 of them, and use only one as the parent for the next generation, I don’t see how the result would be much different from what was achieved.
But like I keep saying, evolution isn’t like that.
Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation, such that the probabilities all add up to 1, I bet you’ll have to change the fitness function.
Does evolution search for the right fitness function to allow the population to solve a problem?
Oh, and if there’s just one target, well, dazz sez that’s not allowed.
*FACEPALM*
You haven’t understood a single thing about GA’s, LMAO
Mung,
No. What you’re describing is roulette wheel selection which is completely separate from the fitness function.
I don’t think he has even figured out what represents the fitness in this problem.
He still thinks the GA has (a) target(s) so go figure! What an embarrassment
I think approaching the problem from “why GAs work” rather than “why they shouldn’t” might help.
I have both Python and R installed as well. I’d love to see the other code, especially what’s been coded in R.
Why don’t you teach us, oh wise one? Didn’t you claim that Weasel is not a GA? Didn’t you claim it could not be a GA because the target was known in advance?
Did you publish your results from the current exercise showing all the possible solutions?
Or when they work and when they do not work.
You all are a barrel of laughs, that’s for sure.
Let’s go back to some fundamentals, because it will be interesting to see who denies what.
That’s right. Search.
That must really suck for people who claim GA’s are not search algorithms.
Mung,
I don’t really care what you call them. They work, they can create information, and they are based on evolutionary processes. Word lawyering won’t hide that.
According to Melanie Mitchell, roulette-wheel sampling is “a simple method of implementing fitness-proportionate selection.” So you’re conflating a type of selection with a common way of implementing it.
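To make the distinction concrete, here is a small R sketch of what roulette-wheel sampling looks like when you implement it (my own illustration, not code from anyone in this thread). Note that it consumes fitness values produced by the fitness function but never touches that function itself:

## Roulette-wheel (fitness-proportionate) sampling: pick parent indices with
## probability proportional to fitness. Fitness values must be non-negative.
roulette.select <- function(fitnesses, n.parents){
  probs <- fitnesses / sum(fitnesses)   # normalization happens here, not in the fitness function
  sample(seq_along(fitnesses), size = n.parents, replace = TRUE, prob = probs)
}
## e.g. roulette.select(c(1, 4, 5), 2) picks index 3 about half the time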
Oh Mung. It was going so well. “I’ve never programmed a GA but here’s what you’re getting wrong” just looks silly.
And that statement is so broad as to be practically useless. Why don’t you or one of your buddies start a thread on GA’s?
Start with writing a general purpose GA and then have different people pose problems for it to solve and see how well it does at solving them.
GA’s work, sometimes. Sometimes they don’t work. I don’t work with GA’s, but I know enough about them to know both that they are designed around the problem and that various parts of the GA have to be “fine-tuned” to work together.
A simple step-by-step exercise in writing a GA would make that blazingly obvious. You have enough fans of GA’s here to bring that about don’t you?
I’d really like to see how DiEb wrote a GA to solve your problem without coding into that GA knowledge of the problem and hints on how to solve it. Wouldn’t you? How do you quantify that, in terms of information?
Where did I say I’ve never programmed a GA?
Mung,
Sorry, I inferred that from your comments and participation on this thread and others pertaining to GAs.
Richard, given your background, have you ever programmed in R?
Please don’t call people moron.
Unless you’re in the Noyau thread.
Or unless you wish your comment to be Guano’d.
Posting your results without name-calling would have been plenty sufficient in this case.
The R Project for Statistical Computing
Exploring Everyday Things with R and Ruby
Thanks hotshoe_,
And happy new year.
It’s pretty amazing, isn’t it, that despite our differences you and I still manage not to call each other names? Or am I imagining things?
😉
Must be a testosterone thing.
Updated challenge: http://theskepticalzone.com/wp/math-genome-fun-2-upb-and-beyond/
Not much call for that working as a cashier.
LoL. Do they know you are overqualified?
Perhaps, like Phoodoo, they think I’m at the upper boundaries of my abilities.
Mung,
There’s no need to keep all the permutations in memory simultaneously. My implementation has a very small memory footprint because it evaluates each permutation immediately after it is generated and discards it if it doesn’t beat the current winning genome.
See the code for permute_and_evaluate() in my Python implementation.
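The pattern keiths is describing is, roughly, to walk the permutations one at a time and remember only the best one seen so far. A rough R sketch of that idea (my own illustration, not keiths’ Python; running it over all 13! candidates would be very slow in R, it is shown only for the shape of the loop):

## Exhaustive search with constant memory: evaluate each permutation as it is
## generated and keep only the current champion.
best.value  <- -Inf
best.genome <- NULL
walk.permutations <- function(prefix, remaining, score){
  if(length(remaining) == 0){
    v <- score(prefix)
    if(v > best.value){
      best.value  <<- v        # only the best-so-far is retained
      best.genome <<- prefix
    }
    return(invisible(NULL))
  }
  for(i in seq_along(remaining)){
    walk.permutations(c(prefix, remaining[i]), remaining[-i], score)
  }
}
## Usage sketch: score a candidate by evaluating it, treating unparseable strings as 0.
## walk.permutations(character(0), c(as.character(1:9), "+", "-", "*", "/"),
##   function(p) tryCatch(eval(parse(text = paste(p, collapse = ""))), error = function(e) 0))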
Mung,
Fitnesses don’t need to be probabilities, Mung.
You don’t understand my Weasel code, do you?
From ten trials with a GA based on my Weasel code:
-three trials got stuck at local maxima
-seven trials found the optimum solution (964*875+3-1/2), with the following performance:
104 generations, 5.27 sec
121 generations, 6.13 sec
294 generations, 14.92 sec
94 generations, 4.76 sec
145 generations, 7.35 sec
207 generations, 10.5 sec
141 generations, 7.15 sec
Population size is 10 with 4 survivors per generation.
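For readers following along, those numbers are just the knobs of the generational loop. A rough R sketch of that kind of loop (illustrative only, not keiths’ actual Weasel-derived code) shows where the population size and survivor count enter:

## Truncation selection: keep the best n.survivors, then refill the population
## to pop.size with mutated copies of randomly chosen survivors.
## 'fitness' and 'mutate' are placeholders for problem-specific functions.
next.generation <- function(population, fitness, mutate, pop.size = 10, n.survivors = 4){
  scores    <- sapply(population, fitness)
  survivors <- population[order(scores, decreasing = TRUE)][1:n.survivors]
  offspring <- lapply(1:(pop.size - n.survivors),
                      function(i) mutate(survivors[[sample(n.survivors, 1)]]))
  c(survivors, offspring)   # the next generation; the fitness function itself is unchanged
}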
DiEb,
I used two forms of mutation, choosing each with 50% probability: 1) a single transposition, and 2) moving a single character from one location to another.
With those two types of mutation operating, it seems harder to get stuck at a local maximum.
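In case the two operators are hard to picture, here is a hedged R sketch of what they might look like on a genome stored as a character vector (my own illustration, not keiths’ Python code):

## Two mutation operators, each chosen with 50% probability.
mutate.genome <- function(g){
  if(runif(1) < 0.5){
    ## 1) single transposition: swap the characters at two random positions
    i <- sample(length(g), 2)
    g[i] <- g[rev(i)]
  } else {
    ## 2) relocation: remove one character and reinsert it at another position
    from <- sample(length(g), 1)
    ch   <- g[from]
    g    <- g[-from]
    to   <- sample(length(g) + 1, 1)
    g    <- append(g, ch, after = to - 1)
  }
  g
}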
Dembski and Marks would argue that any optimization is a search…
## Parameters to set:
Number.of.Digits <- 9
#9 for decimal, 15 for hexadecimal; can be as large as 100, though beyond 35 the digits no longer have single-character string representations
Stagnation.Limit <- 1000
#the GA will stop after Stagnation.Limit generations without an improvement
Max.Length.of.Permutation <- 4
#a child is created by rotating 2 to Max.Length.of.Permutation letters in the parent's string.
Sim.Zahl <- 1000
#number of simulations executed
## internal parameters
Op.Names <- c("+","-","/","*")
Alph.2 <- character(104)
Alph.2[1:35] <- unlist(strsplit("123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ",""))
Alph.2[101:104] <- Op.Names
#Alph.2 is the biggest possible alphabet
Alphabet <- c(1:Number.of.Digits,101:104)
#Alphabet is the actually used alphabet
# the numbers are encoded as 1,2,3…,Number.of.Digits
# 101 : "+"
# 102 : "-"
# 103 : "/"
# 104 : "*"
Pos.Value <- (Number.of.Digits+1)^(0:(Number.of.Digits-4))
#The positional value of a digit according to the base
Is.Viable <- function(v){
##it is much faster to check a string for viability this way than to evaluate the string and catch the error.
if(v[1]==103) return(FALSE)
if(v[1]==104) return(FALSE)
if(v[Number.of.Digits+4] > 100) return(FALSE) # last element cannot be an operator
if(v[which(v==103)-1] > 100) return(FALSE) #'/' cannot be preceded by an operator
if(v[which(v==104)-1] > 100) return(FALSE) #ditto for '*'
return(TRUE)
}
As.True.String <- function(v){
##turns the vector v into a string (literally 🙂 )
return(paste(Alph.2[v],sep="",collapse=""))
}
Convert.To.Decimal.String <- function(v){
##turns the vector v into a string where numbers are represented in the decimal system
if(Number.of.Digits == 9) return(As.True.String(v)) #no conversion necessary
H <- NULL
k<-1
while(k <= (Number.of.Digits+4)){
if(v[k] > 100) {H <- c(H,Op.Names[v[k]-100])
}
else{
T <- v[k]
k <- k+1
while(k <= (Number.of.Digits+4) && v[k] <=100){
T <- (Number.of.Digits+1)*T+v[k]
k <- k+1
}
k <- k-1
H <- c(H,T)
}
k <- k+1
}
return(paste(H,sep="",collapse=""))
}
Evaluate <- function(v){
##returns fitness value – fastest for decimal string
return(eval(parse(text=Convert.To.Decimal.String(v))))
}
## part 1/1
DiEb,
EA <- function(Stagnation.Limit=1000,Max.Length.of.Permutation=4){
##the evolutionary algorithm, using one parent, one child
#Initialization:
Parent <- sample(Alphabet)
while(!Is.Viable(Parent)){
Parent <- sample(Alphabet)
}
Counter <- 0 #number of generations
Max <- 0
Stagnation.Counter <- 0 #number of unchanged generations
while(Stagnation.Counter < Stagnation.Limit){
#getting number of elements for this permutation:
Anz <- sample(2:Max.Length.of.Permutation,1)
#creating a child by permuting Anz elements:
Child <- Parent
T <- sample(1:(Number.of.Digits+4),Anz)
Child[T] <- Child[T[c(Anz,1:(Anz-1))]]
while(!Is.Viable(Child)){
T <- sample(1:(Number.of.Digits+4),Anz)
Child <- Parent
Child[T] <- Child[T[c(Anz,1:(Anz-1))]]
}
#counting generations:
Counter <- Counter +1
#counting unchanged generations:
Stagnation.Counter <- Stagnation.Counter +1
Ev <- Evaluate(Child)
if(Ev > Max){
Parent <- Child
Max <- Ev
#generation has changed!
Stagnation.Counter <- 0
}
}
return(c(Max,Counter,As.True.String(Parent)))
}
#### performing Sim.Zahl trials
system.time(Result <- sapply(1:Sim.Zahl,function(x) EA(Stagnation.Limit,Max.Length.of.Permutation)))
And evolution sometimes works, but 99% of the species have gone extinct. Pointless Mung is pointless.
### upps:
Is.Viable = function(v){
##it is much faster to check a string for viability this way than to evaluate the string and catch the error.
if(v[1]==103) return(FALSE)
if(v[1]==104) return(FALSE)
if(v[Number.of.Digits+4] > 100) return(FALSE) # last element cannot be an operator
if(v[which(v==103)-1] > 100) return(FALSE) #'/' cannot be preceded by an operator
if(v[which(v==104)-1] > 100) return(FALSE) #ditto for '*'
return(TRUE)
}
##some of the code was mistaken for html…
I put the code in the sandbox. It can be copied & pasted directly into the R console and should execute (I hope that the comments aren’t too long).
The array results includes the fitness values, the number of generations and the winning string.
[Box plot and histogram of the results]
Mung,
The terms are often used interchangeably.
Regardless, you’re ignoring the point. You wrote this:
You’re wrong and in such a way as to demonstrate your lack of knowledge of the domain.
hotshoe_ wrote:
Those comments have been moved to Guano.
And I already posted one source showing how you were wrong but since you still don’t get it I’ll post another.
“We will be concerned here with a type of algorithm known as a black-box optimization (or search) algorithm. Such algorithms include evolutionary algorithms, but are not limited to them. The problems which black-box optimization algorithms solve have just two defining attributes: a phase space, and a fitness function defined over that phase space. In the context of these algorithms, phase spaces are usually called search spaces. Also the term fitness function is usually reserved for evolutionary algorithms, the more general term being objective function or cost function (maximizing an objective function is equivalent to minimizing a cost function).”
– http://www.talkorigins.org/design/faqs/nfl/#evoalgs
Two birds. One stone. Awesome.
Where does this number of 99% come from?
Take the number of known species that exist today along with some reasonable estimate of species that exist that we do not yet know about. Multiply by 100.
How many extinct species do we know of? I doubt it even comes close.
Patrick is right, Mung.
Your statement is not only wrong, it also reveals your misunderstanding of GAs:
There’s no need to change the fitness function merely because the population size changes. Perhaps if you tried writing a GA, this would become apparent to you.
“Naturally enough, the Simple Genetic Algorithm (SGA) is realized by specifying the search space and identifying the heuristic function…nothing else is required to define the SGA”
“The problems which black-box optimization algorithms solve have just two defining attributes: a phase space, and a fitness function defined over that phase space.”
Fine, but that doesn’t address your error.
Why not try writing a GA? Don’t be intimidated. We’ll help you out when you run into trouble.
Mung,
That has nothing to do with your statement that demonstrated your lack of knowledge in this domain:
Changing the selection algorithm does not require changing the fitness function. You are wrong. You are so wrong, about such a fundamental concept, that it is clear you lack knowledge of the implementation of evolutionary algorithms. That’s fixable, but not if you refuse to recognize it.
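The point is easy to see in code: in a typical GA skeleton the fitness function is a single argument passed in once, while the population size and the selection scheme are separate, independent knobs. A minimal illustrative sketch (not anyone’s posted code; the roulette.select sketch shown earlier could be passed in as the selection scheme):

## The fitness function is fixed; pop.size and the selection scheme are separate parameters.
run.ga <- function(fitness, random.genome, mutate, select,
                   pop.size = 100, generations = 50){
  population <- lapply(1:pop.size, function(i) random.genome())
  for(gen in 1:generations){
    scores     <- sapply(population, fitness)        # same fitness function every generation
    parents    <- select(scores, pop.size)           # selection scheme is swappable
    population <- lapply(parents, function(i) mutate(population[[i]]))
  }
  population[[which.max(sapply(population, fitness))]]
}
## Doubling pop.size, or swapping roulette-wheel selection for tournament selection,
## requires no change whatsoever to 'fitness'.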
I know, let’s pretend like I didn’t point out your error about roulette-wheel selection.
Here’s what I actually wrote:
I didn’t say anything about a selection algorithm.
Here’s what I wrote:
So yeah. You can change the size of the population and not change the fitness function. You people are incredible [not really]. Back in the Weasel thread I’m sure the subject came up of changing the parameters of keiths’ Weasel program, and one of them is the population size.
#define POPULATION_SIZE 200 // total population size
But somehow I allegedly don’t know that you can change the population size without changing the fitness function. Wow. Just freaking wow.
Richard? No fans of GA’s?