# Math Genome Fun

You have thirteen characters: the digits 1-9 and the operators for plus, minus, divide and multiply. Arrange them in a string with the highest possible value, or write a program to do this for you. If a string cannot be evaluated as a mathematical expression, it scores zero.

Full list [1,2,3,4,5,6,7,8,9,+,-,*,/]

There are 13! possibilities. How many score more than 0 (are mathematically viable)? How many steps did your program take / how many candidates did it evaluate? How did you know when to stop?
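For anyone who wants to attack it programmatically, the scoring rule might be sketched like this in Python (my illustration, not part of the original puzzle): evaluate the string, and treat anything that fails to parse as a score of zero.

```python
# Minimal scorer sketch (illustrative): a candidate uses each of the thirteen
# characters 1-9, +, -, *, / exactly once and scores its numeric value;
# strings that fail to evaluate score zero.
# Note: eval() is fine for a closed 13-character alphabet like this,
# but it is unsafe on untrusted input in general.
def score(candidate: str) -> float:
    try:
        return float(eval(candidate))
    except (SyntaxError, ZeroDivisionError):
        return 0.0

print(score("964*875+3-1/2"))   # 843502.5 -- the optimum reported below
print(score("12*/3456789+-"))   # 0.0 -- not a viable expression
```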

## 207 thoughts on “Math Genome Fun”

1. dazz: DiEb’s GA reached a maximum in 2.3 minutes with a population of 1

A population of one. You just don’t get it. But hey, you can call me Mung-Dazz!

2. Mung: A population of one. You just don’t get it. But hey, you can call me Mung-Dazz!

I could call you so many things, but you do a great job at portraying yourself

One parent, one child…

3. Mung,

Mung: A population of one. You just don’t get it. But hey, you can call me Mung-Dazz!

The (1+1)EA may be very simple, but it is an often used (and examined) algorithm. Dembski and Marks used it, but called it something like “Optimization by Mutation With Elitism”.

I was pleasantly surprised by how well it worked in this setting!
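For readers unfamiliar with the (1+1)-EA: it keeps a single parent, produces a single mutated child each generation, and keeps whichever is fitter (elitism). A generic sketch in Python — my illustration with made-up names, not DiEb's actual code:

```python
import random

def one_plus_one_ea(initial, fitness, mutate, stagnation_limit=1000):
    """(1+1)-EA sketch: one parent, one child, keep the better (elitism)."""
    parent = initial
    best = fitness(parent)
    stagnant = 0
    while stagnant < stagnation_limit:
        child = mutate(parent)
        f = fitness(child)
        if f > best:
            parent, best = child, f
            stagnant = 0
        else:
            stagnant += 1
    return parent, best

# Toy landscape: maximize the number of ones in a bit string,
# mutating by flipping one random bit.
def flip_one(bits):
    i = random.randrange(len(bits))
    return bits[:i] + (1 - bits[i],) + bits[i+1:]

genome, value = one_plus_one_ea(tuple([0] * 20), sum, flip_one)
```

Despite the population of one, elitism guarantees the current best is never lost, which is part of why it works surprisingly well on smooth fitness landscapes.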

4. Richardthughes: Mung, how do you think a bigger population would affect results?

I would think it would probably depend on other factors, which is why we say GA’s are tuned. There is no such thing as a general purpose GA that solves all problems.

For example if you start with an initial population of 1000 and kill off 999 of them and use only one as the parent for the next generation I don’t see how the result would be much different from what was achieved.

But like I keep saying, evolution isn’t like that.

Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation such that they all add up to 1 I bet you’ll have to change the fitness function.

Does evolution search for the right fitness function to allow the population to solve a problem?

Oh, and if there’s just one target, well, dazz sez that’s not allowed.

5. Mung: I would think it would probably depend on other factors, which is why we say GA’s are tuned. There is no such thing as a general purpose GA that solves all problems.

For example if you start with an initial population of 1000 and kill off 999 of them and use only one as the parent for the next generation I don’t see how the result would be much different from what was achieved.

But like I keep saying, evolution isn’t like that.

Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation such that they all add up to 1 I bet you’ll have to change the fitness function.

Does evolution search for the right fitness function to allow the population to solve a problem?

Oh, and if there’s just one target, well, dazz sez that’s not allowed.

*FACEPALM*
You haven’t understood a single thing about GA’s, LMAO

6. Mung,

Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation such that they all add up to 1 I bet you’ll have to change the fitness function.

No. What you’re describing is roulette wheel selection which is completely separate from the fitness function.
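To make the separation concrete, here is a hypothetical sketch (names are mine): the fitness function returns arbitrary nonnegative numbers, and it is the selection step that normalizes them into probabilities summing to 1.

```python
import random

def roulette_select(population, fitness, k):
    """Fitness-proportionate (roulette-wheel) selection sketch.

    The fitness function can return any nonnegative numbers; the
    normalization into probabilities happens here, inside selection.
    """
    weights = [fitness(g) for g in population]
    total = sum(weights)
    probs = [w / total for w in weights]  # these sum to 1; fitnesses need not
    return random.choices(population, weights=probs, k=k)

pop = ["11", "101", "1111"]
parents = roulette_select(pop, lambda g: g.count("1"), k=2)
```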

7. Patrick:
Mung,

No. What you’re describing is roulette wheel selection which is completely separate from the fitness function.

I don’t think he has even figured out what represents the fitness in this problem.
He still thinks the GA has (a) target(s) so go figure! What an embarrassment

8. I think approaching the problem from “why GAs work” rather than “why they shouldn’t” might help.

9. I have both Python and R installed as well. I’d love to see the other code, especially what’s been coded in R.

10. dazz: You haven’t understood a single thing about GA’s, LMAO

Why don’t you teach us, oh wise one? Didn’t you claim that Weasel is not a GA? Didn’t you claim it could not be a GA because the target was known in advance?

Did you publish your results from the current exercise showing all the possible solutions?

11. Richardthughes: I think approaching the problem from “why GAs work” rather than “why they shouldn’t” might help.

Or when they work and when they do not work.

12. You all are a barrel of laughs, that’s for sure.

Let’s go back to some fundamentals, because it will be interesting to see who denies what.

Naturally enough, the Simple Genetic Algorithm (SGA) is realized by specifying the search space and identifying the heuristic function…nothing else is required to define the SGA; it is merely a special case of random heuristic search.

– Vose. The Simple Genetic Algorithm. p. 21.

That’s right. Search.

13. This chapter [Chapter 3] is the foundation of the entire development. A broad class of algorithms, Random Heuristic Search (RHS), is defined and some of its most basic properties are touched upon. The Simple Genetic Algorithm will emerge from this general context (in the next chapter) as a special case.

– Vose. The Simple Genetic Algorithm. p. 5

That must really suck for people who claim GA’s are not search algorithms.

14. Mung,

I don’t really care what you call them. They work, can create information and are based on evolutionary processes. Word lawyering won’t hide that.

15. Patrick: What you’re describing is roulette wheel selection which is completely separate from the fitness function.

According to Melanie Mitchell, roulette-wheel sampling is “a simple method of implementing fitness-proportionate selection.” So you’re conflating a type of selection with a common way of implementing it.

16. Oh Mung. It was going so well. “I’ve never programmed a GA but here’s what you’re getting wrong” just looks silly.

17. Richardthughes: I don’t really care what you call them. They work, can create information and are based on evolutionary processes. Word lawyering won’t hide that.

And that statement is so broad as to be practically useless. Why don’t you or one of your buddies start a thread on GA’s?

Start with writing a general purpose GA and then have different people pose problems for it to solve and see how well it does at solving them.

GA’s work, sometimes. Sometimes they don’t work. I don’t work with GA’s, but I know enough about them to know that they are both designed around the problem and that various parts of the GA have to be “fine tuned” to work together.

A simple step-by-step exercise in writing a GA would make that blazingly obvious. You have enough fans of GA’s here to bring that about don’t you?

I’d really like to see how DiEb wrote a GA to solve your problem without coding into that GA knowledge of the problem and hints on how to solve it. Wouldn’t you? How do you quantify that, in terms of information?

18. Richardthughes: Oh Mung. It was going so well. “I ‘ve never programmed a GA but here’s what you’re getting wrong” just looks silly.

Where did I say I’ve never programmed a GA?

19. Mung,

Sorry, I inferred that from your comments and participation on this thread and others pertaining to GAs.

20. dazz: Yes I did moron

Please don’t call people moron.

Unless you’re in the Noyau thread.

Or unless you wish your comment to be Guano’d.

Posting your results without name-calling would have been plenty sufficient in this case.

21. Thanks hotshoe_,

And happy new year.

It’s pretty amazing, isn’t it, that despite our differences you and I still manage not to call each other names? Or am I imagining things?

😉

Must be a testosterone thing.

22. Mung:
Richard, given your background, have you ever programmed in R?

Not much call for that working as a cashier.

23. Mung: LoL. Do they know you are overqualified?

Perhaps, like Phoodoo they think I’m at the upper boundaries of my abilities.

24. Mung,

Well, my program eventually stopped due to running out of memory after more than 1.685B permutations. Guess maybe the built-in permutation function isn’t the way to go.

There’s no need to keep all the permutations in memory simultaneously. My implementation has a very small memory footprint because it evaluates each permutation immediately after it is generated and discards it if it doesn’t beat the current winning genome.

See the code for permute_and_evaluate() in my Python implementation.
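A sketch of that strategy in Python (illustrative only — not the actual permute_and_evaluate() referenced above): itertools.permutations is a lazy generator, so each candidate can be scored and immediately discarded.

```python
from itertools import permutations

def best_genome(chars="123456789+-*/"):
    """Exhaustive search with a flat memory footprint: permutations are
    generated lazily, evaluated once, and discarded."""
    best_value, best_string = float("-inf"), None
    for perm in permutations(chars):
        s = "".join(perm)
        try:
            value = eval(s)
        except (SyntaxError, ZeroDivisionError):
            continue  # non-viable strings are simply skipped
        if value > best_value:
            best_value, best_string = value, s
    return best_string, best_value
```

Walking all 13! ≈ 6.2 billion orderings this way is slow but never runs out of memory; on a shortened alphabet it finishes instantly.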

25. Mung,

Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation such that they all add up to 1 I bet you’ll have to change the fitness function.

Fitnesses don’t need to be probabilities, Mung.

You don’t understand my Weasel code, do you?

26. From ten trials with a GA based on my Weasel code:

-three trials got stuck at local maxima
-seven trials found the optimum solution (964*875+3-1/2), with the following performance:

104 generations, 5.27 sec
121 generations, 6.13 sec
294 generations, 14.92 sec
94 generations, 4.76 sec
145 generations, 7.35 sec
207 generations, 10.5 sec
141 generations, 7.15 sec

Population size is 10 with 4 survivors per generation.

27. DiEb,

1) using just a single transposition doesn’t work, as you get easily stuck with 9*87654 as the product part

I used two forms of mutation, choosing each with 50% probability: 1) a single transposition, and 2) moving a single character from one location to another.

With those two types of mutation operating, it seems harder to get stuck at a local maximum.
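Those two operators might look like this in Python (my naming, a sketch rather than keiths’s actual code):

```python
import random

def transpose(genome):
    """Swap the characters at two random positions."""
    g = list(genome)
    i, j = random.sample(range(len(g)), 2)
    g[i], g[j] = g[j], g[i]
    return "".join(g)

def relocate(genome):
    """Remove one character and reinsert it somewhere else."""
    g = list(genome)
    c = g.pop(random.randrange(len(g)))
    g.insert(random.randrange(len(g) + 1), c)
    return "".join(g)

def mutate(genome):
    # each operator chosen with 50% probability
    return transpose(genome) if random.random() < 0.5 else relocate(genome)

child = mutate("964*875+3-1/2")
```

A single relocation can move an operator into the middle of a long run of digits in one step, which a single transposition cannot do without displacing another character — plausibly why the combination escapes local maxima more easily.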

28. ## Parameters to set:

Number.of.Digits <- 9
#9 for decimal, 15 for hexadecimal; can be as large as 100, though beyond 35 a digit no longer has a single-character representation

Stagnation.Limit <- 1000
#the GA will stop after Stagnation.Limit runs without an improvement

Max.Length.of.Permutation <- 4
#a child is created by rotating 2 to Max.Length.of.Permutation letters in the parent's string.

Sim.Zahl <- 1000
#number of simulations executed

## internal parameters

Op.Names <- c("+","-","/","*")

Alph.2 <- character(104)
Alph.2[1:35] <- unlist(strsplit("123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ",""))
Alph.2[101:104] <- Op.Names
#Alph.2 is the biggest possible alphabet

Alphabet <- c(1:Number.of.Digits,101:104)
#Alphabet is the actually used alphabet

# the numbers are encoded as 1,2,3…,Number.of.Digits
# 101 : "+"
# 102 : "-"
# 103 : "/"
# 104 : "*"

Pos.Value <- (Number.of.Digits+1)^(0:(Number.of.Digits-4))
#The positional value of a digit according to the base

Is.Viable <- function(v){
##it is much faster to check a string for viability this way than to evaluate the string and catch the error.
if(v[1]==103) return(FALSE) #string cannot start with '/'
if(v[1]==104) return(FALSE) #or with '*'
if(v[Number.of.Digits+4] > 100) return(FALSE) #last element cannot be an operator
if(v[which(v==103)-1] > 100) return(FALSE) #'/' cannot be preceded by an operator
if(v[which(v==104)-1] > 100) return(FALSE) #ditto for '*'
return(TRUE)

}

As.True.String <- function(v){
##turns the vector v into a string (literally 🙂 )
return(paste(Alph.2[v],sep="",collapse=""))
}

Convert.To.Decimal.String <- function(v){
##turns the vector v into a string where numbers are represented in the decimal system
if(Number.of.Digits == 9) return(As.True.String(v)) #no conversion necessary
H <- NULL
k <- 1
while(k <= (Number.of.Digits+4)){
if(v[k] > 100){
H <- c(H,Op.Names[v[k]-100])
}
else{
T <- v[k]
k <- k+1
while(k <= (Number.of.Digits+4) && v[k] <= 100){
T <- (Number.of.Digits+1)*T+v[k]
k <- k+1
}
k <- k-1
H <- c(H,T)
}
k <- k+1
}
return(paste(H,sep="",collapse=""))
}

Evaluate <- function(v){
##returns fitness value – fastest for decimal string
return(eval(parse(text=Convert.To.Decimal.String(v))))
}

## part 1/1

29. EA <- function(Stagnation.Limit=1000,Max.Length.of.Permutation=4){
##the evolutionary algorithm, using one parent, one child

#Initialization:
Parent <- sample(Alphabet)
while(!Is.Viable(Parent)){
Parent <- sample(Alphabet)
}

Counter <- 0 #number of generations
Max <- 0

Stagnation.Counter <- 0 #number of unchanged generations

while(Stagnation.Counter < Stagnation.Limit){

#getting number of elements for this permutation:
Anz <- sample(2:Max.Length.of.Permutation,1)

#creating a child by permuting Anz elements:
Child <- Parent
T <- sample(1:(Number.of.Digits+4),Anz)

Child[T] <- Child[T[c(Anz,1:(Anz-1))]]
while(!Is.Viable(Child)){
T <- sample(1:(Number.of.Digits+4),Anz)
Child <- Parent
Child[T] <- Child[T[c(Anz,1:(Anz-1))]]
}
#counting generations:
Counter <- Counter +1
#counting unchanged generations:
Stagnation.Counter <- Stagnation.Counter +1

Ev <- Evaluate(Child)
if(Ev > Max){
Parent <- Child
Max <- Ev
#generation has changed!
Stagnation.Counter <- 0

}
}

return(c(Max,Counter,As.True.String(Parent)))
}

#### performing Sim.Zahl trials

system.time(Result <- sapply(1:Sim.Zahl,function(x) EA(Stagnation.Limit,Max.Length.of.Permutation)))

30. Mung: GA’s work, sometimes. Sometimes they don’t work

And evolution sometimes works, but 99% of the species have gone extinct. Pointless Mung is pointless.

### oops:

Is.Viable = function(v){
##it is much faster to check a string for viability this way than to evaluate the string and catch the error.
if(v[1]==103) return(FALSE) #string cannot start with '/'
if(v[1]==104) return(FALSE) #or with '*'
if(v[Number.of.Digits+4] > 100) return(FALSE) # last element cannot be an operator
if(v[which(v==103)-1] > 100) return(FALSE) #'/' cannot be preceded by an operator
if(v[which(v==104)-1] > 100) return(FALSE) #ditto for '*'
return(TRUE)

}

##some of the code was mistaken for html…

32. I put the code in the sandbox. It can be copied & pasted directly into the R console and should execute (I hope that the comments aren’t too long).

The array `Result` includes the fitness values, the number of generations and the winning string.

33. Mung,

What you’re describing is roulette wheel selection which is completely separate from the fitness function.

According to Melanie Mitchell, roulette-wheel sampling is “a simple method of implementing fitness-proportionate selection.” So you’re conflating a type of selection with a common way of implementing it.

The terms are often used interchangeably.

Regardless, you’re ignoring the point. You wrote this:

Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation such that they all add up to 1 I bet you’ll have to change the fitness function.

You’re wrong and in such a way as to demonstrate your lack of knowledge of the domain.

34. hotshoe_ wrote:

dazz: Yes I did moron

Please don’t call people moron.

Unless you’re in the Noyau thread.

Or unless you wish your comment to be Guano’d.

Posting your results without name-calling would have been plenty sufficient in this case.

Those comments have been moved to Guano.

35. Patrick: You’re wrong and in such a way as to demonstrate your lack of knowledge of the domain.

And I already posted one source showing how you were wrong but since you still don’t get it I’ll post another.

“We will be concerned here with a type of algorithm known as a black-box optimization (or search) algorithm. Such algorithms include evolutionary algorithms, but are not limited to them. The problems which black-box optimization algorithms solve have just two defining attributes: a phase space, and a fitness function defined over that phase space. In the context of these algorithms, phase spaces are usually called search spaces. Also the term fitness function is usually reserved for evolutionary algorithms, the more general term being objective function or cost function (maximizing an objective function is equivalent to minimizing a cost function).”

Two birds. One stone. Awesome.

36. dazz: And evolution sometimes works, but 99% of the species have gone extinct.

Where does this number of 99% come from?

Take the number of known species that exist today along with some reasonable estimate of species that exist that we do not yet know about. Multiply by 100.

How many extinct species do we know of? I doubt it even comes close.

37. Patrick is right, Mung.

Your statement is not only wrong, it also reveals your misunderstanding of GAs:

Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation such that they all add up to 1 I bet you’ll have to change the fitness function.

There’s no need to change the fitness function merely because the population size changes. Perhaps if you tried writing a GA, this would become apparent to you.

38. “Naturally enough, the Simple Genetic Algorithm (SGA) is realized by specifying the search space and identifying the heuristic function…nothing else is required to define the SGA”

“The problems which black-box optimization algorithms solve have just two defining attributes: a phase space, and a fitness function defined over that phase space.”

39. Fine, but that doesn’t address your error.

Why not try writing a GA? Don’t be intimidated. We’ll help you out when you run into trouble.

40. Mung,

You’re wrong and in such a way as to demonstrate your lack of knowledge of the domain.

And I already posted one source showing how you were wrong but since you still don’t get it I’ll post another.

“We will be concerned here with a type of algorithm known as a black-box optimization (or search) algorithm. Such algorithms include evolutionary algorithms, but are not limited to them. The problems which black-box optimization algorithms solve have just two defining attributes: a phase space, and a fitness function defined over that phase space. In the context of these algorithms, phase spaces are usually called search spaces. Also the term fitness function is usually reserved for evolutionary algorithms, the more general term being objective function or cost function (maximizing an objective function is equivalent to minimizing a cost function).”

That has nothing to do with your statement that demonstrated your lack of knowledge in this domain:

Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation such that they all add up to 1 I bet you’ll have to change the fitness function.

Changing the selection algorithm does not require changing the fitness function. You are wrong. You are so wrong, about such a fundamental concept, that it is clear you lack knowledge of the implementation of evolutionary algorithms. That’s fixable, but not if you refuse to recognize it.
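The point is easy to demonstrate: in this hypothetical mini-example (names mine), the same fitness function drives two entirely different selection schemes without being touched.

```python
import random

def fitness(genome):
    # raw fitness: any nonnegative number, unchanged across selection schemes
    return genome.count("1")

def truncation_select(pop, k):
    """Keep the k fittest individuals."""
    return sorted(pop, key=fitness, reverse=True)[:k]

def roulette_select(pop, k):
    """Sample k parents with probability proportional to fitness."""
    return random.choices(pop, weights=[fitness(g) for g in pop], k=k)

pop = ["1100", "1110", "0001", "1111"]
survivors = truncation_select(pop, 2)  # deterministic: the two fittest
parents = roulette_select(pop, 2)      # stochastic, same fitness function
```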

41. Patrick: Changing the selection algorithm does not require changing the fitness function.

I know, let’s pretend like I didn’t point out your error about roulette-wheel selection.

Changing the selection algorithm does not require changing the fitness function.

Here’s what I actually wrote:

Now if you start with an initial population of 1000 and need to give each genotype some probability of being represented in the next generation such that they all add up to 1 I bet you’ll have to change the fitness function.

I didn’t say anything about a selection algorithm.

42. keiths: There’s no need to change the fitness function merely because the population size changes.

Here’s what I wrote:

For example if you start with an initial population of 1000 and kill off 999 of them and use only one as the parent for the next generation I don’t see how the result would be much different from what was achieved.

So yeah. You can change the size of the population and not change the fitness function. You people are incredible [not really]. Back in the Weasel thread I’m sure the subject came up of changing the parameters of the keiths Weasel program, and one of them is the population size.

#define POPULATION_SIZE 200 // total population size

But somehow I allegedly don’t know that you can change the population size without changing the fitness function. Wow. Just freaking wow.

43. Mung: Why don’t you or one of your buddies start a thread on GA’s?

Start with writing a general purpose GA and then have different people pose problems for it to solve and see how well it does at solving them.

GA’s work, sometimes. Sometimes they don’t work. I don’t work with GA’s, but I know enough about them to know that they are both designed around the problem and that various parts of the GA have to be “fine tuned” to work together.

A simple step-by-step exercise in writing a GA would make that blazingly obvious. You have enough fans of GA’s here to bring that about don’t you?

Richard? No fans of GA’s?
