Default GA and DE don't seem to work in v0.11.1 (#98)
Looks like I got it. Apparently, when I specify one initial point, the entire population will be just copies of this point: Evolutionary.jl/src/api/utilities.jl, lines 58 to 64 in 81d8a72.
And then there's probably not enough variability for mutation or crossover to change anything, so the population doesn't change. If I use …
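The copying behavior described above can be sketched in a few lines (illustrative names only, not the package's actual code): given a single starting point, every individual in the initial population ends up identical, so there is zero diversity for the operators to work with.

```julia
# Minimal sketch of the reported behavior: a single starting point
# is duplicated across the whole population.
copy_population(n::Int, x0::AbstractVector) = [copy(x0) for _ in 1:n]

pop = copy_population(5, [0.0, 0.0])
all(ind -> ind == [0.0, 0.0], pop)   # true: no initial diversity
```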
Thanks for testing the package in this mode. I would never try to use these algorithms without specific parameters, but I understand that for someone new to evolutionary optimization the experience can be frustrating. You are correct, the default parameters are useless. But there is no way to set default parameters for any model, because the operators are population dependent: if the population is represented by binary strings, you need operations specific to binary strings; the same goes for numerical functions, and so on. I think the best way is to terminate optimization (with some useful message) when no-op operators are used. If you use …
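The suggested guard could look roughly like this (a hedged sketch with illustrative names, not the package's actual internals): fail fast with a useful message when both configured operators are no-ops, instead of silently reporting convergence.

```julia
# Stand-in for a no-op default operator (illustrative only).
noop(x, args...) = x

# Refuse to run an optimization whose population can never change.
function check_operators(mutation, crossover)
    if mutation === noop && crossover === noop
        error("Mutation and crossover are both no-ops; the population " *
              "can never change. Please specify real operators.")
    end
    return true
end
```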
Maybe there shouldn't be any default parameters then? Having "useless" default parameters is:
The no-argument constructor could throw an error, for example: `DE() = error("There is no way to set default parameters for any model, because operators are population dependent, so please set parameters manually.")` It's especially strange with … On the other hand, the default …
Does it mean that …? Also, the documentation recommends this: …
Exactly my point: performing evolutionary optimization correctly is not only about the initial guess, but also about which mutation operations are used and how they are applied to the population, i.e. the rates.
In the current implementation, it is problematic. I think more randomness needs to be added in such cases.
By default, CMAES initializes its parameters with a very convoluted procedure that was refined over many years of research. CMAES becomes very fragile with incorrect parameters, and it requires a good understanding of the algorithm and the problem to tune it for best performance.
I came across the same issue. I suggest you change the (otherwise really helpful) tutorial so as not to include examples that use default parameters in GA and DE. These examples were the first I tried when familiarizing myself with the package, and it was a bit frustrating to get weird results even when copying the code. It took some time until I realised that the problem might be with the parameters and not with the syntax I used.
Same here. In my case I was using …. Got it to work by initializing with an explicit initial population:

```julia
using Distributions, Evolutionary, Optimization

NP = 100
de = Evolutionary.DE(populationSize = NP)
lb, ub = 0, 1
U = Uniform(lb, ub)
x0 = Evolutionary.initial_population(de, [rand(U, 1) for i in 1:NP])
f = OptimizationFunction(foo)   # `foo` is the objective, defined elsewhere
prob = Optimization.OptimizationProblem(f, x0, p, lb = lb, ub = ub)   # `p`: problem parameters
sol = solve(prob, de)
```
TL;DR

`GA()` and `DE()` don't move away from the initial point; they report any initial point as the optimum and claim convergence, even though the algorithm isn't anywhere near the optimum.

I'm new to this, so this entire issue might be stupid, but I can't get the algorithms to work even with default settings; I can't even get started with the most basic things. Maybe the defaults could be adjusted somehow?
Basic example

Try to minimize `f(x) = x^2`:

No, `90^2 = 8100` is most definitely not the minimum of `x^2`. I can fiddle with the optimizer's settings, but it still doesn't move from the initial point:
All of these runs also report convergence, but the output is too long.
I eventually got the genetic algorithm to work after specifying `mutation=gaussian(), crossover=uniformbin()` (the defaults are apparently no-ops, but having no mutation and no crossover seems to defeat the purpose of a genetic algorithm?).

However, I couldn't get differential evolution to work, even for this simple function. Below are examples taken from the docs that don't seem to work either.
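The configuration that finally worked can be sketched like this (a hedged sketch, assuming the Evolutionary.jl v0.11 API; the population size is an illustrative choice, and `gaussian()`/`uniformbin()` are the operators named above):

```julia
using Evolutionary

# Objectives in Evolutionary.jl take a vector argument.
f(x) = x[1]^2

# Non-no-op operators are what make the GA actually search.
ga = GA(populationSize = 100,
        mutation  = gaussian(),     # Gaussian perturbation of genes
        crossover = uniformbin())   # uniform binary crossover
result = Evolutionary.optimize(f, [90.0], ga)
```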
GA example

This is taken from https://wildart.github.io/Evolutionary.jl/dev/tutorial/#Obtaining-results.

So apparently, the minimum is `f([0,0,0]) = 0`. However, I can get a lower value: `f([1,1,1]) = -3`. In fact, `GA` seems to accept any initial value as the solution:

DE algorithm

I took the target function from this page: https://wildart.github.io/Evolutionary.jl/dev/constraints/. Then I use the genetic algorithm as shown on that page:

If I run this multiple times (`GA` seems to be randomized, so I wanted to draw more samples), I get about `[1, 3]` on average. So far so good.

Now try the default `DE()` with the same function and the same starting point `[0., 0.]`:

Change the starting point:

`DE` doesn't seem to care and says that the starting point is the optimum, whatever the starting point is.

GA again

Now try the same function as above with the default `GA`:

Same as the default `DE()`: it thinks that the initial point is the optimum, whatever the initial point is.