__builtin__.object
    BestSolution
    Fcts
    OOOptimizer
        CMAES
    OptimDataLogger
        CMAESDataLogger
class BestSolution(__builtin__.object) |
|
container to keep track of the best solution seen |
|
Methods defined here:
- __init__(self, x=None, f=None, evals=None)
- take `x`, `f`, and `evals` to initialize the best solution.
The better solutions have smaller `f`-values.
- get(self)
- return ``(x, f, evals)``
- update(self, arx, arf, evals=None)
- update the best solution, if a better one is found among the
solutions `arx` with function values `arf` after `evals` evaluations.
Better solutions have smaller `f`-values.
Data descriptors defined here:
- __dict__
- dictionary for instance variables (if defined)
- __weakref__
- list of weak references to the object (if defined)
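The interface above can be illustrated with a minimal, self-contained re-implementation sketch; the method signatures follow the listing, but the bookkeeping details (in particular how `evals` is attributed to the best candidate) are assumptions, not the module's actual code.

```python
class BestSolution(object):
    """container to keep track of the best (smallest-f) solution seen"""

    def __init__(self, x=None, f=None, evals=None):
        # smaller f-values are better
        self.x, self.f, self.evals = x, f, evals

    def update(self, arx, arf, evals=None):
        """check solutions `arx` with f-values `arf` for a new best"""
        i = arf.index(min(arf))  # index of the best candidate in this batch
        if self.f is None or arf[i] < self.f:
            self.x, self.f = arx[i], arf[i]
            # assumed convention: evals counts evaluations up to and
            # including the last candidate of arx
            self.evals = None if evals is None else evals - len(arx) + i + 1
        return self

    def get(self):
        """return ``(x, f, evals)``"""
        return self.x, self.f, self.evals
```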
|
class CMAES(OOOptimizer) |
|
class for non-linear non-convex minimization. The class implements
the interface defined in `OOOptimizer`, namely the methods
`__init__()`, `ask()`, `tell()`, `stop()`, `result()`, and `disp()`.
Examples
--------
All examples minimize the function `elli`, the output is not shown.
(A preferred environment to execute all examples is
``ipython -pylab``.)
First we need to ::
import barecmaes2 as cma
The shortest example uses the inherited method
`OOOptimizer.optimize()`::
res = cma.CMAES(8 * [0.1], 0.5).optimize(cma.Fcts.elli)
See method `__init__()` for the input parameters to `CMAES`. We might
have a look at the result::
print(res[0]) # best solution and
print(res[1]) # its function value
`res` is the return value from method `CMAES.result()` appended with
`None` (no logger). In order to display more exciting output we rather
do ::
logger = cma.CMAESDataLogger()
res = cma.CMAES(9 * [0.5], 0.3).optimize(cma.Fcts.elli, logger)
logger.plot() # if matplotlib is available
Virtually the same example can be written with an explicit loop
instead of using `optimize()`. This gives the necessary insight
into the `CMAES` class interface and gives entire control over the
iteration loop ::
optim = cma.CMAES(9 * [0.5], 0.3) # calls CMAES.__init__()
logger = cma.CMAESDataLogger().register(optim) # logger instance
# this loop resembles optimize()
while not optim.stop(): # iterate
X = optim.ask() # get candidate solutions
# do whatever needs to be done; however, rather do not
# change X, unless e.g. X[2] = optim.ask()[0]
f = [cma.Fcts.elli(x) for x in X] # evaluate solutions
optim.tell(X, f) # do all the real work
optim.disp(20) # display info every 20th iteration
logger.add() # log another "data line"
# final output
print('termination by', optim.stop())
print('best f-value =', optim.result()[1])
print('best solution =', optim.result()[0])
print('potentially better solution xmean =', optim.result()[5])
print("let's check f(xmean) =", cma.Fcts.elli(optim.result()[5]))
logger.plot() # if matplotlib is available
raw_input('press enter to continue') # prevents closing figures
A slightly longer example, which also plots within the loop, is
the implementation of function `fmin(...)`.
Details
-------
Most of the work is done in the method `tell(...)`. The method
`result()` returns more useful output.
:See: `fmin()`, `OOOptimizer.optimize()` |
|
- Method resolution order:
- CMAES
- OOOptimizer
- __builtin__.object
Methods defined here:
- __init__(self, xstart, sigma, max_eval='1e3*N**2', ftarget=None, popsize='4 + int(3 * log(N))', randn=<bound method Random.normalvariate of <random.Random object>>)
- Initialize a `CMAES` object instance; the first two arguments are
mandatory.
Parameters
----------
`xstart`
``list`` of numbers (like ``[3, 2, 1.2]``), initial
solution vector
`sigma`
``float``, initial step-size (standard deviation in each
coordinate)
`max_eval`
``int`` or ``str``, maximal number of function
evaluations, a string is evaluated with ``N`` being the
search space dimension
`ftarget`
`float`, target function value
`randn`
normal random number generator, by default
``random.normalvariate``
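As described, a string value for `max_eval` (or `popsize`) is evaluated with `N` bound to the search space dimension. A minimal sketch of how such a string default could be resolved; the helper name here is illustrative, not taken from the module:

```python
from math import log  # needed by expressions like '4 + int(3 * log(N))'

def resolve_int_default(value, N):
    """evaluate a string default such as '1e3*N**2' with N bound to
    the search space dimension; plain numbers pass through unchanged"""
    if isinstance(value, str):
        value = eval(value)  # N and log are in scope here
    return int(value)

xstart = 8 * [0.1]
N = len(xstart)                                    # dimension 8
max_eval = resolve_int_default('1e3*N**2', N)      # 1000 * 64 = 64000
popsize = resolve_int_default('4 + int(3 * log(N))', N)
```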
- ask(self)
- return a list of lambda candidate solutions according to
m + sig * Normal(0,C) = m + sig * B * D * Normal(0,I)
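The sampling step can be sketched for the special case of an identity covariance matrix, where `B` and `D` drop out; this is a simplified illustration of the formula above, not the module's actual code:

```python
import random

def ask_sketch(m, sigma, lam, randn=random.normalvariate):
    """sample lam candidate solutions x = m + sigma * Normal(0, I);
    with a general covariance C = B * D**2 * B^T, the standard normal
    vector would additionally be transformed by B * D"""
    N = len(m)
    return [[m[i] + sigma * randn(0, 1) for i in range(N)]
            for _ in range(lam)]

X = ask_sketch([0.0] * 5, 0.5, lam=10)  # 10 candidates in dimension 5
```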
- disp(self, verb_modulo=1)
- display some iteration info
- result(self)
- return (xbest, f(xbest), evaluations_xbest, evaluations,
iterations, xmean)
- stop(self)
- return satisfied termination conditions in a dictionary like
{'termination reason':value, ...}, for example {'tolfun':1e-12},
or the empty dict {}
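The termination-dictionary convention can be illustrated with a self-contained sketch; the condition names follow the `tolfun` example above, but the set of conditions and thresholds here are illustrative assumptions:

```python
def stop_sketch(fitvals, evals, max_eval, ftarget=None, tolfun=1e-12):
    """return satisfied termination conditions as a dictionary,
    e.g. {'tolfun': 1e-12}, or the empty dict if none is satisfied"""
    res = {}
    # fitness values of the current population are (nearly) identical
    if len(fitvals) > 1 and max(fitvals) - min(fitvals) < tolfun:
        res['tolfun'] = tolfun
    # target function value reached
    if ftarget is not None and fitvals and min(fitvals) <= ftarget:
        res['ftarget'] = ftarget
    # evaluation budget exhausted
    if evals >= max_eval:
        res['maxfevals'] = max_eval
    return res
```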
- tell(self, arx, fitvals)
- update the evolution paths and the distribution parameters m,
sigma, and C within CMA-ES.
Parameters
----------
`arx`
a list of solutions, presumably from `ask()`
`fitvals`
the corresponding objective function values
Methods inherited from OOOptimizer:
- optimize(self, objectivefct, iterations=None, min_iterations=1, args=(), verb_disp=20, logger=None)
- iterate at least ``min_iterations`` and at most ``iterations``
over ``OOOptimizer self`` using objective function
``objectivefct``. Prints current information every ``verb_disp``,
uses ``OptimDataLogger logger``, and returns ``self``.
Example
=======
::
import barecmaes2 as cma
es = cma.CMAES(7 * [0.1], 0.5).optimize(cma.Fcts.rosenbrock)
print(es.result()[0])
Data descriptors inherited from OOOptimizer:
- __dict__
- dictionary for instance variables (if defined)
- __weakref__
- list of weak references to the object (if defined)
|
class CMAESDataLogger(OptimDataLogger) |
|
data logger for class CMAES, that can store and plot data.
An even more useful logger would write the data to files, to
prevent memory overload. |
|
- Method resolution order:
- CMAESDataLogger
- OptimDataLogger
- __builtin__.object
Methods defined here:
- __init__(self, verb_modulo=1)
- `verb_modulo` controls how frequently logging takes place when
the method `add()` is called; the `OOOptimizer` instance to be
logged is attached via `register()` or passed to `add()` directly
- add(self, cma=None, force=False)
- append some logging data from CMAES class instance cma,
if ``number_of_times_called modulo verb_modulo`` equals zero
- plot(self, fig_number=322)
- plot the stored data in figure fig_number.
Dependencies: matplotlib/pylab.
Example
=======
::
>> import barecmaes2 as cma
>> es = cma.CMAES(3 * [0.1], 1)
>> logger = cma.CMAESDataLogger().register(es)
>> while not es.stop():
>>     X = es.ask()
>>     es.tell(X, [cma.Fcts.elli(x) for x in X])
>>     logger.add()
>> logger.plot()
Data and other attributes defined here:
- plotted = 0
Methods inherited from OptimDataLogger:
- data(self)
- return logged data in a dictionary
- disp(self)
- display some data trace (not implemented)
- register(self, optim)
Data descriptors inherited from OptimDataLogger:
- __dict__
- dictionary for instance variables (if defined)
- __weakref__
- list of weak references to the object (if defined)
|
class Fcts(__builtin__.object) |
|
versatile collection of test functions |
|
Static methods defined here:
- elli(x)
- ellipsoid test objective function
- rosenbrock(x)
- Rosenbrock test objective function
- sphere(x)
- sphere, ``sum(x**2)``, test objective function
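These test functions can be sketched with their standard textbook definitions; the ellipsoid's conditioning constant of `1e6` is an assumption based on common CMA-ES benchmarks, not confirmed by this listing:

```python
class Fcts(object):
    """re-implementation sketch of the test function collection"""

    @staticmethod
    def sphere(x):
        """sum of squares, minimum 0 at the origin"""
        return sum(xi**2 for xi in x)

    @staticmethod
    def elli(x):
        """ellipsoid with assumed condition number 1e6,
        minimum 0 at the origin (requires len(x) >= 2)"""
        N = len(x)
        return sum(1e6**(i / (N - 1.0)) * x[i]**2 for i in range(N))

    @staticmethod
    def rosenbrock(x):
        """Rosenbrock function, minimum 0 at x = (1, ..., 1)"""
        return sum(100 * (x[i + 1] - x[i]**2)**2 + (1 - x[i])**2
                   for i in range(len(x) - 1))
```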
Data descriptors defined here:
- __dict__
- dictionary for instance variables (if defined)
- __weakref__
- list of weak references to the object (if defined)
|
class OOOptimizer(__builtin__.object) |
|
"abstract" base class for an OO optimizer interface. |
|
Methods defined here:
- __init__(self, xstart, **more_kwargs)
- abstract method, ``xstart`` is a mandatory argument
- ask(self)
- abstract method, AKA get, deliver new candidate solution(s),
a list of "vectors"
- disp(self, verbosity_modulo=1)
- display of some iteration info
- optimize(self, objectivefct, iterations=None, min_iterations=1, args=(), verb_disp=20, logger=None)
- iterate at least ``min_iterations`` and at most ``iterations``
over ``OOOptimizer self`` using objective function
``objectivefct``. Prints current information every ``verb_disp``,
uses ``OptimDataLogger logger``, and returns ``self``.
Example
=======
::
import barecmaes2 as cma
es = cma.CMAES(7 * [0.1], 0.5).optimize(cma.Fcts.rosenbrock)
print(es.result()[0])
- result(self)
- abstract method, return ``(x, f(x), ...)``, that is the
minimizer, its function value, ...
- stop(self)
- abstract method, return satisfied termination conditions in
a dictionary like ``{'termination reason':value, ...}``,
for example ``{'tolfun':1e-12}``, or the empty dictionary ``{}``.
The implementation of `stop()` should prevent an infinite loop.
- tell(self, solutions, function_values)
- abstract method, AKA update, prepare for next iteration
Data descriptors defined here:
- __dict__
- dictionary for instance variables (if defined)
- __weakref__
- list of weak references to the object (if defined)
|
class OptimDataLogger(__builtin__.object) |
|
"abstract" base class for a data logger that can be used with an
`OOOptimizer` |
|
Methods defined here:
- __init__(self)
- add(self, optim=None, force=False)
- abstract method, add a "data point" from the state of optim
into the logger
- data(self)
- return logged data in a dictionary
- disp(self)
- display some data trace (not implemented)
- plot(self)
- plot data
- register(self, optim)
Data descriptors defined here:
- __dict__
- dictionary for instance variables (if defined)
- __weakref__
- list of weak references to the object (if defined)
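A minimal concrete logger following this interface might look as follows; this is a sketch, and the internal attribute names and the modulo-counting convention are assumptions:

```python
class ListLogger(object):
    """minimal in-memory data logger following the OptimDataLogger
    interface: register an optimizer, then call add() each iteration"""

    def __init__(self, verb_modulo=1):
        self.verb_modulo = verb_modulo
        self.optim = None
        self.counter = 0
        self._data = {'iteration': [], 'fbest': []}

    def register(self, optim):
        self.optim = optim
        return self  # allow chaining, as in the examples above

    def add(self, optim=None, force=False):
        """log a data point every verb_modulo-th call, or when forced"""
        optim = optim or self.optim
        self.counter += 1
        if force or self.counter % self.verb_modulo == 0:
            self._data['iteration'].append(self.counter)
            self._data['fbest'].append(optim.result()[1])

    def data(self):
        """return logged data in a dictionary"""
        return self._data
```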