Nevergrad is a Python 3.6+ library. It can be installed with:
```bash
pip install nevergrad
```
You can also install the master branch instead of the latest release with:
```bash
pip install git+https://github.com/facebookresearch/nevergrad@master#egg=nevergrad
```
Alternatively, you can clone the repository and run `pip install -e .` from inside the repository folder.
By default, this only installs requirements for the optimization and parametrization subpackages. If you are also interested in the benchmarking part, you should install with the `[benchmark]` flag (example: `pip install nevergrad[benchmark]`), and if you also want the test tools, use the `[all]` flag (example: `pip install -e .[all]`).
Note that under zsh you will need to run `pip install 'nevergrad[all]'` instead of `pip install nevergrad[all]`, since zsh gives square brackets a special meaning.
Under Windows, you may need to preinstall torch (for `[all]` installations) by following the PyTorch installation instructions.
## Basic optimization example
By default, all optimizers assume a centered and reduced prior at the beginning of the optimization (i.e., zero mean and unit standard deviation).
Optimizing (minimizing!) a function using an optimizer (here `OnePlusOne`) can easily be run with:
```python
import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

# optimization on x as an array of shape (2,)
optimizer = ng.optimizers.OnePlusOne(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)  # best value
print(recommendation.value)
# >>> [0.49971112 0.5002944]
```
`parametrization=n` is a shortcut to state that the function has only one variable, of dimension n. See the parametrization tutorial for more complex parametrizations; a richer example is also sketched below.
`recommendation` holds the optimal value(s) found for the provided function. It can be directly accessed through `recommendation.value`, which is here an `np.ndarray` of size 2.
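As a taste of richer parametrization, here is a minimal sketch (not taken from the tutorial) mixing continuous, integer, and categorical parameters; the `fake_training` objective and its parameter names are made up purely for illustration:

```python
import nevergrad as ng

# hypothetical objective, purely for illustration
def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    return (learning_rate - 0.1) ** 2 + (batch_size - 32) ** 2 / 1000 + (architecture != "conv")

# mix continuous, integer and categorical parameters in one parametrization
parametrization = ng.p.Instrumentation(
    learning_rate=ng.p.Log(lower=1e-4, upper=1.0),  # positive, log-distributed scalar
    batch_size=ng.p.Scalar(lower=8, upper=128).set_integer_casting(),  # integer in [8, 128]
    architecture=ng.p.Choice(["conv", "fc"]),  # categorical choice
)
optimizer = ng.optimizers.OnePlusOne(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)
print(recommendation.kwargs)  # best parameters found, as a dict
```

Relatedly, the centered prior mentioned above is only a default: for instance, an `ng.p.Array(init=...)` combined with `set_mutation(sigma=...)` lets the search start elsewhere and at a different scale.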
You can print the full list of optimizers with:
```python
import nevergrad as ng

print(sorted(ng.optimizers.registry.keys()))
```
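The registry maps names to optimizer classes, so (assuming a name from the printed list) you can also instantiate one by name, e.g. `ng.optimizers.registry["OnePlusOne"](parametrization=2, budget=100)`.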
The [optimization documentation](docs/optimization.md) contains more information on how to use several workers, take full control of the optimization through the `tell` interface, and perform multiobjective optimization, as well as advice on how to choose the proper optimizer for your problem. A minimal `ask`/`tell` sketch follows.
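As a quick illustration of that `ask`/`tell` control flow, here is a minimal sketch reusing the `square` function from the basic example above:

```python
import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

optimizer = ng.optimizers.OnePlusOne(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()      # get a candidate to evaluate
    loss = square(candidate.value)   # evaluate it any way you like
    optimizer.tell(candidate, loss)  # report the result back
print(optimizer.provide_recommendation().value)
```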
## Structure of the package
The goals of this package are to provide:

- gradient/derivative-free optimization algorithms, including algorithms able to handle noise.
- tools to parametrize any code, making it painless to optimize your parameters/hyperparameters, whether they are continuous, discrete, or a mixture of continuous and discrete parameters.
- functions on which to test the optimization algorithms.
- benchmark routines in order to compare algorithms easily.
The structure of the package follows its goals; you will therefore find the following subpackages:

- `optimization`: implementing optimization algorithms
- `parametrization`: specifying the parameters you want to optimize
- `functions`: implementing both simple and complex benchmark functions
- `benchmark`: for running experiments comparing the algorithms on benchmark functions
- `common`: a set of tools used throughout the package