## Getting started

### Installing

Nevergrad is a Python 3.6+ library. It can be installed with:

```bash
pip install nevergrad
```
You can also install the `main` branch instead of the latest release with:

```bash
pip install git+https://github.com/facebookresearch/nevergrad@main#egg=nevergrad
```
A conda-forge version is also available thanks to @oblute:

```bash
conda install -c conda-forge nevergrad
```
Alternatively, you can clone the repository and run `pip install -e .` from inside the repository folder.
By default, this only installs the requirements for the optimization and parametrization subpackages. If you are also interested in the benchmarking part, you should install with the `[benchmark]` flag (example: `pip install nevergrad[benchmark]`), and if you also want the test tools, use the `[all]` flag (example: `pip install --use-deprecated=legacy-resolver -e .[all]`).
Notes:

- with `zsh` you will need to run `pip install 'nevergrad[all]'` instead of `pip install nevergrad[all]`
- under Windows, you may need to preinstall torch (for `benchmark` or `all` installations) using the PyTorch installation instructions
### Installing on Windows

For Windows installation, please refer to the Windows documentation.
### Basic optimization example

By default, all optimizers assume a centered and reduced prior at the beginning of the optimization (i.e. zero mean and unit standard deviation).
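If your variables live on a different scale, you can shift or rescale this prior through the parametrization rather than the optimizer. A minimal sketch, assuming the `ng.p.Scalar` and `ng.p.Array` helpers behave as in recent releases:

```python
import nevergrad as ng

# a scalar centered at 5.0 with mutation step 2.0, instead of the default 0/1 prior
scalar = ng.p.Scalar(init=5.0).set_mutation(sigma=2.0)

# a bounded array: the centered prior is remapped into the given interval
bounded = ng.p.Array(shape=(2,)).set_bounds(lower=-10.0, upper=10.0)
```

Such a parametrization can then be passed to any optimizer in place of the integer shortcut used below.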
Optimizing (minimizing!) a function using an optimizer (here `NGOpt`, our adaptive optimization algorithm) can be easily run with:
```python
import nevergrad as ng

def square(x):
    return sum((x - 0.5) ** 2)

# optimization on x as an array of shape (2,)
optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)  # best value
print(recommendation.value)
# >>> [0.49999998 0.50000004]
```
`parametrization=n` is a shortcut to state that the function has only one variable, of dimension `n`. See the parametrization tutorial for more complex parametrizations.

`recommendation` holds the optimal value(s) found for the provided function. It can be directly accessed through `recommendation.value`, which is here a `np.ndarray` of size 2.
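As a first taste of what the tutorial covers, here is a sketch of the same problem with an explicit parametrization object instead of the integer shortcut (the `param` name is ours, not part of the example above):

```python
import nevergrad as ng

def square(x):
    return sum((x - 0.5) ** 2)

# explicit equivalent of parametrization=2: an array of shape (2,)
param = ng.p.Array(shape=(2,))
optimizer = ng.optimizers.NGOpt(parametrization=param, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # a np.ndarray of shape (2,), close to [0.5, 0.5]
```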
You can print the full list of optimizers with:

```python
import nevergrad as ng

print(sorted(ng.optimizers.registry.keys()))
```
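Each registry key maps to an optimizer class, so you can also instantiate one by name. A small sketch (`"OnePlusOne"` is just one well-known entry; any key from the printed list works):

```python
import nevergrad as ng

# look up an optimizer by its registry name and use it like any other
cls = ng.optimizers.registry["OnePlusOne"]
optimizer = cls(parametrization=2, budget=100)
print(optimizer.minimize(lambda x: sum(x ** 2)).value)
```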
The [optimization documentation](docs/optimization.md) contains more information on how to use several workers, take full control of the optimization through the `ask` and `tell` interface, perform multiobjective optimization, as well as pieces of advice on how to choose the proper optimizer for your problem.
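For a flavor of that interface, here is a minimal ask-and-tell loop, sketched as an equivalent of the `minimize` call above:

```python
import nevergrad as ng

def square(x):
    return sum((x - 0.5) ** 2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()      # get a point to evaluate
    loss = square(candidate.value)   # evaluate it yourself, possibly elsewhere
    optimizer.tell(candidate, loss)  # report the result back
recommendation = optimizer.provide_recommendation()
print(recommendation.value)
```

This decoupling of asking and telling is what makes it easy to distribute evaluations across several workers.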
### Structure of the package

The goals of this package are to provide:

- gradient/derivative-free optimization algorithms, including algorithms able to handle noise
- tools to parametrize any code, making it painless to optimize your parameters/hyperparameters, whether they are continuous, discrete, or a mixture of continuous and discrete parameters
- functions on which to test the optimization algorithms
- benchmark routines in order to compare algorithms easily
The structure of the package follows its goals, so you will find the following subpackages:

- `optimization`: implementing optimization algorithms
- `parametrization`: specifying the parameters you want to optimize
- `functions`: implementing both simple and complex benchmark functions
- `benchmark`: for running experiments comparing the algorithms on benchmark functions
- `common`: a set of tools used throughout the package
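As an illustration of how these subpackages surface in code, here is a hedged sketch; the `ArtificialFunction` helper and the `"sphere"` name are assumptions about the benchmark suite:

```python
import nevergrad as ng  # parametrization is exposed as ng.p, optimization as ng.optimizers
from nevergrad.functions import ArtificialFunction  # from the functions subpackage

# a 2-dimensional sphere benchmark function (name assumed from the benchmark suite)
func = ArtificialFunction(name="sphere", block_dimension=2)
optimizer = ng.optimizers.NGOpt(parametrization=func.parametrization, budget=100)
print(optimizer.minimize(func).value)
```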