Nevergrad - A gradient-free optimization platform

This documentation is a work in progress, feel free to help us update/improve/restructure it!
Quick start
nevergrad is a Python 3.6+ library. It can be installed with:
pip install nevergrad
You can find other installation options (including for Windows users) in the Getting started section.
Feel free to join the Nevergrad users Facebook group.
Minimizing a function using an optimizer (here NGOpt, our adaptive optimization algorithm) can be done with:
import nevergrad as ng

def square(x):
    return sum((x - 0.5) ** 2)

# optimization on x as an array of shape (2,)
optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)  # best value
print(recommendation.value)
# >>> [0.49999998 0.50000004]

Figure: convergence of a population of points to the minimum with two-points DE.
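The same optimization can also be driven explicitly through the ask and tell interface (see "Ask and tell interface" in the Contents below). A minimal sketch reusing the square function from the quick start:

import nevergrad as ng

def square(x):
    return sum((x - 0.5) ** 2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()  # get a candidate point to evaluate
    loss = square(*candidate.args, **candidate.kwargs)
    optimizer.tell(candidate, loss)  # report the observed loss
recommendation = optimizer.provide_recommendation()
print(recommendation.value)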
nevergrad also supports bounded continuous variables, discrete variables, and mixtures of those.
To do this, one can specify the input space:
import nevergrad as ng

def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
    return (learning_rate - 0.2) ** 2 + (batch_size - 4) ** 2 + (0 if architecture == "conv" else 10)

# the Instrumentation class is used for functions with multiple inputs
# (positional and/or keyword arguments)
parametrization = ng.p.Instrumentation(
    # a log-distributed scalar between 0.001 and 1.0
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # an integer from 1 to 12
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # either "conv" or "fc"
    architecture=ng.p.Choice(["conv", "fc"]),
)
optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)
print(recommendation.kwargs)  # shows the recommended keyword arguments of the function
# >>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}
Learn more about parametrization in the Parametrization section!
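When each evaluation is expensive, several candidates can be evaluated concurrently by setting num_workers on the optimizer and passing an executor to minimize (see "Using several workers" below). A minimal sketch for the fake_training example above, using a thread pool (a ProcessPoolExecutor may be preferable for CPU-bound functions):

from concurrent import futures
import nevergrad as ng

def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    return (learning_rate - 0.2) ** 2 + (batch_size - 4) ** 2 + (0 if architecture == "conv" else 10)

parametrization = ng.p.Instrumentation(
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    architecture=ng.p.Choice(["conv", "fc"]),
)
# up to num_workers candidates are asked before any result is told back
optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100, num_workers=4)
with futures.ThreadPoolExecutor(max_workers=optimizer.num_workers) as executor:
    recommendation = optimizer.minimize(fake_training, executor=executor, batch_mode=False)
print(recommendation.kwargs)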
Contents
- Getting started
- How to perform optimization
- Basic example
- Using several workers
- Ask and tell interface
- Choosing an optimizer
- Telling non-asked points, or suggesting points
- Adding callbacks
- Optimization with constraints
- Optimizing machine learning hyperparameters
- Example with permutation
- Example of chaining, or inoculation, or initialization of an evolutionary algorithm
- Multiobjective minimization with Nevergrad
- Reproducibility
- Parametrizing your optimization
- Examples - Nevergrad for machine learning
- Optimizers API Reference
- Optimizer API
- Callbacks
- Configurable optimizers
- Optimizers
BayesOptim
CM
CMandAS2
CMandAS3
Chaining
ChoiceBase
ConfPSO
ConfPortfolio
ConfSplitOptimizer
ConfiguredPSO
EDA
EMNA
MEDA
MPCEDA
MetaCMA
MultiDiscrete
MultipleSingleRuns
NGO
NGOpt
NGOpt10
NGOpt12
NGOpt13
NGOpt14
NGOpt15
NGOpt16
NGOpt21
NGOpt36
NGOpt38
NGOpt39
NGOpt4
NGOpt8
NGOptBase
NGOptRW
NoisyBandit
NoisySplit
PCEDA
ParametrizedBO
ParametrizedCMA
ParametrizedMetaModel
ParametrizedOnePlusOne
ParametrizedTBPSA
Portfolio
Rescaled
SPSA
SQPCMA
Shiwa
SplitOptimizer
cGA
smooth_copy()
- Parametrization API reference
- Running algorithm benchmarks
- Examples - Nevergrad for R
- Examples of benchmarks
- Noisy optimization
- One-shot optimization
- Comparison-based methods for ill-conditioned problems
- Ill-conditioned function
- Discrete
- List of benchmarks
basic()
compabasedillcond()
dim10_select_one_feature()
dim10_select_two_features()
dim10_smallbudget()
doe_dim4()
illcond()
metanoise()
noise()
oneshot1()
oneshot2()
oneshot3()
oneshot4()
repeated_basic()
adversarial_attack()
alldes()
aquacrop_fao()
bonnans()
causal_similarity()
complex_tsp()
constrained_illconditioned_parallel()
control_problem()
deceptive()
doe()
double_o_seven()
far_optimum_es()
fishing()
fiveshots()
harderparallel()
hdbo4d()
hdmultimodal()
illcondi()
illcondipara()
image_multi_similarity()
image_multi_similarity_cv()
image_multi_similarity_pgan()
image_multi_similarity_pgan_cv()
image_quality()
image_quality_cv()
image_quality_cv_pgan()
image_quality_pgan()
image_quality_proxy()
image_quality_proxy_pgan()
image_similarity()
image_similarity_and_quality()
image_similarity_and_quality_cv()
image_similarity_and_quality_cv_pgan()
image_similarity_and_quality_pgan()
image_similarity_pgan()
image_single_quality()
image_single_quality_pgan()
instrum_discrete()
keras_tuning()
mixsimulator()
mlda()
mldakmeans()
mltuning()
mono_rocket()
morphing_pgan_quality()
multimodal()
multiobjective_example()
multiobjective_example_hd()
multiobjective_example_many()
multiobjective_example_many_hd()
naive_seq_keras_tuning()
naive_seq_mltuning()
naivemltuning()
nano_naive_seq_mltuning()
nano_seq_mltuning()
neuro_control_problem()
newdoe()
noisy()
olympus_emulators()
olympus_surfaces()
oneshot()
oneshot_mltuning()
paraalldes()
parahdbo4d()
parallel()
parallel_small_budget()
paramultimodal()
pbbob()
pbo_reduced_suite()
pbo_suite()
pbt()
photonics()
photonics2()
powersystems()
ranknoisy()
realworld()
reduced_yahdlbbbob()
rocket()
seq_keras_tuning()
seq_mltuning()
sequential_fastgames()
sequential_instrum_discrete()
simple_tsp()
skip_ci()
spsa_benchmark()
team_cycling()
unit_commitment()
yabbob()
yabigbbob()
yaboundedbbob()
yaboxbbob()
yaconstrainedbbob()
yahdbbob()
yahdlbbbob()
yahdnoisybbob()
yahdnoisysplitbbob()
yahdsplitbbob()
yamegapenbbob()
yamegapenbigbbob()
yamegapenboundedbbob()
yamegapenboxbbob()
yamegapenhdbbob()
yanoisybbob()
yanoisysplitbbob()
yaonepenbbob()
yaonepenbigbbob()
yaonepenboundedbbob()
yaonepenboxbbob()
yaonepennoisybbob()
yaonepenparabbob()
yaonepensmallbbob()
yaparabbob()
yapenbbob()
yapenboundedbbob()
yapenboxbbob()
yapennoisybbob()
yapenparabbob()
yapensmallbbob()
yasmallbbob()
yasplitbbob()
yatinybbob()
yatuningbbob()
yawidebbob()
- Examples - Working with Pyomo model
- Installation and configuration on Windows
- Contributing to Nevergrad
- Open Optimization Competition 2020
Citing
@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}
License
nevergrad is released under the MIT license. See LICENSE for additional details about it, as well as our Terms of Use and Privacy Policy.
Copyright © Meta Platforms, Inc.