fairseq2.optim
The optim module provides optimizers and optimization utilities for training neural networks.
Coming soon: This documentation is still under development. The optim module includes:
- AdamW and other optimizers
- Learning rate schedulers
- Gradient clipping utilities
- Optimization configuration
Please refer to the source code and examples in the meantime; a brief, generic sketch of the concepts listed above follows below.
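Until the API reference is filled in, the minimal sketch below illustrates the three core pieces (an AdamW optimizer, a learning-rate schedule, and gradient clipping) using plain PyTorch equivalents rather than fairseq2's own classes, whose names and signatures may differ. The model, hyperparameters, and the warmup/inverse-square-root schedule are illustrative assumptions, not the fairseq2 defaults.

```python
import torch
from torch import nn
from torch.optim import AdamW
from torch.optim.lr_scheduler import LambdaLR

# Toy model purely for illustration; any nn.Module works the same way.
model = nn.Linear(16, 4)

# AdamW over the model parameters (hyperparameters are illustrative).
optimizer = AdamW(model.parameters(), lr=1e-3, betas=(0.9, 0.98), weight_decay=0.01)

# Linear warmup followed by inverse-square-root decay, a common choice for
# Transformer training; fairseq2's schedulers may implement different shapes.
warmup_steps = 100

def lr_lambda(step: int) -> float:
    step = max(step, 1)
    if step < warmup_steps:
        return step / warmup_steps
    return (warmup_steps / step) ** 0.5

scheduler = LambdaLR(optimizer, lr_lambda)

max_grad_norm = 1.0  # assumed clipping threshold

for step in range(5):
    inputs = torch.randn(8, 16)
    targets = torch.randn(8, 4)

    loss = nn.functional.mse_loss(model(inputs), targets)

    optimizer.zero_grad()
    loss.backward()

    # Clip the global gradient norm before taking the optimizer step.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)

    optimizer.step()
    scheduler.step()
```

Whatever concrete classes the module exposes, the same three concerns recur in any training loop: constructing the optimizer over the model parameters, stepping the learning-rate schedule alongside it, and clipping gradients immediately before each optimizer step.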