Welcome to xFormers' documentation!
xFormers is a PyTorch-based library that hosts flexible Transformer parts. They are interoperable and optimized building blocks, which can optionally be combined to create state-of-the-art models.
Components Documentation
Build models and blocks programmatically
Tutorials and examples
- Tutorials
- Replacing all attentions in an existing ViT model with a sparse equivalent
- Using BlockSparseAttention
- How to Enable Fused Operations Using AOTAutograd and NVFuser
- Extend the xFormers parts zoo
- Testing out the attention mechanisms that are hosted here
- Building an encoder, comparing to PyTorch
- Building full models
- Using the Reversible block
- Using Triton-based layers
- Hierarchical Transformers
Some custom parts