Position Embeddings
- class xformers.components.positional_embedding.RotaryEmbedding(dim_model: int, *_, **__)[source]
Bases: Module
The rotary position embeddings from RoFormer (Su et al.). A crucial insight of the method is that the queries and keys are transformed by rotation matrices which depend on the relative positions.
Other implementations are available in the Rotary Transformer repo and in GPT-NeoX; the GPT-NeoX implementation was an inspiration for this one.
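A minimal usage sketch is shown below. The tensor layout ([batch, heads, seq, head_dim]) and the assumption that the forward pass takes the query and key tensors and returns their rotated counterparts may differ between xFormers versions.
```python
import torch

from xformers.components.positional_embedding import RotaryEmbedding

batch, heads, seq, head_dim = 2, 4, 16, 32
q = torch.randn(batch, heads, seq, head_dim)  # queries
k = torch.randn(batch, heads, seq, head_dim)  # keys

# Rotate queries and keys as a function of their positions, so that the
# resulting attention scores depend on relative position.
rotary = RotaryEmbedding(dim_model=head_dim)
q_rot, k_rot = rotary(q, k)

assert q_rot.shape == q.shape and k_rot.shape == k.shape
```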
- class xformers.components.positional_embedding.SinePositionalEmbedding(dim_model: int, *args, **kwargs)[source]
Bases: PositionEmbedding
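A minimal usage sketch, assuming the forward pass adds fixed sinusoidal encodings to an input of shape [batch, seq, dim_model] and returns a tensor of the same shape:
```python
import torch

from xformers.components.positional_embedding import SinePositionalEmbedding

x = torch.randn(2, 16, 64)  # token embeddings, [batch, seq, dim_model]

# Add the fixed sin/cos position encodings to the input embeddings.
pos_encoding = SinePositionalEmbedding(dim_model=64)
y = pos_encoding(x)

assert y.shape == x.shape
```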
- class xformers.components.positional_embedding.VocabEmbedding(dim_model: int, seq_len: int, vocab_size: int, dropout: float = 0.0, *args, **kwargs)[source]
Bases: PositionEmbedding
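A minimal usage sketch, assuming the forward pass maps integer token indices of shape [batch, seq] to embeddings of shape [batch, seq, dim_model], with positional information and dropout applied internally:
```python
import torch

from xformers.components.positional_embedding import VocabEmbedding

tokens = torch.randint(0, 1000, (2, 16))  # token ids, [batch, seq]

embed = VocabEmbedding(dim_model=64, seq_len=16, vocab_size=1000, dropout=0.1)
out = embed(tokens)  # [batch, seq, dim_model]

assert out.shape == (2, 16, 64)
```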
- xformers.components.positional_embedding.build_positional_embedding(config: Union[Dict[str, Any], PositionEmbeddingConfig])[source]
Builds a position encoding from a config.
This assumes a "name" key in the config, which is used to determine which position encoding class to instantiate. For instance, a config {"name": "my_position_encoding", "foo": "bar"} will find a class that was registered as "my_position_encoding" (see register_positional_embedding()) and call .from_config on it.
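A minimal sketch of building an encoding from a plain config dict; the registry key "sine" is an assumption here and should be checked against the names actually registered in your xFormers version:
```python
from xformers.components.positional_embedding import build_positional_embedding

config = {
    "name": "sine",   # which registered position encoding to instantiate
    "dim_model": 64,  # remaining keys are forwarded to the class via .from_config
}
pos_encoding = build_positional_embedding(config)
```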
- xformers.components.positional_embedding.register_positional_embedding(name: str, config: Any = <class 'xformers.components.positional_embedding.base.PositionEmbeddingConfig'>)
Registers a subclass.
This decorator allows xFormers to instantiate a given subclass from a configuration file, even if the class itself is not part of the xFormers library.
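A sketch of registering a custom encoding, assuming the decorator takes the registry name and a config dataclass and is applied to a PositionEmbedding subclass, mirroring how the built-in encodings are registered. The class and config names below are hypothetical.
```python
from dataclasses import dataclass

import torch

from xformers.components.positional_embedding import (
    build_positional_embedding,
    register_positional_embedding,
)
from xformers.components.positional_embedding.base import (
    PositionEmbedding,
    PositionEmbeddingConfig,
)


@dataclass
class MyPositionEncodingConfig(PositionEmbeddingConfig):
    pass  # add extra constructor fields here if needed


@register_positional_embedding("my_position_encoding", MyPositionEncodingConfig)
class MyPositionEncoding(PositionEmbedding):
    def __init__(self, dim_model: int, *args, **kwargs):
        super().__init__()
        self.dim_model = dim_model

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x  # placeholder: a real encoding would add position information


# The new class can now be built by name from a config:
pos_encoding = build_positional_embedding(
    {"name": "my_position_encoding", "dim_model": 64}
)
```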