Python package
attention
Legacy attention mechanisms for graph-based neural networks.
Modules
- `attention_with_rope`: Attention with rotary position embeddings.
- `interfaces`: Attention interface definitions.
- `mask_config`: Attention mask configuration utilities.
- `multi_latent_attention`: Multi-latent attention mechanism.
- `multihead_attention`: Multi-head attention implementation.
- `ragged_attention`: Attention for variable-length sequences.
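To make the mechanism behind `multihead_attention` concrete, here is a minimal NumPy sketch of multi-head scaled dot-product attention. This illustrates the general technique only; the function name, shapes, and signature here are illustrative assumptions, not this package's API.

```python
# Minimal multi-head attention sketch (NumPy only).
# Illustrates the mechanism, not this package's actual interface.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multihead_attention(q, k, v, num_heads):
    """Scaled dot-product attention split across `num_heads` heads.

    q, k, v: arrays of shape (seq_len, d_model); num_heads must divide d_model.
    Returns an array of shape (seq_len, d_model).
    """
    seq_len, d_model = q.shape
    d_head = d_model // num_heads
    # Split the model dimension into heads: (num_heads, seq_len, d_head).
    split = lambda x: x.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    qh, kh, vh = split(q), split(k), split(v)
    # Per-head attention weights: (num_heads, seq_len, seq_len).
    scores = qh @ kh.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ vh
    # Merge heads back into the model dimension.
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = multihead_attention(x, x, x, num_heads=2)
print(out.shape)  # (4, 8)
```

Modules like `attention_with_rope` and `ragged_attention` layer position encodings or variable-length handling on top of this same core computation.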