Results for "intra-sequence relations"
Self-Attention
Intermediate
Attention in which the queries, keys, and values all come from the same sequence, letting every token attend to every other token in that sequence.
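A minimal single-head sketch in NumPy of the idea above; the sequence length, model dimension, and random projection matrices are illustrative assumptions, not a specific library's API.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model); w_q/w_k/w_v: (d_model, d_k) projection matrices."""
    q = x @ w_q                                   # queries from the same sequence
    k = x @ w_k                                   # keys from the same sequence
    v = x @ w_v                                   # values from the same sequence
    scores = q @ k.T / np.sqrt(k.shape[-1])       # scaled dot-product scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the tokens
    return weights @ v                            # each token mixes info from all tokens

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))                       # 5 tokens, model dim 8
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)            # (5, 8): one updated vector per token
```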
Positional Encoding
Intermediate
Injects sequence order into Transformers, since attention alone is permutation-invariant.
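One common choice is the sinusoidal encoding from "Attention Is All You Need"; the sketch below assumes arbitrary example sizes for `seq_len` and `d_model`.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    positions = np.arange(seq_len)[:, None]                      # (seq_len, 1)
    dims = np.arange(d_model)[None, :]                           # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])                        # even dims: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])                        # odd dims: cosine
    return pe

# Added to the token embeddings so attention can tell positions apart.
pe = sinusoidal_positional_encoding(seq_len=10, d_model=16)
```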
Masked Language Model
Intermediate
Predicts masked-out tokens from their surrounding (bidirectional) context; often used to produce embeddings rather than to generate text.
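A hedged sketch of BERT-style input masking; the 15% masking rate, the `[MASK]` token id, and the `-100` ignore label are assumed conventions for illustration only.

```python
import numpy as np

MASK_ID = 103        # assumed id of the [MASK] token
MASK_PROB = 0.15     # assumed fraction of tokens to mask

def mask_tokens(token_ids, rng):
    token_ids = np.asarray(token_ids)
    mask = rng.random(token_ids.shape) < MASK_PROB
    inputs = np.where(mask, MASK_ID, token_ids)     # hide the selected tokens
    labels = np.where(mask, token_ids, -100)        # -100 = position ignored by the loss
    return inputs, labels

rng = np.random.default_rng(0)
inputs, labels = mask_tokens([2001, 2002, 2003, 2004, 2005, 2006], rng)
# The model sees `inputs` with left and right context intact and is trained
# to recover the original ids only at the masked positions.
```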
Protein Folding
Advanced
Predicting a protein's three-dimensional structure from its amino-acid sequence.