Results for "masked"
Masked Language Model
Intermediate
Predicts masked tokens in a sequence, enabling bidirectional context; often used for embeddings rather than generation.
A masked language model is like a fill-in-the-blank game: some words in a sentence are hidden, and the model has to guess what they are. For example, given the sentence 'The cat sat on the ___', the model looks at the words around the blank to figure out that 'mat' is a good guess.
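The fill-in-the-blank idea can be sketched with a toy example. The function below guesses a masked word by counting how often candidate words co-occur with the unmasked context words in a tiny corpus; the corpus, the `[MASK]` token convention, and the scoring rule are illustrative assumptions, not how real masked language models like BERT work (those learn the statistics with a neural network).

```python
from collections import Counter

# Tiny illustrative corpus (an assumption for this sketch).
CORPUS = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat slept on the mat",
]

def predict_masked(tokens, mask_index):
    """Guess the token at mask_index from its surrounding context.

    Scores each candidate word by how many context words it shares
    a corpus sentence with -- a crude stand-in for the bidirectional
    context a real masked language model learns.
    """
    context = set(tokens[:mask_index] + tokens[mask_index + 1:])
    scores = Counter()
    for line in CORPUS:
        words = line.split()
        overlap = len(context & set(words))
        for w in words:
            if w not in context:
                scores[w] += overlap
    return scores.most_common(1)[0][0]

tokens = "the cat sat on the [MASK]".split()
print(predict_masked(tokens, 5))  # 'mat' wins on corpus statistics
```

Note how the context on both sides of the blank contributes to the score; that bidirectionality is what distinguishes masked modeling from left-to-right next-token prediction.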
Self-Supervised Learning
Learning from data by constructing "pseudo-labels" (e.g., next-token prediction, masked modeling) without manual annotation.
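Both pseudo-labeling recipes mentioned here can be shown concretely: raw text alone supplies the (input, label) pairs, with no human annotation. The helper names below are hypothetical, chosen for this sketch.

```python
def next_token_pairs(tokens):
    """Each prefix is an input; the token that follows it is the label."""
    return [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

def masked_pairs(tokens, mask="[MASK]"):
    """Hide one token at a time; the hidden token is the label."""
    return [
        (tokens[:i] + [mask] + tokens[i + 1:], tokens[i])
        for i in range(len(tokens))
    ]

toks = "the cat sat".split()
print(next_token_pairs(toks))
# [(['the'], 'cat'), (['the', 'cat'], 'sat')]
print(masked_pairs(toks)[1])
# (['the', '[MASK]', 'sat'], 'cat')
```

The labels cost nothing to produce, which is why these objectives scale to web-sized corpora.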
Language Model
A model that assigns probabilities to sequences of tokens; often trained by next-token prediction.
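A minimal worked instance of this definition is a bigram model: it factors the sequence probability into next-token probabilities estimated from counts. The corpus is an assumption for the sketch; modern language models replace the count table with a neural network but keep the same next-token objective.

```python
from collections import Counter, defaultdict

# Toy training corpus (illustrative assumption).
CORPUS = "the cat sat on the mat the cat slept".split()

# Count how often each token follows each previous token.
counts = defaultdict(Counter)
for prev, nxt in zip(CORPUS, CORPUS[1:]):
    counts[prev][nxt] += 1

def next_token_probs(prev):
    """Distribution over the next token given the previous one."""
    total = sum(counts[prev].values())
    return {w: c / total for w, c in counts[prev].items()}

def sequence_prob(tokens):
    """Probability of a sequence as a product of bigram probabilities."""
    p = 1.0
    for prev, nxt in zip(tokens, tokens[1:]):
        p *= next_token_probs(prev).get(nxt, 0.0)
    return p

print(next_token_probs("the"))           # {'cat': 0.666..., 'mat': 0.333...}
print(sequence_prob("the cat sat".split()))  # 2/3 * 1/2 = 1/3
```

Unseen bigrams get probability zero here; real systems smooth or, in the neural case, generalize across contexts instead.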