Tokenization
Intermediate
Full Definition
Tokenization is the process of converting raw text into discrete units (tokens) that a model can process. Subword tokenizers balance vocabulary size against coverage: frequent words stay as single tokens, while rare or unseen words are split into smaller, known pieces, which keeps the vocabulary compact without producing out-of-vocabulary failures.
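To make the vocabulary/coverage tradeoff concrete, below is a minimal sketch of how a byte-pair-encoding (BPE) style subword tokenizer is trained: start from characters, then repeatedly merge the most frequent adjacent pair of symbols. Each merge adds one symbol to the vocabulary, so the number of merges directly controls vocabulary size. The corpus and all function names here are illustrative assumptions, not taken from any particular library.

```python
from collections import Counter

def get_pair_counts(words):
    """Count adjacent symbol pairs across the corpus, weighted by word frequency."""
    pairs = Counter()
    for symbols, freq in words:
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with its concatenation."""
    a, b = pair
    merged = []
    for symbols, freq in words:
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)  # fuse the pair into one subword symbol
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged.append((out, freq))
    return merged

def train_bpe(corpus, num_merges):
    """Learn `num_merges` merge rules from a whitespace-split toy corpus."""
    word_freqs = Counter(corpus.split())
    # Start from individual characters; the vocabulary grows by one symbol per merge.
    words = [(list(w), f) for w, f in word_freqs.items()]
    merges = []
    for _ in range(num_merges):
        pairs = get_pair_counts(words)
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # greedily pick the most frequent pair
        words = merge_pair(words, best)
        merges.append(best)
    return merges

if __name__ == "__main__":
    corpus = "low lower lowest low low newer newest"
    print(train_bpe(corpus, num_merges=5))
    # e.g. [('l', 'o'), ('lo', 'w'), ('low', 'e'), ...]
```

With few merges the tokenizer falls back to near-character-level segmentation (small vocabulary, long token sequences); with many merges whole words become single tokens (large vocabulary, short sequences). Production tokenizers add details this sketch omits, such as byte-level fallback and word-boundary markers.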