tokenization

In text mining or full-text search, the process of identifying meaningful units within strings — at word boundaries, morphemes, or stems — so that related tokens can be grouped. For example, although "San Francisco" is two words, it can be treated as a single token.
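As a minimal sketch of the idea, the function below splits a string at word boundaries and then merges known multi-word expressions into single tokens. The `MULTIWORD` set and the function name are illustrative assumptions, not part of any specific library:

```python
import re

# Illustrative set of multi-word expressions to treat as single tokens.
MULTIWORD = {("san", "francisco")}

def tokenize(text):
    """Split text at word boundaries, then merge known multi-word units."""
    words = re.findall(r"\w+", text.lower())
    tokens = []
    i = 0
    while i < len(words):
        # Greedily merge a two-word expression if it is a known unit.
        if i + 1 < len(words) and (words[i], words[i + 1]) in MULTIWORD:
            tokens.append(words[i] + " " + words[i + 1])
            i += 2
        else:
            tokens.append(words[i])
            i += 1
    return tokens
```

For example, `tokenize("I live in San Francisco.")` yields `['i', 'live', 'in', 'san francisco']`, grouping the two-word place name as one token. Real tokenizers may additionally apply stemming or morphological analysis to group related word forms.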


Creator: Maxiao
© 2025 CSOFT International, Ltd.