Tokenisierung

In text mining or full-text search, the process of identifying meaningful units within strings, at word, morpheme, or stem boundaries, so that related tokens can be grouped. For example, although "San Francisco" is two words, it could be treated as a single token.
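As an illustration, the following minimal Python sketch tokenizes at word boundaries while keeping a known multi-word term such as "San Francisco" as a single token. The term list, regex, and function name are illustrative assumptions, not part of any particular search engine or library.

import re

# Hypothetical list of multi-word terms to keep as single tokens.
MULTIWORD_TERMS = ["San Francisco", "New York"]

def tokenize(text: str) -> list[str]:
    """Split text at word boundaries, keeping known multi-word
    terms (e.g. "San Francisco") as single tokens."""
    # Protect multi-word terms by joining their words with an underscore,
    # which the \w+ pattern treats as part of a single token.
    for term in MULTIWORD_TERMS:
        text = text.replace(term, term.replace(" ", "_"))
    # Split on word boundaries, then restore the protected spaces.
    return [word.replace("_", " ") for word in re.findall(r"\w+", text)]

print(tokenize("I flew to San Francisco for a text mining conference."))
# ['I', 'flew', 'to', 'San Francisco', 'for', 'a', 'text', 'mining', 'conference']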


Creator

  • Hellaweiss