tokenizer

From Wiktionary, the free dictionary

English

Etymology

tokenize +‎ -er

Noun

tokenizer (plural tokenizers)

  1. (computing) A system that parses an input stream into its component tokens.
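
As an illustration of this sense, here is a minimal tokenizer sketch in Python (an assumed example, not part of the entry): it scans an input string with one regular expression per token class and yields the component tokens, discarding whitespace.

    import re

    # Token classes for the sketch; names and patterns are illustrative only.
    TOKEN_SPEC = [
        ("NUMBER", r"\d+"),           # integer literals
        ("NAME",   r"[A-Za-z_]\w*"),  # identifiers / words
        ("OP",     r"[+\-*/=]"),      # single-character operators
        ("SKIP",   r"\s+"),           # whitespace, discarded below
    ]
    TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

    def tokenize(stream: str):
        """Yield (kind, text) pairs for each token found in the input stream."""
        for match in TOKEN_RE.finditer(stream):
            kind = match.lastgroup
            if kind != "SKIP":
                yield kind, match.group()

    print(list(tokenize("x = 41 + 1")))
    # [('NAME', 'x'), ('OP', '='), ('NUMBER', '41'), ('OP', '+'), ('NUMBER', '1')]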