Keyword Tokenizer - gnuhub/elasticsearch GitHub Wiki

A tokenizer of type keyword that emits the entire input as a single token.

The following are settings that can be set for a keyword tokenizer type:

  • buffer_size: The term buffer size. Defaults to 256.
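As a minimal sketch, the behavior described above can be modeled in Python. The function name and the buffer-growth detail are illustrative assumptions, not the actual Java implementation:

```python
def keyword_tokenize(text, buffer_size=256):
    # A keyword tokenizer performs no splitting: the entire input
    # is emitted as one token, regardless of whitespace or punctuation.
    # buffer_size models the term buffer setting; here it is assumed
    # (illustratively) that the buffer grows as needed rather than
    # truncating input longer than buffer_size.
    return [text]

# Multi-word input stays a single token:
tokens = keyword_tokenize("New York")
# tokens == ["New York"]
```

This is in contrast to, say, a whitespace tokenizer, which would split the same input into two tokens.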