Search API Japanese Tokenizer is a Drupal module that segments and indexes Japanese text at the word level. By default, Drupal's core search and the Search API module rely on N-gram segmentation, which can be imprecise for Japanese. This module improves search accuracy through word-level natural-language tokenization, without requiring an external search engine such as Apache Solr or Elasticsearch.
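To illustrate why N-gram segmentation can be imprecise for Japanese (the module itself is implemented in PHP; this is just a language-agnostic sketch of the underlying problem), consider character bigrams of 東京都庁 ("Tokyo Metropolitan Government"): one of the generated bigrams is 京都 ("Kyoto"), so a search for Kyoto would falsely match this document. Word-level tokenization avoids such spurious matches.

```python
def char_bigrams(text: str) -> list[str]:
    """Split text into overlapping 2-character N-grams,
    as a naive N-gram indexer would."""
    return [text[i:i + 2] for i in range(len(text) - 1)]

# 東京都庁 ("Tokyo Metropolitan Government") yields the bigram
# 京都 ("Kyoto"), a false match for a Kyoto query.
print(char_bigrams("東京都庁"))  # ['東京', '京都', '都庁']
```

A word-level tokenizer would instead emit units like 東京 and 都庁, so the standalone term 京都 never enters the index.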
Search API Japanese Normalizer is a companion module that provides a processor for the Drupal Search API module. The processor standardizes variations in Japanese text, such as full-width versus half-width character forms, improving search accuracy.
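The module's exact normalization rules are not specified here, but the general idea can be sketched with Unicode NFKC normalization, a standard transform that folds full-width Latin characters and half-width katakana into their canonical forms so that visually different spellings index and match identically:

```python
import unicodedata

def normalize_ja(text: str) -> str:
    """Fold common Japanese character-width variants using NFKC.
    (Illustrative only; the module's own processor may apply
    different or additional rules.)"""
    return unicodedata.normalize("NFKC", text)

# Full-width Latin letters become ASCII.
print(normalize_ja("Ｄｒｕｐａｌ"))  # Drupal
# Half-width katakana becomes full-width katakana.
print(normalize_ja("ｶﾀｶﾅ"))  # カタカナ
```

Without such normalization, a query typed as "Drupal" would not match an indexed full-width "Ｄｒｕｐａｌ", even though a reader sees them as the same word.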