Normalizer [" + normalizerName + "] not found for field [" + name + "] – How to solve this Elasticsearch error


Opster Team

February 2021, Versions: 1.7-8.0

To understand why you need to define a normalizer before using it, run the Elasticsearch Error Check-Up. It will help you resolve this issue and prevent it in the future.

This guide will help you check for common problems that cause the log "normalizer [" + normalizerName + "] not found for field [" + name + "]" to appear. It's important to understand the issues related to this log, so to get started, read the general overview of common issues and tips related to the Elasticsearch concept: index.

Background

Elasticsearch ships with only one built-in normalizer, lowercase, so any other normalization requires building a custom normalizer. A custom normalizer must be defined in the analysis section of the settings when creating an index. The normalizer is applied to the field value before indexing.
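For simple lowercasing, recent Elasticsearch versions let you reference the built-in lowercase normalizer directly in a field mapping, with no analysis settings at all. As a minimal sketch (the index and field names here are illustrative):

PUT /lowercase-demo
{
  "mappings": {
    "properties": {
      "product_code": {
        "type": "keyword",
        "normalizer": "lowercase"
      }
    }
  }
}

Anything beyond lowercasing (for example, ASCII folding) requires the custom-normalizer approach described below.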

You must use the same normalizer name that you defined in the settings when assigning it to a field in the mappings section. The error above arises when the normalizer referenced in a field's mapping is not defined in the analysis settings section.

How to reproduce this exception

Create an index whose mapping references a normalizer name (lower_normalizer) that differs from the one defined in the settings (my_normalizer):

PUT /my-index
{
 "settings": {
   "analysis": {
     "normalizer": {
       "my_normalizer": {
         "type": "custom",
         "char_filter": [],
         "filter": ["lowercase", "asciifolding"]
       }
     }
   }
 },
 "mappings": {
   "properties": {
     "opster": {
       "type": "keyword",
       "normalizer": "lower_normalizer"
     }
   }
 }
}

The response generated will be:

{
 "error": {
   "root_cause": [
     {
       "type": "mapper_parsing_exception",
       "reason": "normalizer [lower_normalizer] not found for field [opster]"
     }
   ],
   "type": "mapper_parsing_exception",
   "reason": "Failed to parse mapping [_doc]: normalizer [lower_normalizer] not found for field [opster]",
   "caused_by": {
     "type": "mapper_parsing_exception",
     "reason": "normalizer [lower_normalizer] not found for field [opster]"
   }
 },
 "status": 400
}

How to fix this exception

The exception clearly states that the normalizer specified for the field opster, lower_normalizer, does not exist. You need to define the normalizer in the index settings before using it on a field.

To fix the error, make sure the field mapping references the normalizer that is actually defined in the settings. Recreate the index like this:

PUT /my-index
{
 "settings": {
   "analysis": {
     "normalizer": {
       "my_normalizer": {
         "type": "custom",
         "char_filter": [],
         "filter": ["lowercase", "asciifolding"]
       }
     }
   }
 },
 "mappings": {
   "properties": {
     "foo": {
       "type": "keyword",
       "normalizer": "my_normalizer"
     }
   }
 }
}

The response generated will be:

{
 "acknowledged": true,
 "shards_acknowledged": true,
 "index": "my-index"
}
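Once the index has been created, you can verify the normalizer's behavior with the _analyze API by passing the normalizer name and some text to normalize (the sample text here is illustrative):

GET /my-index/_analyze
{
  "normalizer": "my_normalizer",
  "text": "Opster-Café"
}

With the lowercase and asciifolding filters applied, this should return a single token such as opster-cafe, confirming that the normalizer is defined and working as expected.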

Log Context

The log "normalizer [" + normalizerName + "] not found for field [" + name + "]" is generated in the class KeywordFieldMapper.java. We extracted the following from the Elasticsearch source code for those seeking in-depth context:

NamedAnalyzer normalizer = Lucene.KEYWORD_ANALYZER;
NamedAnalyzer searchAnalyzer = Lucene.KEYWORD_ANALYZER;
if (normalizerName == null || "default".equals(normalizerName) == false) {
    normalizer = indexAnalyzers.getNormalizer(normalizerName);
    if (normalizer == null) {
        throw new MapperParsingException("normalizer [" + normalizerName + "] not found for field [" + name + "]");
    }
    if (splitQueriesOnWhitespace) {
        searchAnalyzer = indexAnalyzers.getWhitespaceNormalizer(normalizerName);
    } else {
        searchAnalyzer = normalizer;
    }
}