Max number of inference processors reached; total inference processors [{}] – How to solve this Elasticsearch error

Opster Team

July-20, Version: 1.7-8.0

Before you begin reading this guide, we recommend you try running the Elasticsearch Check-Up which can resolve issues that cause many errors.

This guide will help you check for common problems that cause the log "Max number of inference processors reached; total inference processors [{}]" to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concept: plugin.

Advanced users might want to skip right to the common problems section in each concept or try running the Check-Up to analyze Elasticsearch configuration and help resolve this error.
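For context, this error is thrown when creating or updating an ingest pipeline that contains an inference processor would push the cluster past its configured limit on the total number of inference processors across all pipelines. A minimal sketch of such a pipeline definition is shown below (the pipeline name, model ID, and target field are hypothetical placeholders, not values from the error itself):

```json
PUT _ingest/pipeline/my-inference-pipeline
{
  "processors": [
    {
      "inference": {
        "model_id": "my-trained-model",
        "target_field": "ml.inference"
      }
    }
  ]
}
```

If the cluster already holds the maximum allowed number of inference processors, a request like this is rejected with the "Max number of inference processors reached" error and HTTP status 409 (CONFLICT).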

Log Context

The log "Max number of inference processors reached; total inference processors [{}]." is thrown from the class InferenceProcessor.java. We extracted the following from the Elasticsearch source code for those seeking in-depth context:

@Override
 public InferenceProcessor create(Map<String, Processor.Factory> processorFactories, String tag, String description,
                                  Map<String, Object> config) {
     if (this.maxIngestProcessors <= currentInferenceProcessors) {
         throw new ElasticsearchStatusException("Max number of inference processors reached; total inference processors [{}]. " +
             "Adjust the setting [{}]: [{}] if a greater number is desired.",
             RestStatus.CONFLICT,
             currentInferenceProcessors,
             MAX_INFERENCE_PROCESSORS.getKey(),
             maxIngestProcessors);
     }
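As the message itself suggests, the fix is to raise the limit reported by MAX_INFERENCE_PROCESSORS.getKey(). Assuming the setting key in your version is xpack.ml.max_inference_processors (the default limit is 50) and that it is a dynamic cluster setting, it can be raised via the cluster settings API, for example:

```json
PUT _cluster/settings
{
  "persistent": {
    "xpack.ml.max_inference_processors": 100
  }
}
```

Check the exact setting name reported in your own error message before applying this, and consider whether consolidating pipelines is preferable to raising the limit, since each inference processor consumes cluster resources.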

 
