MaxContentLength set to high value resetting it to 100mb – How to solve this Elasticsearch error

Opster Team

Aug-23, Version: 6.8

Briefly, this error occurs when the ‘http.max_content_length’ setting in Elasticsearch is set to a value larger than Integer.MAX_VALUE bytes (roughly 2gb). Elasticsearch treats such a value as out of bounds and resets it to the default of 100mb, a limit that exists to prevent large data transfers from overloading the system. To resolve this issue, set ‘http.max_content_length’ in the Elasticsearch configuration file to a valid value below that bound, or reduce the size of the requests you are sending. Raising the limit above the 100mb default should be done with caution, as it can lead to memory issues.
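
If you do need a limit above the 100mb default, a minimal elasticsearch.yml sketch might look like the following. The 512mb value here is purely illustrative, not a recommendation; any value must stay below Integer.MAX_VALUE bytes (~2gb) or it will be reset as described above:

        # elasticsearch.yml -- http.max_content_length is a static setting,
        # so a node restart is required for a change to take effect.
        # Must be < Integer.MAX_VALUE bytes (~2gb); larger values are reset to 100mb.
        http.max_content_length: 512mb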

This guide will help you check for common problems that cause the log “maxContentLength[{}] set to high value; resetting it to [100mb]” to appear. To understand the issues related to this log, read the explanation below.

Log Context

The log “maxContentLength[{}] set to high value; resetting it to [100mb]” is emitted from the class Netty4HttpServerTransport.java.
We extracted the following from the Elasticsearch source code for those seeking in-depth context:

        this.pipeliningMaxEvents = SETTING_PIPELINING_MAX_EVENTS.get(settings);
        this.corsConfig = buildCorsConfig(settings);

        // validate max content length: values that do not fit in a signed
        // 32-bit integer are out of bounds and fall back to the 100mb default
        if (maxContentLength.getBytes() > Integer.MAX_VALUE) {
            logger.warn("maxContentLength[{}] set to high value; resetting it to [100mb]", maxContentLength);
            deprecationLogger.deprecated(
                    "out of bounds max content length value [{}] will no longer be truncated to [100mb]; you must enter a valid setting",
                    maxContentLength.getStringRep());
            maxContentLength = new ByteSizeValue(100, ByteSizeUnit.MB);
        }
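
To confirm which value a node is actually running with, you can query the nodes info API, for example (this assumes a node reachable at localhost:9200; the nodes info API returns explicitly configured settings, so the entry appears only if it was set in the node's configuration):

        curl -s "localhost:9200/_nodes/settings?filter_path=nodes.*.settings.http.max_content_length&pretty"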

 
