Reducing requested filter cache size of [{}] to the maximum allowed size of [{}] – How to solve this Elasticsearch error

Opster Team

Aug-23, Version: 1.3-1.3


Briefly, this error occurs when the requested filter cache size exceeds the maximum size that Elasticsearch allows. The filter cache stores the results of filter queries to improve search performance, but an oversized cache can cause memory pressure. To resolve this, either reduce the requested filter cache size or increase the maximum allowed size. Be cautious when increasing the maximum, as it can lead to out-of-memory errors if not managed properly, and consider optimizing your queries to use less cache.
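To reduce the requested size, you can lower the filter cache setting in `elasticsearch.yml` (in Elasticsearch 1.x the setting is `indices.cache.filter.size`, which accepts a heap percentage or an absolute byte value). The value below is only an illustrative sketch; tune it for your heap size:

```yaml
# elasticsearch.yml — hedged sketch for Elasticsearch 1.x:
# cap the filter cache well below the maximum allowed size.
# 10% of heap is the 1.x default; the exact value you need
# depends on your workload.
indices.cache.filter.size: 10%
```

A percentage value scales with the node's heap, so it is usually safer than a fixed byte value when nodes in the cluster have different heap sizes.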


This guide will help you check for common problems that cause the log “reducing requested filter cache size of [{}] to the maximum allowed size of [{}]” to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concepts: cache, filter and indices.

Log Context

The log “reducing requested filter cache size of [{}] to the maximum allowed size of [{}]” is emitted from the Elasticsearch filter cache code. We extracted the following from the Elasticsearch source code for those seeking in-depth context:


    private void computeSizeInBytes() {
        long sizeInBytes = MemorySizeValue.parseBytesSizeValueOrHeapRatio(size).bytes();
        if (sizeInBytes > ByteSizeValue.MAX_GUAVA_CACHE_SIZE.bytes()) {
            logger.warn("reducing requested filter cache size of [{}] to the maximum allowed size of [{}]",
                    new ByteSizeValue(sizeInBytes), ByteSizeValue.MAX_GUAVA_CACHE_SIZE);
            sizeInBytes = ByteSizeValue.MAX_GUAVA_CACHE_SIZE.bytes();
            // Even though it feels wrong for size and sizeInBytes to get out of
            // sync we don't update size here because it might cause the cache
            // to be rebuilt every time new settings are applied.
        }
        this.sizeInBytes = sizeInBytes;
    }
