How To Solve Issues Related to Log – Heap size ; compressed ordinary object pointers

Updated: Jan-20

Elasticsearch Version: 1.7-8.0


Before you begin reading this guide, try our beta Elasticsearch Health Check-Up. It analyzes JSON output from your cluster to provide personalized recommendations that can improve your cluster's performance.

To troubleshoot the log "Heap size ; compressed ordinary object pointers", it's important to understand a few problems related to the Elasticsearch concept of a node. See below for important tips and explanations on these concepts.
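For context, this log line reports the JVM's maximum heap size and whether compressed ordinary object pointers (compressed oops) are in use. The heap is typically set in Elasticsearch's jvm.options file; keeping -Xmx below roughly 32GB generally allows the JVM to keep compressed oops enabled (the exact cutoff varies by JVM and platform). The values below are illustrative only, not a recommendation for your cluster:

```
# jvm.options (illustrative values)
# Set Xms and Xmx to the same value; staying well under ~32GB
# keeps compressed ordinary object pointers enabled on most JVMs.
-Xms16g
-Xmx16g
```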

Log Context

The log "heap size [{}]; compressed ordinary object pointers [{}]" is emitted from the Elasticsearch source code. We extracted the following from the source for those seeking in-depth context:

    private void maybeLogHeapDetails() {
        JvmInfo jvmInfo = JvmInfo.jvmInfo();
        ByteSizeValue maxHeapSize = jvmInfo.getMem().getHeapMax();
        String useCompressedOops = jvmInfo.useCompressedOops();
        logger.info("heap size [{}], compressed ordinary object pointers [{}]", maxHeapSize, useCompressedOops);
    }
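As a standalone illustration (this is not Elasticsearch code, just a sketch), a plain Java program can read the same two values that the log line reports: the maximum heap size from the runtime, and the UseCompressedOops flag from the HotSpot diagnostic MXBean. The class name `HeapDetails` is hypothetical, and the MXBean lookup assumes a HotSpot-based JVM:

```java
import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class HeapDetails {
    public static void main(String[] args) {
        // Max heap size as seen by the JVM (what -Xmx resolves to)
        long maxHeapBytes = Runtime.getRuntime().maxMemory();
        System.out.println("heap size [" + (maxHeapBytes / (1024 * 1024)) + "mb]");

        // Read the UseCompressedOops VM flag via the HotSpot diagnostic MXBean;
        // on non-HotSpot JVMs this lookup may fail with an exception.
        HotSpotDiagnosticMXBean hotspot =
            ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        String useCompressedOops = hotspot.getVMOption("UseCompressedOops").getValue();
        System.out.println("compressed ordinary object pointers [" + useCompressedOops + "]");
    }
}
```

Running this on the same JVM settings as your Elasticsearch node shows whether a given -Xmx value still leaves compressed oops enabled.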

Related issues to this log

We have gathered selected Q&A from the community and issues from GitHub that can help fix related problems. Please review the following for further information:

1 How to determine if the Java heap is using compressed pointers and whether or not it resides at address 0 in memory?


2 It Seemed To Be Stuck When Initiali

About Opster

Opster detects the root causes of Elasticsearch problems, provides automated recommendations, and can perform various actions to prevent issues and optimize performance.

Find Configuration Errors

Analyze Now