ModelId model loaded but failed to start accepting routes – How to solve this Elasticsearch error

Opster Team

Aug-23, Version: 8.3-8.7

Before you dig into reading this guide, have you tried asking OpsGPT what this log means? You’ll receive a customized analysis of your log.

Try OpsGPT now for step-by-step guidance and tailored insights into your Elasticsearch operation.

Briefly, this error occurs when Elasticsearch successfully loads a machine learning model (identified by its model ID) but then encounters an issue that prevents the deployment from starting to accept inference routes. This can be caused by a configuration problem, a network issue, or a problem with the model itself. To resolve it, try the following: 1) Check the model's deployment configuration and ensure it is correct. 2) Verify the network settings and connectivity between nodes. 3) Inspect the model for errors or issues. 4) Restart Elasticsearch and check whether the problem persists.
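As a minimal sketch of step 3, you can inspect the deployment state of the model with the trained models stats API (the model ID `my-model` below is a placeholder for your own model ID, and the cluster is assumed to be reachable):

```
# Check the deployment status of the trained model
GET _ml/trained_models/my-model/_stats

# In the response, look at deployment_stats.nodes[].routing_state:
# a node that loaded the model but failed to accept routes will
# typically report a routing_state of "failed" together with a reason.
```

The `reason` field on the failed node usually points at the underlying cause, such as a resource or configuration problem on that node.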

For a complete solution to your search operation, try AutoOps for Elasticsearch & OpenSearch for free. With AutoOps and Opster’s proactive support, you don’t have to worry about your search operation – we take charge of it. Get improved performance & stability with less hardware.

This guide will help you check for common problems that cause the log "[" + modelId + "] model loaded but failed to start accepting routes" to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concept: plugin.

Log Context

Log "[" + modelId + "] model loaded but failed to start accepting routes" classname is
We extracted the following from the Elasticsearch source code for those seeking in-depth context:

                } else {
                    // this is an unexpected error
                    logger.warn(() -> "[" + modelId + "] model loaded but failed to start accepting routes", e);
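When this warning appears, restarting the model deployment is often enough to recover. As a hedged sketch (the model ID `my-model` is a placeholder), the deployment can be stopped and started again with the trained model deployment APIs:

```
# Stop the failed deployment (replace my-model with your model ID)
POST _ml/trained_models/my-model/deployment/_stop

# Start it again once the underlying issue has been addressed
POST _ml/trained_models/my-model/deployment/_start
```

If the deployment fails to start again, check the Elasticsearch logs on the affected ML node for the accompanying exception, which this log line records alongside the warning.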

