No model could be found to perform inference – How to solve this Elasticsearch exception

Opster Team

August 2023, Version: 7.12-8.9

Briefly, this error occurs when Elasticsearch’s machine learning feature tries to perform an inference task but can’t find a suitable model for it. This is usually because the model is not loaded or the model ID is incorrect. To resolve it, ensure that the model you’re trying to use is loaded into Elasticsearch. If it is, check that the model ID is correct and matches the one in your inference task. If the model is not loaded, load it before running the inference task.
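
To confirm that the model exists and that its ID matches the one referenced by your task, you can query the trained models API. The calls below are a minimal sketch: “my-model” is a placeholder model ID, and the deployment call applies only to NLP models on 8.x:

 # Check that the model is known to the cluster and confirm its exact model ID
 GET _ml/trained_models/my-model

 # Inspect the model’s deployment and allocation state
 GET _ml/trained_models/my-model/_stats

 # 8.x NLP models only: load (deploy) the model before running inference
 POST _ml/trained_models/my-model/deployment/_start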

This guide will help you check for common problems that cause the log “No model could be found to perform inference” to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concept: plugin.

Log Context

The log “No model could be found to perform inference” is thrown by the class InferenceStep.java. We extracted the following from the Elasticsearch source code for those seeking in-depth context:

 searchRequest.source(searchSourceBuilder);
 // Run the model search asynchronously under the ML origin
 executeAsyncWithOrigin(client, ML_ORIGIN, SearchAction.INSTANCE, searchRequest, ActionListener.wrap(searchResponse -> {
     SearchHit[] hits = searchResponse.getHits().getHits();
     if (hits.length == 0) {
         // No matching model document: fail with the error seen in the log
         listener.onFailure(new ResourceNotFoundException("No model could be found to perform inference"));
     } else {
         // Use the ID of the first hit as the model to run inference with
         listener.onResponse(hits[0].getId());
     }
 }, listener::onFailure));
 }
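
In other words, the step runs an asynchronous search for the trained model document and fails with ResourceNotFoundException when the search returns no hits. If you hit this error, a quick sanity check is to list the model IDs the cluster actually knows about and compare them against the ID your task references (sketch; the size parameter just raises the page size):

 # List trained models and compare their model_id values with the one your task uses
 GET _ml/trained_models?size=100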

 
