Inference task cancelled with reason %s – How to solve this Elasticsearch exception

Opster Team

August-23, Version: 8.4-8.9

Before you dig into reading this guide, have you tried asking OpsGPT what this log means? You’ll receive a customized analysis of your log.

Try OpsGPT now for step-by-step guidance and tailored insights into your Elasticsearch operation.

Briefly, this error occurs when an inference task in Elasticsearch is cancelled; the specific reason is reported in place of [%s]. Common causes include resource constraints, configuration issues, or network problems. To resolve the issue, check the server logs for more detail about why the task was cancelled. Then, depending on the cause, you may need to adjust your resource allocation, fix your configuration, or troubleshoot your network.
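To see which inference-related tasks are currently running, and whether any of them are cancelled, you can query the Task Management API. Below is a minimal sketch using the Elasticsearch low-level Java REST client; the host, port and the *ml* actions filter are assumptions – adjust them to your cluster.

 import org.apache.http.HttpHost;
 import org.apache.http.util.EntityUtils;
 import org.elasticsearch.client.Request;
 import org.elasticsearch.client.Response;
 import org.elasticsearch.client.RestClient;

 public class ListInferenceTasks {
     public static void main(String[] args) throws Exception {
         // Hypothetical host/port - point this at your own cluster.
         try (RestClient client = RestClient.builder(new HttpHost("localhost", 9200, "http")).build()) {
             // List currently running tasks. The wildcard actions filter narrows the output
             // to ML-related actions (the filter value is an assumption - widen it if nothing matches).
             Request request = new Request("GET", "/_tasks");
             request.addParameter("detailed", "true");
             request.addParameter("actions", "*ml*");

             Response response = client.performRequest(request);
             System.out.println(EntityUtils.toString(response.getEntity()));
         }
     }
 }

The detailed view includes each task's description and status, which can help you match a cancelled task to the reason printed in the log line.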

For a complete solution to your search operation, try AutoOps for Elasticsearch & OpenSearch for free. With AutoOps and Opster’s proactive support, you don’t have to worry about your search operation – we take charge of it. Get improved performance & stability with less hardware.

This guide will help you check for common problems that cause the log “Inference task cancelled with reason [%s]” to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concepts: plugin, task.

Log Context

The log “Inference task cancelled with reason [%s]” is generated by the class TransportInternalInferModelAction.java. We extracted the following snippet from the Elasticsearch source code for those seeking in-depth context:

 // Always fail immediately and return an error
 ex -> true
 );
 // Queue one chained inference task per input document
 request.getObjectsToInfer().forEach(stringObjectMap -> typedChainTaskExecutor.add(chainedTask -> {
     // If the parent task has been cancelled, abort and propagate the cancellation reason
     if (task.isCancelled()) {
         throw new TaskCancelledException(format("Inference task cancelled with reason [%s]", task.getReasonCancelled()));
     }
     model.infer(stringObjectMap, request.getUpdate(), chainedTask);
 }));

 typedChainTaskExecutor.execute(ActionListener.wrap(inferenceResultsInterfaces -> {
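As the snippet shows, the message is thrown when the parent task is already marked as cancelled before a document is sent to the model. A task is typically marked as cancelled when it is cancelled explicitly through the Task Management API; some tasks are also cancelled automatically when the originating client connection closes. Below is a minimal sketch of cancelling a task by ID with the low-level Java REST client – the task ID shown is hypothetical and should be taken from the _tasks output.

 import org.apache.http.HttpHost;
 import org.apache.http.util.EntityUtils;
 import org.elasticsearch.client.Request;
 import org.elasticsearch.client.Response;
 import org.elasticsearch.client.RestClient;

 public class CancelTask {
     public static void main(String[] args) throws Exception {
         // Hypothetical task ID copied from the _tasks listing - replace with a real one.
         String taskId = "node_id:12345";

         try (RestClient client = RestClient.builder(new HttpHost("localhost", 9200, "http")).build()) {
             // Cancelling a running task through the Task Management API is one way
             // the task.isCancelled() branch in the snippet above can be reached.
             Request cancel = new Request("POST", "/_tasks/" + taskId + "/_cancel");
             Response response = client.performRequest(cancel);
             System.out.println(EntityUtils.toString(response.getEntity()));
         }
     }
 }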

 
