Perform text embedding inference on the service
Generally available; added in 8.11.0
Query parameters
- `timeout` (Optional, time value): Specifies the amount of time to wait for the inference request to complete; see the example below.
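For example, a per-request timeout could be passed as shown in this minimal Python sketch. It assumes the Python client used in the request examples below and that the client exposes the `timeout` query parameter as a keyword argument; the endpoint ID and the `45s` value are illustrative, not defaults.

```python
from elasticsearch import Elasticsearch

client = Elasticsearch("http://localhost:9200", api_key="...")

# Sketch: cap how long this inference request may wait before failing.
# "45s" is an arbitrary example value, not a documented default.
resp = client.inference.text_embedding(
    inference_id="my-cohere-endpoint",
    input="The sky above the port was the color of television tuned to a dead channel.",
    timeout="45s",
)
```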
Body (required)
- `input_type` (Optional, string): The input data type for the text embedding model. Possible values include: `SEARCH`, `INGEST`, `CLASSIFICATION`, and `CLUSTERING`. Not all services support all values; unsupported values will trigger a validation exception. Accepted values depend on the configured inference service; refer to the relevant service-specific documentation for more info. The `input_type` parameter specified on the root level of the request body will take precedence over the `input_type` parameter specified in `task_settings` (see the sketch after this list).
- `task_settings` (Optional, object): Task settings for the individual inference request. These settings are specific to the task type you specified and override the task settings specified when initializing the service.
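A minimal sketch of the precedence rule, again using the Python client from the request examples below and assuming it accepts `task_settings` as a keyword argument. Whether a service honors `input_type` inside `task_settings` depends on the service; the Cohere endpoint ID is reused here purely as an illustration.

```python
from elasticsearch import Elasticsearch

client = Elasticsearch("http://localhost:9200", api_key="...")

# Both places specify an input type; the root-level value ("search") wins.
resp = client.inference.text_embedding(
    inference_id="my-cohere-endpoint",
    input="The sky above the port was the color of television tuned to a dead channel.",
    input_type="search",                     # root-level value takes precedence
    task_settings={"input_type": "ingest"},  # overridden by the root-level value
)
```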
POST /_inference/text_embedding/{inference_id}
Console

```
POST _inference/text_embedding/my-cohere-endpoint
{
  "input": "The sky above the port was the color of television tuned to a dead channel.",
  "input_type": "ingest"
}
```

Python

```python
resp = client.inference.text_embedding(
    inference_id="my-cohere-endpoint",
    input="The sky above the port was the color of television tuned to a dead channel.",
    input_type="ingest",
)
```

JavaScript

```js
const response = await client.inference.textEmbedding({
  inference_id: "my-cohere-endpoint",
  input: "The sky above the port was the color of television tuned to a dead channel.",
  input_type: "ingest",
});
```

Ruby

```ruby
response = client.inference.text_embedding(
  inference_id: "my-cohere-endpoint",
  body: {
    "input": "The sky above the port was the color of television tuned to a dead channel.",
    "input_type": "ingest"
  }
)
```

PHP

```php
$resp = $client->inference()->textEmbedding([
    "inference_id" => "my-cohere-endpoint",
    "body" => [
        "input" => "The sky above the port was the color of television tuned to a dead channel.",
        "input_type" => "ingest",
    ],
]);
```

curl

```sh
curl -X POST \
  -H "Authorization: ApiKey $ELASTIC_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"input":"The sky above the port was the color of television tuned to a dead channel.","input_type":"ingest"}' \
  "$ELASTICSEARCH_URL/_inference/text_embedding/my-cohere-endpoint"
```

Java

```java
client.inference().textEmbedding(t -> t
    .inferenceId("my-cohere-endpoint")
    .input("The sky above the port was the color of television tuned to a dead channel.")
    .inputType("ingest")
);
```

Request example
Run `POST _inference/text_embedding/my-cohere-endpoint` to perform text embedding on the example sentence using the Cohere integration.
{ "input": "The sky above the port was the color of television tuned to a dead channel.", "input_type": "ingest" } Response examples (200)
An abbreviated response from `POST _inference/text_embedding/my-cohere-endpoint`.
{ "text_embedding": [ { "embedding": [ { 0.018569946, -0.036895752, 0.01486969, -0.0045204163, -0.04385376, 0.0075950623, 0.04260254, -0.004005432, 0.007865906, 0.030792236, -0.050476074, 0.011795044, -0.011642456, -0.010070801 } ] } ] }