API Reference

Endpoints API

Our Endpoints API is a scalable and efficient way to interact with multiple LLMs. With a single key and endless possibilities, it allows teams to streamline their operations and focus on what matters most.

POST /v1/endpoints

This endpoint allows you to invoke an endpoint that has been pre-configured in the Orquesta Admin platform, including its retries and fallback capabilities.

Body parameters

key string

Key of the endpoint created in the Orquesta admin dashboard.

stream boolean

If set, partial message content will be sent. Tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.

context object optional

Key-value pairs that match your data model and fields declared in your configuration matrix. If you send multiple prompt keys, the context will be applied to the evaluation of each key.

metadata object optional

Key-value pairs that you want to attach to the log generated by this request.

variables object optional

Key-value pairs of variables to replace in your prompts. If a variable that is defined in the prompt is not provided, its default value is used.

Request

POST  /v1/endpoints

curl 'https://api.orquesta.cloud/v1/endpoints' \
  -H 'Authorization: Bearer {apiKey}' \
  -H "Content-Type: application/json" \
  -d $'{"keys":["completion_prompt","chat_prompt"],"context":{"environments":"develop"}, "variables":{"variable_key_one":"variable_value_one"}}'

Response

{
  "trace_id": "ed4219ac-ed42-4caf-ae25-5fd8df35662b",
  "content": "On the IRS website, you can use the \"Where's My Refund?\" tool to check the status of your tax refund. You will need to provide your Social Security number, filing status, and the exact amount of your expected refund. The tool will provide you with the most up-to-date information on the status of your refund.",
  "is_final": true
}

trace_id string

A unique identifier used to track and reference a specific interaction or request within Orquesta. The trace_id is always returned when you make a request to Orquesta.

content string

Response from the LLM configured within your endpoint. When using stream: true, this value contains the current chunk of text streamed from the LLM.

is_final boolean

Always set to true when the stream property is set to false. When stream is set to true, every chunk reports the status of the interaction with the LLM; once is_final is returned as true, the streaming is complete.
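
To illustrate the streaming behavior described above, here is a minimal Python sketch that sets stream to true and reads the data-only server-sent events until the data: [DONE] message. The requests library is an assumption, as is the shape of each streamed chunk, which is taken to mirror the non-streaming response fields documented here.

import json
import requests

API_KEY = "your-api-key"  # placeholder: use your Orquesta API key

with requests.post(
    "https://api.orquesta.cloud/v1/endpoints",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"key": "chat_prompt", "stream": True},
    stream=True,
) as response:
    response.raise_for_status()
    for line in response.iter_lines(decode_unicode=True):
        if not line or not line.startswith("data: "):
            continue  # ignore blank keep-alive lines and non-data fields
        data = line[len("data: "):]
        if data == "[DONE]":
            break  # the stream is terminated by the data: [DONE] message
        chunk = json.loads(data)
        print(chunk["content"], end="", flush=True)
        if chunk.get("is_final"):
            break  # is_final: true indicates the streaming is completed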

Add metrics to your endpoint

POST /v1/metrics

This endpoint allows you to add metrics to the log of each request.

Body parameters

At least one of metadata or score has to be reported. If neither property is present, a 400 error will be thrown.

trace_id string

The unique identifier returned by the Endpoints API for the request whose log you want to attach metrics to.

metadata  object optional

Your own custom key-value pairs can be attached to the logs. This is useful for storing additional information related to your interactions with the LLM providers or specifics within your application.

score number optional

Feedback provided by your end user. Number between 0 and 100.

Request

POST  /v1/metrics

curl 'https://api.orquesta.cloud/v1/metrics' \
  -H 'Authorization: Bearer {apiKey}' \
  -H "Content-Type: application/json" \
  -d $'{"trace_id": "c4674e1d5b834713a86da1286a2cfc5a","score":100,"metadata":{"custom1":"aaa","custom2":123}}'

Response

{
  "ok": true
}
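
Putting the two endpoints together: the sketch below (Python with the requests library, both assumptions for illustration) invokes a configured endpoint, keeps the trace_id from the response, and then reports a score and custom metadata to /v1/metrics.

import requests

API_KEY = "your-api-key"  # placeholder: use your Orquesta API key
HEADERS = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# 1. Invoke the configured endpoint and keep the trace_id from the response.
endpoint_response = requests.post(
    "https://api.orquesta.cloud/v1/endpoints",
    headers=HEADERS,
    json={"keys": ["completion_prompt"], "context": {"environments": "develop"}},
).json()
trace_id = endpoint_response["trace_id"]

# 2. Attach a user score and custom metadata to the log of that request.
metrics_response = requests.post(
    "https://api.orquesta.cloud/v1/metrics",
    headers=HEADERS,
    json={
        "trace_id": trace_id,
        "score": 100,  # number between 0 and 100
        "metadata": {"custom1": "aaa", "custom2": 123},
    },
)
print(metrics_response.json())  # {'ok': True} on success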
