Node.js SDK
In this guide, you will learn how to use the Orquesta Node.js SDK, from installation and creating a client instance to usage and API references.
Source Code
The source code can be found here: https://github.com/orquestadev/orquesta-javascript/tree/main/packages/node
Installation
You can install Orquesta using the Node package manager (npm) or yarn.
npm install @orquesta/node
yarn add @orquesta/node
Creating a client instance
You can get your workspace API key from the settings section in your Orquesta workspace.
https://my.orquesta.dev/<workspace>/settings/developers
Initialize the Orquesta module using your API key.
import { createClient } from '@orquesta/node';
const client = createClient({
api_key: '__API_KEY__',
ttl: 3600,
});
When creating a client instance, the following connection settings can be adjusted:
api_key: string - your workspace API key to use for authentication.
environment?: string - the environment to use for the client. Not required, but recommended so that it is automatically added to the evaluation context.
ttl?: number - the time to live in seconds for the local cache. Defaults to 3600 seconds (1 hour).
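For example, a client configured with an explicit environment (a minimal sketch; the API key and environment name are placeholders):
import { createClient } from '@orquesta/node';

const client = createClient({
  api_key: '__API_KEY__',
  environment: 'production', // automatically added to the evaluation context
  ttl: 3600, // keep local cache entries for 1 hour
});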
Usage - Endpoints
Use the Endpoints API to query or stream your endpoints from Orquesta.
Using endpoints to generate an LLM response for your use case gives you a low-latency, secure connection to the Endpoints API online prediction service, along with out-of-the-box metrics and logging for your LLMs.
The Endpoints API supports both querying and streaming. The endpoint response type is OrquestaEndpointResponse. We recommend using the code snippets provided in the Orquesta Admin panel to reduce the risk of errors and improve ease of use.
Example: Querying an endpoint
const completion = await client.endpoints.query({
key: 'customer_service',
context: { environments: 'production', country: 'NLD' },
variables: { firstname: 'John', city: 'New York' },
metadata: { customer_id: 'Qwtqwty90281' },
});
console.log(completion.content);
Example: Streaming your endpoints
We support streaming responses using Server-Sent Events (SSE). The stream method returns an RxJS observable that you can subscribe to in order to receive the streaming chunks of data from the endpoint. The chunk type is OrquestaEndpointResponse.
let endpointRef: OrquestaEndpoint;
const stream = client.endpoints
.stream({
key: 'customer_service',
context: { environments: 'production', country: 'NLD' },
variables: { firstname: 'John', city: 'New York' },
metadata: { customer_id: 'Qwtqwty90281' },
})
.subscribe({
next: (endpoint) => {
endpointRef = endpoint;
console.log(endpoint.content);
},
complete: () => {
// Executed when the streaming is completed
},
error: (error) => {
console.log(error);
},
});
If you need to cancel a stream, call the stream.unsubscribe() method.
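For example, to stop receiving chunks after a fixed timeout (a minimal sketch; the 5-second cutoff is arbitrary):
// Cancel the subscription if the stream has not completed within 5 seconds
setTimeout(() => {
  if (!stream.closed) {
    stream.unsubscribe();
  }
}, 5000);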
Logging score and metadata for endpoints
After every query, Orquesta will generate a log with the result of the evaluation. You can add metadata and a score to the endpoint log by using the addMetrics method.
endpointRef.addMetrics({
score: 85,
metadata: {
custom: 'custom_metadata',
chain_id: 'ad1231xsdaABw',
},
});
Usage - Prompts
Use the Prompts API to query your prompts from Orquesta.
You can use Orquesta in prompt management mode by consuming our Prompts API. The prompt value type is OrquestaPrompt. We recommend using the code snippets provided in the Orquesta Admin panel to reduce the risk of errors and improve ease of use.
We support a unified data model structure for all our prompts and provide helper functions that map the value returned by Orquesta to the format expected by the specific provider.
The query method receives an object of type OrquestaPromptRequest as a parameter.
Example: Querying a prompt
import { OrquestaPrompt } from '@orquesta/core';
const prompt: OrquestaPrompt = await client.prompts.query({
key: 'prompt_key',
context: { environments: 'production', country: 'NLD' },
variables: { firstname: 'John', city: 'New York' },
metadata: { chain_id: 'ad1231xsdaABw' },
});
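As noted above, the returned prompt can be mapped to a specific provider. Below is a hypothetical sketch that forwards a queried prompt to the OpenAI SDK; the prompt.value.model and prompt.value.messages fields are assumptions for illustration only, so check the OrquestaPrompt type and the available helper functions for the exact shape.
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Hypothetical mapping: assumes the prompt value exposes a model name and chat messages
const completion = await openai.chat.completions.create({
  model: prompt.value.model,
  messages: prompt.value.messages,
});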
Logging responses and metadata for prompts
After every query, Orquesta will generate a log with the result of the evaluation. You can add metadata and information about the interaction with the LLM to the log by using the addMetrics method.
The properties score, latency, llm_response and economics are reserved and used to generate your real-time dashboards. metadata is a set of key-value pairs that you can use to add custom information to the log.
Example: Add metrics to your request log
prompt.addMetrics({
score: 85,
latency: 4000,
llm_response: 'Orquesta is awesome!',
economics: {
prompt_tokens: 1200,
completion_tokens: 750,
total_tokens: 1950,
},
metadata: {
custom: 'custom_metadata',
chain_id: 'ad1231xsdaABw',
total_interactions: 200,
},
});
Usage - Remote Configurations
Orquesta has a powerful Remote Configurations API that allows you to dynamically configure and run all your environments and services remotely. Orquesta supports different types of remote configurations, and we recommend always typing the query method to help TypeScript infer the correct type.
The query method receives an object of type OrquestaRemoteConfigQuery as a parameter.
Supported types: boolean, number, string, json, array.
Example: Querying a configuration of type boolean
const config: OrquestaRemoteConfig<boolean> =
await client.remoteConfigs.query<boolean>({
key: 'boolean_config',
default_value: false,
context: { environments: 'production', role: 'admin' },
metadata: { timestamp: Date.now() },
});
Example: Querying a configuration of type string
const config: OrquestaRemoteConfig<string> =
await client.remoteConfigs.query<string>({
key: 'string_config',
default_value: 'string_value',
context: { environments: 'production', country: 'NL' },
metadata: { timestamp: Date.now() },
});
Example: Querying a configuration of type number
const config: OrquestaRemoteConfig<number> =
await client.remoteConfigs.query<number>({
key: 'number_config',
default_value: 1990,
context: { environments: 'production', market: 'US', domain: 'ecommerce' },
metadata: { timestamp: Date.now() },
});
Example: Querying a configuration of type array
const config: OrquestaRemoteConfig<string[]> = await client.remoteConfigs.query<
string[]
>({
key: 'array_config',
default_value: ['value1', 'value2'],
context: { environments: 'acceptance', isEnable: true },
metadata: { timestamp: Date.now() },
});
Example: Querying a configuration of type json
It's recommended to add an interface for the json configuration to help TypeScript infer the correct type.
const config: OrquestaRemoteConfig<object> =
await client.remoteConfigs.query<object>({
key: 'json_config',
default_value: { dashboardEnabled: false, theme: 'dark' },
context: { environments: 'develop', platform: 'mobile' },
metadata: { timestamp: Date.now() },
});
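Following the recommendation above, here is a sketch that types the same query with a custom interface instead of object (the ThemeConfig interface is illustrative):
interface ThemeConfig {
  dashboardEnabled: boolean;
  theme: string;
}

const typedConfig: OrquestaRemoteConfig<ThemeConfig> =
  await client.remoteConfigs.query<ThemeConfig>({
    key: 'json_config',
    default_value: { dashboardEnabled: false, theme: 'dark' },
    context: { environments: 'develop', platform: 'mobile' },
    metadata: { timestamp: Date.now() },
  });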
Additional metadata logging
After every query, Orquesta will generate a log with data about the request. You can add metadata to the log at any time using the addMetrics method. metadata is a set of key-value pairs that you can use to add custom information to the log.
Example: Add metrics to your request log
config.addMetrics({
metadata: {
custom: 'custom_metadata',
user_clicks: 20,
selected_option: 'option1',
},
});
Orquesta API
Endpoints API
Methods:
client.endpoints.query({ ...params }) -> Promise<OrquestaEndpoint>
client.endpoints.stream({ ...params }) -> Observable<OrquestaEndpoint>
Prompts API
Methods:
client.prompts.query({ ...params }) -> Promise<OrquestaPrompt>
RemoteConfigs API
Methods:
client.remoteConfigs.query({ ...params }) -> Promise<OrquestaRemoteConfig<T>>