# @opendrift/sdk

v0.1.0

Verbalized Sampling for any LLM. Generate multiple ranked responses with probability scores in three lines of code.
## Installation

```sh
npm install @opendrift/sdk
```

Or with your preferred package manager:

```sh
bun add @opendrift/sdk
pnpm add @opendrift/sdk
yarn add @opendrift/sdk
```

## Quick Start
The fastest way to get started: no LLM API keys to manage, access to hundreds of models through a single SDK, and pay-as-you-go billing with OpenDrift credits.
```typescript
import { drift, createOpenDriftClient } from '@opendrift/sdk'

// 1. Create a client with your OpenDrift API key
const client = createOpenDriftClient({
  apiKey: process.env.OPENDRIFT_API_KEY,
})

// 2. Run Verbalized Sampling
const result = await drift({
  prompt: 'What is the meaning of life?',
  model: 'openai/gpt-4o',
  k: 5, // generate 5 responses
  tau: 0.3, // probability threshold
  send: client.send,
})

// 3. Use the ranked responses
for (const r of result.responses) {
  console.log(`[${(r.probability * 100).toFixed(1)}%] ${r.text}`)
}
```

You can also bring your own LLM provider (OpenAI, Anthropic, OpenRouter, etc.); see Bring Your Own Provider below.
## API Keys

To use the OpenDrift API you need an API key. Keys are created in the API Keys dashboard.

1. **Create a key.** Go to API Keys and click "Create Key". Give it a name (e.g. "My App").
2. **Copy it immediately.** The full key (starting with `odk_`) is shown only once. Copy it and store it securely.
3. **Set it as an environment variable:**

```sh
export OPENDRIFT_API_KEY=odk_...
```

**Security:** API keys are hashed before storage. Only the prefix (`odk_a1b2...`) is visible in the dashboard. If a key is compromised, revoke it immediately from the API Keys page.
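A common failure mode is constructing the client with an undefined key (for example, when the variable was exported in a different shell). A small guard that fails fast with a clear message; `requireApiKey` is a hypothetical helper for illustration, not part of the SDK:

```typescript
// Hypothetical helper (not part of @opendrift/sdk): fail fast with a
// clear message instead of sending an empty Bearer token upstream.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env.OPENDRIFT_API_KEY
  if (!key || !key.startsWith('odk_')) {
    throw new Error('OPENDRIFT_API_KEY is missing or malformed (expected an odk_... key)')
  }
  return key
}

// Usage:
// const client = createOpenDriftClient({ apiKey: requireApiKey(process.env) })
```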
## API Reference

### POST /api/sdk/complete

The endpoint called by createOpenDriftClient. Authenticates via Bearer token. You can also call it directly.

#### Headers

```
Authorization: Bearer odk_...
Content-Type: application/json
```

#### Request Body
```jsonc
{
  "messages": [
    { "role": "system", "content": "Optional system context" },
    { "role": "user", "content": "Your prompt here" }
  ],
  "model": "openai/gpt-4o", // optional, default: "openai/gpt-4o"
  "k": 5,                   // optional, default: 5
  "tau": 0.3                // optional, default: 0.3
}
```

#### Response
```jsonc
{
  "content": "raw model output string",
  "responses": [
    { "text": "...", "probability": 0.35, "model": "openai/gpt-4o" },
    { "text": "...", "probability": 0.25, "model": "openai/gpt-4o" }
    // ...
  ]
}
```

#### cURL Example
```sh
curl -X POST https://opendrift.xyz/api/sdk/complete \
  -H "Authorization: Bearer odk_..." \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{ "role": "user", "content": "What is the meaning of life?" }],
    "model": "openai/gpt-4o",
    "k": 5,
    "tau": 0.3
  }'
```

#### Error Codes
| Status | Meaning |
|---|---|
| 401 | Missing, invalid, or revoked API key |
| 400 | Invalid request body |
| 500 | Upstream model error |
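When calling the endpoint directly, it can help to map these status codes to actionable messages before surfacing or retrying the error. A sketch assuming a plain `fetch` call; the `describeSdkError` and `callComplete` helpers are illustrative, not part of the SDK:

```typescript
// Illustrative helper (not part of @opendrift/sdk): map the
// documented status codes to actionable messages.
function describeSdkError(status: number): string {
  switch (status) {
    case 401: return 'Missing, invalid, or revoked API key - check OPENDRIFT_API_KEY'
    case 400: return 'Invalid request body - check messages/model/k/tau'
    case 500: return 'Upstream model error - usually safe to retry'
    default:  return `Unexpected status ${status}`
  }
}

// Direct-call sketch; defined but not invoked, so the snippet
// performs no network I/O on its own.
async function callComplete(apiKey: string, prompt: string) {
  const res = await fetch('https://opendrift.xyz/api/sdk/complete', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ messages: [{ role: 'user', content: prompt }] }),
  })
  if (!res.ok) throw new Error(describeSdkError(res.status))
  return res.json()
}
```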
## Core Functions

The SDK exposes low-level building blocks if you need full control. These are pure functions with zero dependencies.

### buildSystemPrompt(k, tau)

Returns the Verbalized Sampling system prompt string. Inject this as the system message in your LLM call.

```typescript
import { buildSystemPrompt } from '@opendrift/sdk'

const systemPrompt = buildSystemPrompt(5, 0.3)
// Returns the full system prompt instructing the LLM to produce
// 5 responses in XML format with probability scores
```

### parseDriftOutput(raw, model)
Parses raw LLM output containing XML-formatted responses into structured results sorted by probability. Falls back gracefully if the model doesn't follow the expected format.

```typescript
import { parseDriftOutput } from '@opendrift/sdk'

const responses = parseDriftOutput(rawOutput, 'gpt-4o')
// Returns DriftResponse[] sorted by probability (descending)
// Each: { text: string, probability: number, model: string }
// If the model doesn't follow XML format, returns:
// [{ text: rawOutput, probability: 1.0, model: 'gpt-4o' }]
```

### drift(params)
Orchestrates the full Verbalized Sampling flow: builds the system prompt, calls your send function, and parses the response.

```typescript
import { drift } from '@opendrift/sdk'

const result = await drift({
  prompt: 'Your question here',
  model: 'openai/gpt-4o',
  k: 5,
  tau: 0.3,
  systemPromptOverride: 'Optional extra system context',
  send: yourSendFunction,
})
// result: DriftRunResult
```

## Bring Your Own Provider
The drift() function accepts any send function that takes messages and returns a string. This means it works with any LLM provider.

### OpenAI

```typescript
import { drift } from '@opendrift/sdk'
import OpenAI from 'openai'

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })

const result = await drift({
  prompt: 'Explain quantum computing',
  model: 'gpt-4o',
  k: 5,
  tau: 0.3,
  send: async (messages) => {
    const res = await openai.chat.completions.create({
      model: 'gpt-4o',
      messages,
    })
    // content is nullable in the OpenAI SDK; fall back to an empty string
    return res.choices[0].message.content ?? ''
  },
})
```

### Anthropic
```typescript
import { drift } from '@opendrift/sdk'
import Anthropic from '@anthropic-ai/sdk'

const anthropic = new Anthropic()

const result = await drift({
  prompt: 'Explain quantum computing',
  model: 'claude-sonnet-4-20250514',
  k: 5,
  tau: 0.3,
  send: async (messages) => {
    // Anthropic takes the system prompt as a separate parameter,
    // so split it out of the message list
    const system = messages
      .filter((m) => m.role === 'system')
      .map((m) => m.content)
      .join('\n\n')
    const userMessages = messages.filter((m) => m.role !== 'system')
    const res = await anthropic.messages.create({
      model: 'claude-sonnet-4-20250514',
      max_tokens: 4096,
      system,
      messages: userMessages,
    })
    return res.content[0].type === 'text' ? res.content[0].text : ''
  },
})
```

### OpenRouter
Access hundreds of models through a single API using the OpenAI SDK format.

```typescript
import { drift } from '@opendrift/sdk'
import OpenAI from 'openai'

const openrouter = new OpenAI({
  baseURL: 'https://openrouter.ai/api/v1',
  apiKey: process.env.OPENROUTER_API_KEY,
})

const result = await drift({
  prompt: 'Explain quantum computing',
  model: 'anthropic/claude-sonnet-4',
  k: 5,
  tau: 0.3,
  send: async (messages) => {
    const res = await openrouter.chat.completions.create({
      model: 'anthropic/claude-sonnet-4',
      messages,
    })
    // content is nullable in the OpenAI SDK; fall back to an empty string
    return res.choices[0].message.content ?? ''
  },
})
```

### Any Provider
Write a custom send function for any LLM that supports chat completions:

```typescript
const result = await drift({
  prompt: 'Your question',
  model: 'your-model',
  k: 5,
  tau: 0.3,
  send: async (messages) => {
    const res = await fetch('https://your-llm-api.com/chat', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': 'Bearer ...',
      },
      body: JSON.stringify({ messages }),
    })
    const data = await res.json()
    return data.content // must return a string
  },
})
```

## Types
```typescript
interface Message {
  role: 'system' | 'user' | 'assistant'
  content: string
}

interface DriftResponse {
  text: string        // The response text
  probability: number // 0.0 – 1.0
  model: string       // Model that generated it
}

interface DriftRunResult {
  responses: DriftResponse[] // Sorted by probability (desc)
  rawContent: string         // Raw model output
  model: string
  k: number
  tau: number
}

interface DriftParams {
  prompt: string
  model: string
  k: number   // 1–10
  tau: number // 0.01–1.0
  systemPromptOverride?: string
  send: (messages: Message[]) => Promise<string>
}

interface OpenDriftClientOptions {
  apiKey: string
  baseUrl?: string // defaults to https://opendrift.xyz
}
```
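The `DriftResponse[]` shape composes well with ordinary array utilities. For example, if you drop low-probability responses after the fact, the survivors no longer sum to 1; a hypothetical `renormalize` helper (not part of the SDK) restores that:

```typescript
// Shape from the Types section above, repeated so this sketch is self-contained.
interface DriftResponse {
  text: string
  probability: number
  model: string
}

// Hypothetical helper (not part of @opendrift/sdk): drop responses
// below a cutoff, then rescale probabilities so they sum to 1 again.
function renormalize(responses: DriftResponse[], cutoff: number): DriftResponse[] {
  const kept = responses.filter((r) => r.probability >= cutoff)
  const total = kept.reduce((sum, r) => sum + r.probability, 0)
  if (total === 0) return []
  return kept.map((r) => ({ ...r, probability: r.probability / total }))
}
```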