POST /v1/chat/completions

Creates a chat completion for the given messages and model.

Request body

model
string
required
The model ID to use. See the models page for available models. Example: openai/gpt-oss-120b
messages
array
required
A list of messages comprising the conversation. Each message is an object with:
  • role — one of system, user, or assistant
  • content — the message text
max_tokens
integer
Maximum number of tokens to generate. Defaults to the model’s maximum.
temperature
number
Sampling temperature between 0 and 2. Lower values are more deterministic. Defaults to 1.
top_p
number
Nucleus sampling parameter. Defaults to 1.
stream
boolean
If true, responses are sent as server-sent events. Defaults to false.
stop
string | array
Up to 4 sequences where the model will stop generating.

Example request

curl https://api.inducta.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $INDUCTA_API_KEY" \
  -d '{
    "model": "openai/gpt-oss-120b",
    "messages": [
      {"role": "user", "content": "How many moons are there?"}
    ],
    "max_tokens": 100
  }'
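The same request can be issued from any HTTP client. As a minimal sketch using only Python's standard library (the endpoint URL, headers, and body fields are exactly those shown in the curl example; the build_request helper name is illustrative):

```python
import json
import urllib.request

API_URL = "https://api.inducta.ai/v1/chat/completions"

def build_request(api_key: str) -> urllib.request.Request:
    """Build the POST request from the curl example above."""
    payload = {
        "model": "openai/gpt-oss-120b",
        "messages": [
            {"role": "user", "content": "How many moons are there?"}
        ],
        "max_tokens": 100,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# To actually send it (requires a valid key in INDUCTA_API_KEY):
# import os
# with urllib.request.urlopen(build_request(os.environ["INDUCTA_API_KEY"])) as resp:
#     print(json.load(resp))
```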

Example response

{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1709000000,
  "model": "openai/gpt-oss-120b",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "There are over 200 known moons in our solar system..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 45,
    "total_tokens": 57
  }
}
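In client code, the fields you typically read are choices[0].message.content and usage. A small parsing sketch against the example response above (the extract_reply helper name is illustrative):

```python
import json

# The example response shown above, verbatim.
EXAMPLE_RESPONSE = """
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1709000000,
  "model": "openai/gpt-oss-120b",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "There are over 200 known moons in our solar system..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 45,
    "total_tokens": 57
  }
}
"""

def extract_reply(raw: str) -> tuple[str, int]:
    """Return the assistant's message text and the total token count."""
    data = json.loads(raw)
    text = data["choices"][0]["message"]["content"]
    total_tokens = data["usage"]["total_tokens"]
    return text, total_tokens
```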

Streaming

Set stream: true to receive the response as server-sent events. Each event carries a JSON chunk with an incremental delta of the completion.
curl https://api.inducta.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $INDUCTA_API_KEY" \
  -d '{
    "model": "openai/gpt-oss-120b",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": true
  }'
Each chunk follows the format:
data: {"id":"chatcmpl-abc123","object":"chat.completion.chunk","choices":[{"delta":{"content":"Hello"},"index":0}]}

data: [DONE]
The final chunk includes a usage field with token counts.
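A client reconstructs the reply by reading the stream line by line, stripping the data: prefix, and stopping at the [DONE] sentinel. A minimal parsing sketch (field names as in the chunk format above; the iter_content_deltas helper name is illustrative):

```python
import json

def iter_content_deltas(lines):
    """Yield content deltas from an SSE stream, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank separator lines between events
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]

# Usage: "".join(iter_content_deltas(stream)) rebuilds the full reply.
```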