Voxta docs

OpenAI-compatible

Generic adapter for any service that implements OpenAI's chat-completions API contract.

Use it to connect Voxta to self-hosted servers (vLLM, SGLang, LiteLLM, LocalAI) or third-party providers that aren't in Voxta's first-class catalog (Groq, Together, Fireworks, DeepInfra, etc.).

Setup

Confirm the endpoint speaks OpenAI

Check that the service exposes POST /v1/chat/completions with the standard OpenAI request/response shape.
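A quick way to sanity-check the response shape is to confirm that the reply sits at `choices[0].message.content`, where an OpenAI-compatible endpoint must put it. A minimal sketch (the JSON below is an abbreviated sample response, not output from any specific server):

```python
import json

# Abbreviated sample of the standard chat-completions response shape.
sample = json.loads("""
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "model": "my-model",
  "choices": [
    {"index": 0,
     "message": {"role": "assistant", "content": "Hello!"},
     "finish_reason": "stop"}
  ],
  "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}
}
""")

def first_reply(body: dict) -> str:
    # A compatible endpoint must expose the assistant reply at this path;
    # if it is missing, the adapter cannot read the model's answer.
    return body["choices"][0]["message"]["content"]

print(first_reply(sample))  # → Hello!
```

If a request to the endpoint returns JSON this function can parse, the service is speaking the contract Voxta expects.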

Add to Voxta

Manage Services → + Add Services → OpenAI Compatible → Add.

Fill in:

  • Base URL — the endpoint's base. For example, https://api.together.xyz/v1 or http://localhost:8000/v1.
  • API Key — the key the endpoint requires, if any (leave blank for endpoints that don't need one).
  • Model — the model name the endpoint expects.
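These three settings map directly onto the HTTP request sent to the endpoint: Base URL supplies the path prefix, API Key becomes a Bearer token header (omitted when blank), and Model goes into the request body. A minimal sketch of that mapping, using the hypothetical local endpoint `http://localhost:8000/v1` from the example above:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, api_key: str = "") -> urllib.request.Request:
    """Build a minimal OpenAI-style chat-completions request from the
    three settings (Base URL, API Key, Model)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "Say hello."}],
        "max_tokens": 8,
    }
    headers = {"Content-Type": "application/json"}
    if api_key:  # some self-hosted servers accept requests without a key
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

# To actually send the request (requires the server to be running):
# with urllib.request.urlopen(build_chat_request("http://localhost:8000/v1", "my-model")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Note that the Base URL includes the `/v1` suffix; the adapter appends only `/chat/completions`.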
