Prompt templates
The Scriban-based content templates Voxta uses to build prompt content (separate from prompt formatting).
Prompt templates control the content of what gets sent to the LLM: the system instructions, the character information, and the scenario framing. Compare with prompt formatting, which controls the wire format the LLM expects.
Voxta uses Scriban as its templating language, so you can use variables, conditionals, loops, and function calls.
Why customize a template
The default templates work for most chats out of the box. You'd customize a template to:
- Change the AI's tone or persona globally — make every character lean into a particular style.
- Inject custom context — domain knowledge, role-specific instructions.
- Adjust how scenario state is described — tweak the wording of context blocks, events, summaries.
- Match a model's training distribution — some fine-tunes respond better to specific phrasing.
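As a minimal sketch of the first two cases, a customized template could prepend a global style instruction and gate domain knowledge behind a flag. The flag name and wording here are illustrative, not part of any shipped template; `char.name` and `char.description` mirror the fields used in the loop example below.

```scriban
{{## Illustrative only: "legal_domain" is a hypothetical flag ##}}
{{ char.name }}'s persona: {{ char.description }}
Always answer tersely and avoid filler phrases.
{{~ if has_flag "legal_domain" ~}}
{{ char.name }} is familiar with contract law terminology.
{{~ end ~}}
```

Because the style instruction sits in the template rather than in any one character card, it applies to every character that uses this template.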
Where templates live
Each scenario can override the character's template via the Scenario template field in the Scenario General tab. If not overridden, Voxta uses the first character's template.
Beyond that, the underlying default templates ship with Voxta and aren't typically edited directly — you override them at the scenario or character level instead.
Template syntax
Scriban basics:
Plain text with {{ variables }}.
{{~ if has_flag "some_flag" ~}}
This appears when the flag is set.
{{~ end ~}}
{{~ for character in characters ~}}
- {{ character.name }}: {{ character.description }}
{{~ end ~}}

See the Studio templates reference for the full list of variables (user, char, chars, scenario, has_flag, now, chat_flow, etc.) and helper functions Voxta exposes.
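As a hedged sketch of how these pieces combine, a context block might state the current time, loop over the cast, and enable extra detail behind a flag. The flag name is a hypothetical example; the loop mirrors the one shown above.

```scriban
{{## Illustrative only: "verbose_context" is a hypothetical flag ##}}
It is {{ now }}.
{{~ for character in characters ~}}
- {{ character.name }}: {{ character.description }}
{{~ end ~}}
{{~ if has_flag "verbose_context" ~}}
Describe surroundings in more sensory detail.
{{~ end ~}}
```

The `~` markers strip surrounding whitespace, which keeps the rendered prompt from accumulating blank lines around control blocks.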