settings

Pick a provider. The free preset uses Kismet's default backend (whatever this instance is configured with). Custom lets you point at any OpenAI-compatible endpoint with your own key — choose whether to keep it browser-local or sync it to your account.

Credentials are server-managed — you won't see them. Pick a model and we'll route your calls through it.

Type a model ID or fetch the list from the provider. Save the form first if you've just changed the API base URL or key.
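On an OpenAI-compatible endpoint, the model catalog comes from `GET {api_base}/models` with the key as a bearer token. A minimal sketch of that fetch, assuming the standard list-response shape (the helper names here are illustrative, not Kismet's actual code):

```python
import json
import urllib.request

def models_url(api_base: str) -> str:
    # OpenAI-compatible servers expose the catalog at GET {base}/models,
    # where {base} usually already ends in /v1 (e.g. https://api.openai.com/v1)
    return api_base.rstrip("/") + "/models"

def parse_models(payload: dict) -> list[str]:
    # standard response shape: {"object": "list", "data": [{"id": "gpt-4o"}, ...]}
    return sorted(m["id"] for m in payload.get("data", []))

def fetch_models(api_base: str, api_key: str) -> list[str]:
    req = urllib.request.Request(
        models_url(api_base),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_models(json.load(resp))
```

Saving the form first matters, presumably, because the fetch uses the persisted base URL and key rather than whatever is sitting unsaved in the fields.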

sampling — advanced (leave blank to use instruct defaults)

vLLM / Together / llama.cpp only
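The blank-means-default rule above can be sketched as a merge: empty fields fall back to the instruct defaults, and extended sampling knobs like `min_p` and `top_k` (supported by vLLM, Together, and llama.cpp but absent from the base OpenAI API) are dropped for backends that can't honor them. The default values and function names below are illustrative assumptions, not the engine's real configuration:

```python
# hypothetical instruct defaults used when a field is left blank
DEFAULTS = {"temperature": 0.7, "top_p": 0.9}

# extended sampling knobs outside the base OpenAI API
EXTENDED = {"min_p", "top_k"}
EXTENDED_BACKENDS = {"vllm", "together", "llama.cpp"}

def build_sampling(overrides: dict, backend: str) -> dict:
    params = dict(DEFAULTS)
    for key, value in overrides.items():
        if value is None or value == "":
            continue  # blank field: keep the instruct default
        if key in EXTENDED and backend not in EXTENDED_BACKENDS:
            continue  # drop knobs this backend can't honor
        params[key] = value
    return params
```

For example, setting only `temperature` leaves `top_p` at its default, and a `min_p` override survives for a vLLM backend but is stripped for a plain OpenAI one.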

an llm cyoa engine · self-hosted or cloud · bring your own key · guidelines