Models
configure models
Models are configured per agent using a model URI. Open an agent, then set the Default model field in the agent editor. Providers must be configured first; see Providers.
model URI format
{scheme}://{provider}/{model}?{params}
- `scheme` can be `llm` (text), `vlm` (vision), or `temb` (text embeddings).
- `provider` must match a configured provider name (for example `openai` or `groq`).
- `model` is the provider's model identifier.
- `params` is an optional query string of settings forwarded to the provider.
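The four URI components above can be pulled apart with standard URL parsing. This is a minimal sketch; `parse_model_uri` is a hypothetical helper, not part of the product API.

```python
from urllib.parse import urlsplit, parse_qs

def parse_model_uri(uri: str):
    """Split a model URI of the form {scheme}://{provider}/{model}?{params}.

    Hypothetical illustration of the URI layout described above.
    """
    parts = urlsplit(uri)
    scheme = parts.scheme            # llm, vlm, or temb
    provider = parts.netloc          # must match a configured provider name
    model = parts.path.lstrip("/")   # the provider's model identifier
    # query params arrive as strings; take the first value for each key
    params = {k: v[0] for k, v in parse_qs(parts.query).items()}
    return scheme, provider, model, params

scheme, provider, model, params = parse_model_uri(
    "llm://groq/llama-3.1-70b?temperature=0.2"
)
```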
supported query params
Chat Completions (openai providers)
OpenAI-compatible chat completion providers support these query parameters on the model URI:
- `frequency_penalty` (float, -2.0 to 2.0)
- `presence_penalty` (float, -2.0 to 2.0)
- `max_completion_tokens` (int, > 0)
- `max_tokens` (int, > 0)
- `temperature` (float, 0.0 to 2.0)
- `parallel_tool_calls` (bool, default `true`)
- `service_tier` (`auto`, `default`, `flex`, `scale`, `priority`)
- `verbosity` (`low`, `medium`, `high`)
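Query-string values arrive as text and must be coerced to the types listed above. The sketch below shows one way the ranges and defaults could be enforced; `coerce_chat_params` is a hypothetical helper, not the actual provider layer.

```python
# Hypothetical coercion of chat-completion query params to typed values,
# following the ranges documented above.
def coerce_chat_params(raw: dict) -> dict:
    out = {}
    for key in ("frequency_penalty", "presence_penalty"):
        if key in raw:
            v = float(raw[key])
            if not -2.0 <= v <= 2.0:
                raise ValueError(f"{key} must be in [-2.0, 2.0]")
            out[key] = v
    for key in ("max_completion_tokens", "max_tokens"):
        if key in raw:
            n = int(raw[key])
            if n <= 0:
                raise ValueError(f"{key} must be > 0")
            out[key] = n
    if "temperature" in raw:
        t = float(raw["temperature"])
        if not 0.0 <= t <= 2.0:
            raise ValueError("temperature must be in [0.0, 2.0]")
        out["temperature"] = t
    if "parallel_tool_calls" in raw:
        out["parallel_tool_calls"] = raw["parallel_tool_calls"].lower() == "true"
    if "service_tier" in raw:
        if raw["service_tier"] not in {"auto", "default", "flex", "scale", "priority"}:
            raise ValueError("invalid service_tier")
        out["service_tier"] = raw["service_tier"]
    if "verbosity" in raw:
        if raw["verbosity"] not in {"low", "medium", "high"}:
            raise ValueError("invalid verbosity")
        out["verbosity"] = raw["verbosity"]
    return out
```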
Responses API (open_responses providers)
Responses providers support these query parameters on the model URI:
- `max_output_tokens` (int, > 0)
- `temperature` (float, 0.0 to 2.0)
- `top_p` (float, 0.0 to 1.0)
- `parallel_tool_calls` (bool, default `true`)
- `reasoning_effort` (string, optional)
- `include_reasoning_encrypted` (bool, default `true`)
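Going the other direction, a model URI with Responses-style params can be composed with standard URL encoding. This is an illustrative sketch; `build_model_uri` is a hypothetical helper.

```python
from urllib.parse import urlencode

# Hypothetical helper for composing a {scheme}://{provider}/{model}?{params} URI.
def build_model_uri(scheme: str, provider: str, model: str, **params) -> str:
    query = urlencode(params)
    return f"{scheme}://{provider}/{model}" + (f"?{query}" if query else "")

uri = build_model_uri("llm", "openai", "gpt-5-2025-08-07",
                      reasoning_effort="high", max_output_tokens=4096)
```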
examples
- `llm://openai/gpt-4o`
- `llm://groq/llama-3.1-70b?temperature=0.2`
- `llm://openai/gpt-5-2025-08-07?reasoning_effort=high`