Merged
Changes from all commits
5 changes: 4 additions & 1 deletion server/services/llm/anthropic.mdx
@@ -91,7 +91,9 @@ Before using Anthropic LLM services, you need:
</ParamField>

<ParamField path="system_instruction" type="str" default="None">
-  Optional system instruction to use as the system prompt.
+  Optional system instruction/prompt for the model. Sets the overall behavior
+  and context. If set, this takes precedence over any system message in the
+  context.
</ParamField>

### InputParams
@@ -194,6 +196,7 @@ await task.queue_frame(
- **Extended thinking**: Enabling thinking increases response quality for complex tasks but adds latency. The `budget_tokens` value must be at least 1024 with current models. Extended thinking is disabled by default.
- **Custom clients**: You can pass custom Anthropic client instances (e.g., `AsyncAnthropicBedrock` or `AsyncAnthropicVertex`) via the `client` parameter to use Anthropic models through other cloud providers.
- **Retry behavior**: When `retry_on_timeout=True`, the first attempt uses the `retry_timeout_secs` timeout. If it times out, a second attempt is made with no timeout limit.
+- **System instruction precedence**: If both `system_instruction` (from the constructor) and a system message in the context are set, the constructor's `system_instruction` takes precedence and a warning is logged.

## Event Handlers

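The precedence rule documented in this change can be illustrated with a small standalone sketch. This is not the service's actual implementation; the helper name and the OpenAI-style message shape are hypothetical, chosen only to show the documented behavior (constructor value wins, warning logged):

```python
import logging

logger = logging.getLogger(__name__)

def resolve_system_prompt(system_instruction, context_messages):
    """Illustrative sketch: pick the effective system prompt per the docs."""
    context_system = next(
        (m["content"] for m in context_messages if m.get("role") == "system"),
        None,
    )
    if system_instruction is not None:
        if context_system is not None:
            # Both sources are set: the constructor's value takes precedence,
            # and a warning is logged, as described in the notes above.
            logger.warning(
                "system_instruction overrides the system message in the context"
            )
        return system_instruction
    # No constructor value: fall back to the context's system message, if any.
    return context_system
```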
5 changes: 4 additions & 1 deletion server/services/llm/aws.mdx
@@ -112,7 +112,9 @@ Before using AWS Bedrock LLM services, you need:
</ParamField>

<ParamField path="system_instruction" type="str" default="None">
-  Optional system instruction to use as the system prompt.
+  Optional system instruction/prompt for the model. Sets the overall behavior
+  and context. If set, this takes precedence over any system message in the
+  context.
</ParamField>

### InputParams
@@ -193,6 +195,7 @@ await task.queue_frame(
- **No-op tool handling**: AWS Bedrock requires at least one tool to be defined when tool content exists in the conversation. The service automatically adds a placeholder tool when needed to prevent API errors.
- **Model-specific parameters**: Some models (e.g., Claude Sonnet 4.5) don't allow certain parameter combinations. The service only includes explicitly set parameters in the inference config to avoid conflicts.
- **Retry behavior**: When `retry_on_timeout=True`, the first attempt uses the `retry_timeout_secs` timeout. If it times out, a second attempt is made with no timeout limit.
+- **System instruction precedence**: If both `system_instruction` (from the constructor) and a system message in the context are set, the constructor's `system_instruction` takes precedence and a warning is logged.

## Event Handlers

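The no-op tool handling noted for AWS Bedrock can be sketched roughly as follows. The Converse-style `toolSpec`/`toolUse`/`toolResult` shapes and the helper name are assumptions for illustration, not the service's actual code:

```python
def ensure_tool_config(tools, messages):
    """Illustrative sketch: Bedrock rejects requests containing tool-use or
    tool-result content unless at least one tool is defined, so add a
    placeholder tool when the conversation has tool content but no tools."""
    has_tool_content = any(
        block.get("toolUse") or block.get("toolResult")
        for m in messages
        for block in m.get("content", [])
        if isinstance(block, dict)
    )
    if tools or not has_tool_content:
        # Either real tools exist or none are needed: nothing to do.
        return tools
    # Placeholder tool that satisfies the API but is never invoked.
    placeholder = {
        "toolSpec": {
            "name": "no_op",
            "description": "Placeholder tool; never invoked.",
            "inputSchema": {"json": {"type": "object", "properties": {}}},
        }
    }
    return [placeholder]
```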
5 changes: 4 additions & 1 deletion server/services/llm/openai.mdx
@@ -103,7 +103,9 @@ Before using OpenAI LLM services, you need:
</ParamField>

<ParamField path="system_instruction" type="str" default="None">
-  Optional system instruction to prepend to messages.
+  Optional system instruction/prompt for the model. Sets the overall behavior
+  and context. If set, this takes precedence over any system message in the
+  context.
</ParamField>

### InputParams
@@ -182,6 +184,7 @@ await task.queue_frame(
- **OpenAI-compatible providers**: Many third-party LLM providers offer OpenAI-compatible APIs. You can use `OpenAILLMService` with them by setting `base_url` to the provider's endpoint.
- **Retry behavior**: When `retry_on_timeout=True`, the first attempt uses the `retry_timeout_secs` timeout. If it times out, a second attempt is made with no timeout limit.
- **Function calling**: Supports OpenAI's tool/function calling format. Register function handlers on the pipeline task to handle tool calls automatically.
+- **System instruction precedence**: If both `system_instruction` (from the constructor) and a system message in the context are set, the constructor's `system_instruction` takes precedence and a warning is logged.

## Event Handlers

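The retry behavior noted in all three services (first attempt bounded by `retry_timeout_secs`, second attempt unbounded) can be sketched as a standalone helper. This is an illustration of the documented two-attempt flow, not the library's code:

```python
import asyncio

async def call_with_retry(call, retry_timeout_secs, retry_on_timeout=True):
    """Illustrative sketch of the documented retry flow."""
    try:
        # First attempt is bounded by retry_timeout_secs.
        return await asyncio.wait_for(call(), timeout=retry_timeout_secs)
    except asyncio.TimeoutError:
        if not retry_on_timeout:
            raise
        # Second attempt runs with no timeout limit.
        return await call()
```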