Why Native Tool Use?
| Aspect | XML in Prompt | Native API |
|---|---|---|
| Accuracy | Depends on the model parsing XML | Structured by the API, unambiguous |
| Tokens | XML consumes context window tokens | Separate fields, more efficient |
| Cache | No optimization | cache_control:ephemeral on Anthropic |
| Validation | Manual | Automatic by the provider API |
| Compatibility | Any provider | OpenAI and Anthropic (others continue with XML) |
Architecture
ToolAwareClient Interface
The ToolAwareClient interface extends the base LLMClient with tool support:
Automatic Detection
Detection is done via type assertion, with no configuration required. Providers that do not implement ToolAwareClient continue to work normally via SendPrompt.
Data Types
ToolDefinition
Defines a tool available to the model:
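A sketch of what the struct might look like; the field names and the JSON-Schema shape of Parameters are assumptions:

```go
package main

import "encoding/json"

// ToolDefinition describes one tool offered to the model.
type ToolDefinition struct {
	Name        string         `json:"name"`
	Description string         `json:"description"`
	Parameters  map[string]any `json:"parameters"` // JSON Schema for the tool's arguments
}

// encode renders the definition as the JSON a provider API would receive.
func encode(d ToolDefinition) string {
	b, _ := json.Marshal(d)
	return string(b)
}
```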
ToolCall and ToolResult
Represent a tool call by the model and its result:
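A sketch of the pair; field names here are assumptions modeled on the provider APIs:

```go
package main

// ToolCall is the model's request to invoke a tool.
type ToolCall struct {
	ID        string `json:"id"`        // provider-assigned call ID
	Name      string `json:"name"`      // tool to invoke
	Arguments string `json:"arguments"` // raw JSON arguments from the model
}

// ToolResult carries the outcome of one call back to the model.
type ToolResult struct {
	ToolCallID string `json:"tool_call_id"` // links the result to its call
	Content    string `json:"content"`      // tool output fed back to the model
	IsError    bool   `json:"is_error,omitempty"`
}

// resultFor pairs a result with the call that produced it.
func resultFor(c ToolCall, content string) ToolResult {
	return ToolResult{ToolCallID: c.ID, Content: content}
}
```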
LLMResponse
Unified response that can contain text and/or tool calls:
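A sketch of the unified shape; field and method names are assumptions:

```go
package main

// ToolCall is a stand-in for the type documented above.
type ToolCall struct {
	ID, Name, Arguments string
}

// LLMResponse is the unified reply: plain text, tool calls, or both.
type LLMResponse struct {
	Text      string
	ToolCalls []ToolCall
}

// HasToolCalls reports whether the model asked to run any tools.
func (r *LLMResponse) HasToolCalls() bool {
	return len(r.ToolCalls) > 0
}
```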
Provider Implementations
- OpenAI
- Anthropic (Claude)
Uses the tools field in the Chat Completions API:
- Sends tools as a tools array with tool_choice: "auto"
- Processes tool_calls in choices[0].message
- tool messages in the history link results to the tool_call_id
ContentBlock with Cache Control
For Anthropic, the system prompt is split into blocks with cache control.
Fallback Integration
The fallback chain (llm/fallback) supports SendPromptWithTools automatically. Providers without native tool use support are skipped in the tool call chain but remain available for plain text requests.