Problem/Motivation
The Anthropic provider uses the OpenAI compatibility layer, which cannot express Anthropic-specific features:
- Extended thinking: requires the thinking request parameter
- Prompt caching: requires cache_control blocks on content
Both require the native Anthropic Messages API.
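To make the gap concrete, here is a minimal sketch of a native Messages API request body that enables extended thinking. The field shapes (thinking with type and budget_tokens) follow Anthropic's public Messages API documentation; the helper function and model id are illustrative placeholders, not code from this module.

```python
# Sketch: a native Anthropic Messages API payload enabling extended thinking.
# The OpenAI compatibility layer has no equivalent of the "thinking" field.

def build_native_payload(model: str, prompt: str, thinking_budget: int) -> dict:
    return {
        "model": model,  # placeholder model id
        "max_tokens": 8192,
        # Extended thinking: only accepted by the native Messages API.
        "thinking": {"type": "enabled", "budget_tokens": thinking_budget},
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_native_payload("claude-example-model", "Summarize this issue.", 2048)
```

Sending this payload through the OpenAI-compatible endpoint would simply drop the thinking field, which is why a native code path is needed.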
Steps to reproduce
These features cannot be enabled at all: the OpenAI compatibility layer silently drops the thinking parameter and cache_control blocks, so requests succeed but the features never take effect.
Proposed resolution
Add native API support following the pattern used by fetchAvailableModels() which already uses direct HTTP with Anthropic headers.
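The direct-HTTP pattern referenced above amounts to posting to the Messages endpoint with Anthropic's own headers rather than an OpenAI-style Bearer token. The header names (x-api-key, anthropic-version) come from Anthropic's API documentation; this standalone Python sketch only illustrates the shape of the request, it is not the module's PHP implementation.

```python
# Sketch: building a direct HTTP request to the Anthropic Messages API,
# mirroring the header pattern fetchAvailableModels() already uses.
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def native_request(api_key: str, payload: dict) -> urllib.request.Request:
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "x-api-key": api_key,               # Anthropic auth header, not "Authorization: Bearer"
            "anthropic-version": "2023-06-01",  # required API version header
            "content-type": "application/json",
        },
        method="POST",
    )

req = native_request("sk-ant-placeholder", {"model": "claude-example-model"})
```

The same headers are what distinguish a native call from one routed through the OpenAI compatibility layer.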
Remaining tasks
- Add a chatWithNativeApi() method for direct Anthropic API calls
- Support extended thinking via the thinking parameter
- Support prompt caching via cache_control blocks
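For the last task, a cache_control block is attached to a content block, typically a large, stable system prompt, so Anthropic can reuse it across calls. The block shape ({"type": "ephemeral"}) follows Anthropic's prompt-caching documentation; the wrapper function below is a hypothetical illustration.

```python
# Sketch: marking a system prompt as cacheable with a cache_control block.

def cached_system_blocks(system_text: str) -> list:
    return [
        {
            "type": "text",
            "text": system_text,
            # Everything up to and including this block becomes cacheable.
            "cache_control": {"type": "ephemeral"},
        }
    ]

blocks = cached_system_blocks("Long, reusable instructions go here.")
```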
User interface changes
None initially.
API changes
New protected method: chatWithNativeApi()
Data model changes
None. Token tracking (reasoning, cached) already exists in base module.
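As a sketch of how the existing token tracking could be fed, the native API's usage object reports cache activity separately from regular input tokens. The field names (cache_creation_input_tokens, cache_read_input_tokens) are from Anthropic's prompt-caching documentation; the response below is a hand-written example, not a captured API reply.

```python
# Sketch: mapping native Messages API usage fields onto the token counters
# the base module already tracks.

def extract_usage(response: dict) -> dict:
    usage = response.get("usage", {})
    return {
        "input": usage.get("input_tokens", 0),
        "output": usage.get("output_tokens", 0),
        "cached": usage.get("cache_read_input_tokens", 0),
        "cache_writes": usage.get("cache_creation_input_tokens", 0),
    }

example = {"usage": {"input_tokens": 12, "output_tokens": 300,
                     "cache_read_input_tokens": 2048,
                     "cache_creation_input_tokens": 0}}
```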
Comments
Comment #2
camoa commented