Problem/Motivation

The Anthropic provider currently routes requests through the OpenAI compatibility layer, which ignores Anthropic-specific features:

  • Extended thinking - requires the thinking request parameter
  • Prompt caching - requires cache_control blocks on content

Both require the native Anthropic Messages API.
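To make the gap concrete, here is a minimal Python sketch of a native Messages API request body that enables both features. Field names (thinking, cache_control) follow Anthropic's published Messages API; the model name and token budgets are illustrative, not values from this issue.

```python
import json

# Illustrative native Messages API request body. The "thinking" parameter
# and "cache_control" blocks below have no equivalent in the
# OpenAI-compatible request schema, which is why the compatibility
# layer drops them.
payload = {
    "model": "claude-sonnet-4-20250514",  # illustrative model id
    "max_tokens": 2048,
    # Extended thinking: enabled with an explicit token budget.
    "thinking": {"type": "enabled", "budget_tokens": 1024},
    "system": [
        {
            "type": "text",
            "text": "Long, reusable system prompt...",
            # Prompt caching: marks this content block as a cache breakpoint.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    "messages": [
        {"role": "user", "content": "Hello"},
    ],
}

body = json.dumps(payload)
```

Both fields live in the request body itself, so supporting them is a matter of building this JSON directly rather than going through the compatibility layer.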

Steps to reproduce

Neither feature can be enabled through the current provider: the OpenAI compatibility layer ignores the thinking parameter and cache_control blocks entirely.

Proposed resolution

Add native API support following the pattern used by fetchAvailableModels(), which already makes direct HTTP calls with Anthropic headers.

Remaining tasks

  • Add chatWithNativeApi() method for direct Anthropic API calls
  • Support extended thinking via thinking parameter
  • Support prompt caching via cache_control blocks
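The shape of the proposed chatWithNativeApi() can be sketched as follows. This is a Python sketch for illustration only (the module itself is PHP); the function name, placeholder API key, and payload values are hypothetical, while the endpoint URL and the x-api-key / anthropic-version headers match the documented Anthropic API. The request is built but deliberately not sent.

```python
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def chat_with_native_api(api_key: str, payload: dict) -> urllib.request.Request:
    """Build (not send) a direct Messages API request, mirroring the
    header pattern fetchAvailableModels() already uses."""
    headers = {
        "x-api-key": api_key,                 # per-request auth header
        "anthropic-version": "2023-06-01",    # required API version header
        "content-type": "application/json",
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

# Hypothetical usage; "sk-ant-..." is a placeholder, not a real key.
req = chat_with_native_api(
    "sk-ant-...",
    {
        "model": "claude-sonnet-4-20250514",  # illustrative model id
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hi"}],
    },
)
```

Because the payload is plain JSON, the thinking parameter and cache_control blocks can be passed straight through without any translation layer.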

User interface changes

None initially.

API changes

New protected method: chatWithNativeApi()

Data model changes

None. Token tracking for reasoning and cached tokens already exists in the base module.
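For reference, the native API's usage block already reports the cache counters the base module tracks. A small sketch, assuming the usage field names from Anthropic's Messages API response (the numbers here are made up; input_tokens excludes the cache read/creation counts, which is why they are summed separately):

```python
# Illustrative response fragment; field names follow the Messages API,
# values are invented for the example.
response = {
    "usage": {
        "input_tokens": 2095,
        "output_tokens": 503,
        "cache_creation_input_tokens": 2051,
        "cache_read_input_tokens": 0,
    }
}

usage = response["usage"]
# Cache reads and cache writes are reported separately from input_tokens.
cached = (usage.get("cache_read_input_tokens", 0)
          + usage.get("cache_creation_input_tokens", 0))
total_input = usage["input_tokens"] + cached
```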

Comments

camoa created an issue. See original summary.

camoa

Status: Active » Closed (duplicate)
