[BUG] _supports_caching doesn't recognize Bedrock application inference profile ARNs #1705

@maud-mcevoy

Description

Checks

  • I have updated to the latest minor and patch version of Strands
  • I have checked the documentation and this is not expected behavior
  • I have searched the existing issues and there are no duplicates of my issue

Strands Version

1.26.0

Python Version

3.12

Operating System

macOS 26.2

Installation Method

pip

Steps to Reproduce

  1. Create a Bedrock application inference profile that copies from a Claude system inference profile (e.g., us.anthropic.claude-haiku-4-5-20251001-v1:0)
  2. Create a BedrockModel using the application inference profile ARN as the model_id
  3. Enable prompt caching with cache_config=CacheConfig(strategy="auto")
  4. Make any API call through the model

Expected Behavior

Message caching is enabled and cache points are injected into messages

Actual Behavior

  1. Warning logged on every API call: "model_id=<...> | cache_config is enabled but this model does not support caching"
  2. _inject_cache_point is never called
  3. Message caching is silently disabled

Additional Context

The _supports_caching property only checks if "claude" or "anthropic" is in the model_id string:

def _supports_caching(self) -> bool:
    model_id = self.config.get("model_id", "").lower()
    return "claude" in model_id or "anthropic" in model_id

This works for system inference profile IDs (e.g., us.anthropic.claude-haiku-4-5-20251001-v1:0), but fails for application inference profile ARNs, which contain neither substring.
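To illustrate, the substring check can be reproduced standalone. The ARN below is a made-up example of the application-inference-profile ARN format, not a real resource:

```python
# Standalone copy of the substring check from _supports_caching.
def supports_caching(model_id: str) -> bool:
    model_id = model_id.lower()
    return "claude" in model_id or "anthropic" in model_id

# System inference profile ID: contains "anthropic" and "claude", so it passes.
print(supports_caching("us.anthropic.claude-haiku-4-5-20251001-v1:0"))  # True

# Application inference profile ARN (illustrative account/profile IDs):
# the ARN names the profile resource, not the underlying model, so the
# check fails even though the profile routes to a Claude model.
print(supports_caching(
    "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/abc123"
))  # False
```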

Possible Solution

No response

Related Issues

No response


    Labels

    bug (Something isn't working)
