Labels
bug (Something isn't working)
Description
Checks
- I have updated to the latest minor and patch version of Strands
- I have checked the documentation and this is not expected behavior
- I have searched ./issues and there are no duplicates of my issue
Strands Version
1.26.0
Python Version
3.12
Operating System
macOS 26.2
Installation Method
pip
Steps to Reproduce
- Create a Bedrock application inference profile that copies from a Claude system inference profile (e.g., us.anthropic.claude-haiku-4-5-20251001-v1:0)
- Create a BedrockModel using the application inference profile ARN as the model_id
- Enable prompt caching with cache_config=CacheConfig(strategy="auto")
- Make any API call through the model
Expected Behavior
Message caching is enabled and cache points are injected into messages
Actual Behavior
- Warning logged on every API call: `"model_id=<...> | cache_config is enabled but this model does not support caching"`
- `_inject_cache_point` is never called
- Message caching is silently disabled
Additional Context
The `_supports_caching` property only checks whether `"claude"` or `"anthropic"` appears in the `model_id` string:

```python
@property
def _supports_caching(self) -> bool:
    model_id = self.config.get("model_id", "").lower()
    return "claude" in model_id or "anthropic" in model_id
```
This works for system inference profile IDs (e.g., us.anthropic.claude-haiku-4-5-20251001-v1:0) but fails for application inference profile ARNs which don't contain these strings.
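The failure mode can be shown with a minimal stand-in for the substring check (the application inference profile ARN below is a made-up example; real ARNs follow the same `application-inference-profile/<id>` shape and likewise contain neither substring):

```python
def supports_caching(model_id: str) -> bool:
    # Mirrors the _supports_caching logic: a plain substring check
    model_id = model_id.lower()
    return "claude" in model_id or "anthropic" in model_id

# System inference profile ID from the report: contains "anthropic"/"claude"
system_profile = "us.anthropic.claude-haiku-4-5-20251001-v1:0"

# Hypothetical application inference profile ARN: neither substring appears,
# so caching is silently disabled even though the underlying model is Claude
app_profile_arn = (
    "arn:aws:bedrock:us-west-2:111122223333:"
    "application-inference-profile/abcdef123456"
)

print(supports_caching(system_profile))   # True
print(supports_caching(app_profile_arn))  # False
```

A check based only on the `model_id` string cannot see through an application inference profile; resolving the underlying model would require an extra lookup (or configuration) rather than string matching.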
Possible Solution
No response
Related Issues
No response