
Updated 3 weeks ago

Enabling Cache Control for Anthropic API Requests in Portkey

Does Portkey have any way to enable cache control for all requests sent to the Anthropic API? There's now a simple button for it in the playground, so maybe there could be a config option that adds all input to the cache. Basically, to make it work like OpenAI's caching: automatic and always on.

The way we're using Portkey with HARPA AI, we don't have any other way to set cache control, so we can't use caching at all.
Hey @Anshul, currently no. The UI also handles this per message object only, not for the entire request. We had been thinking about whether to implement it, but since OpenAI doesn't have anything comparable, we kept it on hold.
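For context on what a global "always on" setting would have to inject, here is a minimal sketch of how Anthropic's per-message cache control works today. The payload shape follows Anthropic's Messages API (`cache_control: {"type": "ephemeral"}` on a content block); the model name, prompt text, and the `with_cache_control` helper are illustrative, and no request is actually sent.

```python
def with_cache_control(text: str) -> dict:
    """Wrap a text block with an ephemeral cache_control marker,
    as Anthropic's Messages API expects per content block."""
    return {
        "type": "text",
        "text": text,
        "cache_control": {"type": "ephemeral"},
    }

# A large static system prompt is the typical caching target.
payload = {
    "model": "claude-3-5-sonnet-20241022",  # illustrative model name
    "max_tokens": 1024,
    "system": [with_cache_control("...100k+ tokens of static instructions...")],
    "messages": [
        {"role": "user", "content": "Summarize the document."},
    ],
}

print(payload["system"][0]["cache_control"])  # {'type': 'ephemeral'}
```

A global setting would simply apply this wrapper to every eligible content block automatically, instead of requiring the caller (or an app like HARPA AI) to set it per message.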
I think it'd be a great feature if we added this. I'll communicate it to the product team.
OpenAI's caching works really well for our team because it's automatic. I checked the Portkey logs, and it does work most of the time for us.

And thank you for considering this feature. It would be a lifesaver, really. We run large-prompt automations that use 100k+ static tokens, so prompt caching would be a big benefit. It's just that the app we use, HARPA AI, doesn't support prompt caching either, so a global "on" switch or something similar would be very nice.