Welcome to Portkey Forum

Updated 3 weeks ago

@Vrushank | Portkey

Hi! I am noticing that the system prompt is missing from the logs when using Portkey as a proxy for Anthropic. Anthropic uses a top-level `system` parameter rather than a system message like OpenAI does, so I think you might not be picking up the `system` parameter when running in proxy mode. This also affects the reported token count.
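To make the mismatch concrete, here is a minimal sketch (illustrative names, not Portkey's actual internals) of how an OpenAI-style message list maps onto Anthropic's `/messages` shape. A proxy that only inspects the `messages` array would never see the top-level `system` field:

```typescript
// OpenAI-style chat message: the system prompt lives inside `messages`.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Anthropic /messages shape: the system prompt is a top-level parameter,
// and `messages` holds only user/assistant turns.
interface AnthropicPayload {
  system?: string;
  messages: ChatMessage[];
}

// Hoist any system messages into Anthropic's top-level `system` field.
function toAnthropicPayload(messages: ChatMessage[]): AnthropicPayload {
  const system = messages
    .filter((m) => m.role === "system")
    .map((m) => m.content)
    .join("\n");
  return {
    ...(system ? { system } : {}),
    messages: messages.filter((m) => m.role !== "system"),
  };
}
```

A proxy doing the reverse mapping has to read `system` out of the request body explicitly; if it only walks `messages`, the system prompt is silently dropped from logs and token counts.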

Additionally, the response is JSON-only
Unfortunately, @Yorick, this is expected behaviour, as Anthropic's /messages route is not supported on Portkey. This is by design: we make it work on the Gateway's /chat/completions route instead.

In this case, Portkey just acts as a dumb proxy for your requests, and other features like cost metrics, fallbacks, caching, etc. are not supported.
I'd suggest moving to the Portkey SDK, or the OpenAI SDK with Portkey, to call Anthropic.
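The suggested setup can be sketched roughly as follows. This assumes the gateway base URL `https://api.portkey.ai/v1` and the `x-portkey-api-key` / `x-portkey-provider` header names; check the Portkey docs for the current values before relying on them:

```typescript
// Assumed Portkey gateway base URL (verify against the Portkey docs).
const PORTKEY_BASE_URL = "https://api.portkey.ai/v1";

// Build the headers Portkey needs to route an OpenAI-compatible
// /chat/completions request to Anthropic. Header names are assumptions.
function portkeyHeaders(
  portkeyApiKey: string, // your Portkey API key
  provider: string, // e.g. "anthropic"
  providerKey: string // the provider's own API key
): Record<string, string> {
  return {
    "x-portkey-api-key": portkeyApiKey,
    "x-portkey-provider": provider,
    Authorization: `Bearer ${providerKey}`,
  };
}
```

You would then pass `PORTKEY_BASE_URL` as the `baseURL` and these headers as `defaultHeaders` when constructing an OpenAI SDK client, so calls go through /chat/completions and keep Portkey features like logging and cost metrics.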

Though I do want to understand, @Yorick, whether there are any critical Anthropic SDK features that we are missing out on in Portkey.
I am using Vercel's AI SDK; that is why I need to proxy.
@Vrushank | Portkey So you see this message above?
Got it, @Yorick - give me a day, and I'll get back to you on Vercel AI SDK support for Portkey.
Sorry for the delay here!!