Welcome to Portkey Forum

Updated 3 weeks ago

the openai proxy approach will work

Oh good, nice to know that the OpenAI proxy approach will work. Looking at the trace, it looks like Tool Choice = required, but that's probably because I was calling Claude, which isn't supported yet. No worries on the timing, I'm still in pre-launch research mode and the OpenAI tools are working fine for now. Thanks!
14 comments
Hi @Vrushank | Portkey πŸ‘‹ Just wanted to check in and see if you have any updates on the Anthropic function calling? Sonnet 3.5 is so good that I'd love to swap out my gpt-4 function calling in my coding agent. I think it could be a major unlock. I'm not blocked or anything, just curious if you have an update on when this might be available so I can plan whether or not to wait or just write up a temp util function...
We are prioritizing this and looking to ship it this week at the latest. It's also a top priority for several of our customers πŸ˜„
Hi there - just wanted to check in on status for this. Also, do you offer support for extra headers like this option to extend the max token output for Sonnet 3.5? https://x.com/alexalbert__/status/1812921642143900036
Specifically, the way they're implementing this via the SDK is:
```typescript
const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
  defaultHeaders: {
    "anthropic-beta": "max-tokens-3-5-sonnet-2024-07-15",
  },
});
```
Hey @Kevin Leneway - function calling interoperability on Anthropic, Vertex, Gemini API, and Bedrock is now supported on Portkey!

Just use your existing OpenAI function calling code on Portkey and change the providers and everything should keep working as it is!
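A minimal sketch of what that switch looks like, assuming the standard Portkey routing headers (the model name and tool schema below are illustrative): the OpenAI-format function-calling payload stays exactly the same, and only the provider header changes.

```typescript
// OpenAI-format function-calling payload, reused unchanged when the
// Portkey provider header is switched from "openai" to "anthropic".
// Model name and tool schema here are illustrative, not prescriptive.
const tools = [
  {
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
];

const requestBody = {
  model: "claude-3-5-sonnet-20240620",
  messages: [{ role: "user", content: "What's the weather in Paris?" }],
  tools,
};

// Only the routing header changes between providers:
const portkeyHeaders = {
  "x-portkey-provider": "anthropic", // was "openai"
  "x-portkey-api-key": "<your-portkey-key>",
};

console.log(requestBody.tools.length, portkeyHeaders["x-portkey-provider"]);
```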
Yes! Saw this, we'll pick it up soon. cc @visarg @sega

Also looking for contributors who would like to take it up!

We are tracking it here - https://github.com/Portkey-AI/gateway/issues/465
Great news! I had just implemented a workaround yesterday for the tool calling but I'm pulling that out now. I also added a direct call to Anthropic that uses the new header, here's the code if it's helpful: https://github.com/jacob-ai-bot/jacob/blob/main/src/server/anthropic/request.ts#L37

@sega I'm not familiar with the Portkey codebase but it looks like maybe adding the header here would enable it? https://github.com/Portkey-AI/gateway/blob/491d2651b1e5785b1c2ca885550544074d78e194/src/providers/anthropic/api.ts#L5

"anthropic-beta": "max-tokens-3-5-sonnet-2024-07-15"

Also, you could likely remove this line, since it's from 2023 and would override the default header? But I'm not sure why it was added or what it does, so... πŸ€·β€β™‚οΈ
https://github.com/Portkey-AI/gateway/blob/491d2651b1e5785b1c2ca885550544074d78e194/src/providers/anthropic/api.ts#L11
Thanks so much for the awesome support as always, I'm continually evangelizing Portkey to everyone I meet. πŸ™‚
Again, thanks for implementing this. I'm trying it out now and it's working, but it seems to only output one tool call at a time instead of multiple tools in parallel. Is this a known issue? In my code I'm passing a parallel_tool_calls=true param, is that the correct way to do it? Here's a trace id if it's helpful: bef8a83f-4279-4f6f-8318-9623719361f0
Hey Kevin, If you need the LLM to return a tool call always, you can use tool_choice: required
Gateway returns multiple tool calls in response as is, here's an example request in anthropic that returns multiple tool calls
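For reference, a sketch of the OpenAI-format response shape the gateway passes through when a model emits parallel tool calls (the values below are illustrative, not from an actual trace) and how a client would handle each one:

```typescript
// OpenAI-format assistant message carrying parallel tool calls.
// Illustrative values; the gateway returns this shape for Anthropic too.
const message = {
  role: "assistant",
  content: null,
  tool_calls: [
    {
      id: "call_1",
      type: "function",
      function: { name: "get_weather", arguments: '{"city":"Paris"}' },
    },
    {
      id: "call_2",
      type: "function",
      function: { name: "get_weather", arguments: '{"city":"Tokyo"}' },
    },
  ],
};

// Each entry is executed independently; results go back as "tool" messages.
for (const call of message.tool_calls) {
  const args = JSON.parse(call.function.arguments);
  console.log(call.function.name, args.city);
}
```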
Oh thank you, that’s helpful. I switched back to GPT-4o which might just be a better choice for this use case.
Thanks so much Kevin! Means a lot!
Btw @Kevin Leneway, in the meantime while we add support for this on the Gateway: since you're using the OpenAI SDK, you could send the anthropic-beta header directly using our forward-headers feature, and with that you can set the max tokens for the Claude 3.5 Sonnet model to 8192.

In your OpenAI defaultHeaders, you can add:
```typescript
'anthropic-beta': 'max-tokens-3-5-sonnet-2024-07-15',
'x-portkey-forward-headers': 'anthropic-beta'
```
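Put together, a hedged sketch of the full headers object (the Portkey base URL in the comment is assumed from Portkey's standard setup):

```typescript
// Headers to make the Portkey gateway forward the Anthropic beta flag
// upstream, raising Claude 3.5 Sonnet's max output tokens to 8192.
// Pass these as `defaultHeaders` when constructing the OpenAI client,
// e.g. new OpenAI({ baseURL: "https://api.portkey.ai/v1", defaultHeaders }).
const defaultHeaders = {
  "anthropic-beta": "max-tokens-3-5-sonnet-2024-07-15",
  // Tells the gateway which incoming headers to pass through to the provider:
  "x-portkey-forward-headers": "anthropic-beta",
};

console.log(Object.keys(defaultHeaders));
```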

That's a great tip - thanks!