
Cache issues with function calls

At a glance
I'm hitting cache problems with function calls again: the cache does not return the same response as the original request did, so my schema validation fails every time.
Trace ID: 5d6f8756-7f69-4aca-b9fa-600dc623f6fa
I'm guessing it's because you don't return these:

{
  "code": "invalid_type",
  "expected": "number",
  "received": "undefined",
  "path": ["usage", "prompt_tokens"],
  "message": "Required"
},
{
  "code": "invalid_type",
  "expected": "number",
  "received": "undefined",
  "path": ["usage", "completion_tokens"],
  "message": "Required"
}

That is, usage.prompt_tokens and usage.completion_tokens.
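For reference, this is roughly the shape of the final stream chunk the validator expects when usage is included (values invented; the shape follows OpenAI's documented include_usage behavior):

const expectedFinalChunk = {
  id: "chatcmpl-abc123",            // illustrative id
  object: "chat.completion.chunk",
  choices: [],                      // the usage-only chunk carries no choices
  usage: {
    prompt_tokens: 42,              // the field the first error flags
    completion_tokens: 7,           // the field the second error flags
    total_tokens: 49,
  },
};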
Thanks, @Hans Magnus, checking shortly 🏃‍♂️
@Hans Magnus, would you mind DMing me your Portkey-associated email?
Hey. OpenAI recently released the stream_options param, which lets users set an include_usage flag. Setting it to true returns the usage object as the last stream chunk.

Vercel AI currently hardcodes stream_options: { include_usage: true } for OpenAI calls, so it also expects the usage object in the stream response. Portkey, however, does not currently hardcode stream_options, so the stream response contains no usage object and the validation fails. Can you please add the stream_options param to your Portkey request body as described above and try again? (There is a sketch after the references below.)

stream_options ref: https://platform.openai.com/docs/api-reference/chat/create#chat-create-stream_options

vercel ai hardcoded stream_options code ref: https://github.com/vercel/ai/blob/03eb9e3d34db4734ab8cf927749efaebcec1b217/packages/openai/src/openai-chat-language-model.ts#L200-L203
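A minimal sketch of that request-body change, using plain fetch against an OpenAI-compatible chat completions endpoint. The Portkey base URL and the x-portkey-* header names here are assumptions for illustration; check them against your own gateway setup.

// Assumes an ESM module where top-level await is available.
const res = await fetch("https://api.portkey.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "x-portkey-api-key": process.env.PORTKEY_API_KEY!,        // assumed header name
    "x-portkey-virtual-key": process.env.OPENAI_VIRTUAL_KEY!, // assumed header name
  },
  body: JSON.stringify({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Hello" }],
    stream: true,
    // The suggested fix: ask OpenAI to emit the usage object
    // as the last stream chunk so downstream validation passes.
    stream_options: { include_usage: true },
  }),
});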
Right, thanks! It turns out they do not hardcode it unconditionally; you can use compatible mode to turn it off.
[Attachment: Screenshot_2024-06-13_at_15.33.25.png]
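For anyone finding this thread later: the mode in question is the compatibility option on the Vercel AI SDK's createOpenAI factory. A sketch, with an assumed gateway base URL:

import { createOpenAI } from "@ai-sdk/openai";

// In 'strict' mode the SDK targets the full OpenAI API and sends
// stream_options: { include_usage: true }; in 'compatible' mode it
// omits such provider-specific params, so no usage chunk is expected.
const openai = createOpenAI({
  baseURL: "https://api.portkey.ai/v1", // assumption: your gateway URL
  apiKey: process.env.PORTKEY_API_KEY,
  compatibility: "compatible",
});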
I can confirm that it worked beautifully! 🤩 Thanks for the follow-up
Just a heads up: this started to fail again, but the other way around. So I turned on strict mode and now caching works. But since this can happen at runtime, it feels brittle. Any idea why you suddenly started returning the streamed usage object?
That's merely because the OpenAI API sends it.
@visarg if you have any thoughts!