Welcome to Portkey Forum

Updated 8 months ago

Incomplete json response from openai

@Team Portkey - I'm running into a weird issue where I'm instructing OpenAI to return a JSON response. I'd expect a 100-200 word response, but OpenAI starts building the JSON and ends abruptly after 2 lines, giving an incomplete and incorrectly formatted JSON. The strange thing is that I have retries set, and it gives the exact same response 5 times in a row. I don't have any caching enabled; could this be an issue on the Portkey side?
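A quick way to confirm the responses really are cut off (rather than merely oddly formatted) is to run them through a JSON parser. A minimal sketch, not tied to Portkey or the OpenAI SDK; the example strings are hypothetical stand-ins for the model output described above:

```python
import json

def is_complete_json(text: str) -> bool:
    """Return True if `text` parses as valid JSON, False if it is cut off or malformed."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

# A truncated response like the one described ends mid-object:
truncated = '{"summary": "The quick brown fox'
complete = '{"summary": "The quick brown fox jumps over the lazy dog."}'
print(is_complete_json(truncated))  # False
print(is_complete_json(complete))   # True
```

If the parse fails at the same character offset on every retry, the generation is being stopped deterministically rather than failing at random.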
10 comments
Hey @thismlguy - Are you passing a max_tokens value in the request? And what other parameters are you passing to OpenAI?
Yes I am, but they are very high numbers.
Should I not add max_tokens?
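Even with a high max_tokens, the completion payload itself says why generation stopped: `finish_reason` is `"length"` when the token limit was hit, `"stop"` when the model finished naturally. A small sketch of that check, using a hypothetical dict shaped like an OpenAI chat completion response:

```python
def extract_content(response: dict) -> str:
    """Return the message content from an OpenAI-style chat completion payload,
    raising if generation was cut off by the token limit."""
    choice = response["choices"][0]
    if choice["finish_reason"] == "length":
        # The JSON body is almost certainly incomplete at this point.
        raise ValueError("Response truncated: raise max_tokens or shorten the output")
    return choice["message"]["content"]

# Hypothetical payload mimicking the truncated responses in this thread:
cut_off = {"choices": [{"finish_reason": "length",
                        "message": {"content": '{"summary": "cut off'}}]}
ok = {"choices": [{"finish_reason": "stop",
                   "message": {"content": '{"summary": "done"}'}}]}
print(extract_content(ok))  # {"summary": "done"}
```

If `finish_reason` comes back `"stop"` yet the JSON is still incomplete, the truncation is happening before the token limit, which points away from max_tokens as the cause.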
Check out trace id - b334b746-4762-4b0c-bd77-37aa84f306b8
Do you have some time to debug?
Like on a Google Meet?
Did you fix this? I got this behaviour when the prompt was overly complex and perhaps ambiguous. Simplifying the task worked for me.
@ekevu. @thismlguy perhaps if convenient to you both, you could utilise the #huddle channel to quickly discuss when you're online. I know @thismlguy has been facing this problem for a while, but it's looking like it's hard to replicate unless you've seen the problem yourself.

What @ekevu. suggested is fantastic - it could very well be a prompt engineering problem at the end of the day πŸ™ƒ
I agree, it's likely a prompt engineering issue.
Simpler is better for sure, but there's a lot going on. It only happens in certain edge cases; it mostly works.
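One detail in the thread worth noting: getting the exact same response 5 times suggests the sampling is effectively deterministic, so a plain retry with identical parameters just reproduces the same truncation. A useful retry has to change something. A hedged sketch of a retry loop that raises the token budget on each attempt; `call_model` here is a hypothetical stand-in for the real API call:

```python
import json

def call_until_complete(call_model, base_max_tokens=512, attempts=3):
    """Retry a model call, doubling the token budget whenever the JSON
    comes back cut off, instead of retrying with identical parameters."""
    limit = base_max_tokens
    for _ in range(attempts):
        text = call_model(limit)
        try:
            return json.loads(text)
        except json.JSONDecodeError:
            limit *= 2  # give the model more room and try again
    raise RuntimeError("Response still incomplete after raising max_tokens")

# Fake model for illustration: returns truncated JSON until the limit is large enough.
def fake_model(max_tokens):
    full = '{"answer": "ok"}'
    return full if max_tokens >= 1024 else full[:8]

print(call_until_complete(fake_model))  # {'answer': 'ok'}
```

The same pattern works for the prompt-simplification fix above: swap in a shorter prompt on retry rather than a larger token budget.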