
Updated 12 months ago

Weird issue with larger payloads (or something else) using a gpt-4-1106-based model

I'm hitting a weird issue, possibly related to larger payloads, when using a gpt-4-1106-based model: the response is generated and shown in the Portkey log, but it never gets sent back to us. Any idea?
Are these requests taking more than 100 seconds?
@visarg yes, they sometimes take more than 100 seconds
We would suggest moving to the new routes we released a couple of days ago. The new prompt completions route allows a much higher timeout. Here is a migration guide:
https://docs.portkey.ai/docs/changelog/portkeys-december-migration
Sure, got it. Thanks for confirming.
@visarg what timeout limit do the newer routes allow?
Also, just to confirm: is changing
curl https://api.portkey.ai/v1/prompts/$PROMPT_ID/generate to
curl https://api.portkey.ai/v1/prompts/$PROMPT_ID/completions

the only change we have to make, and the rest stays the same?
The response structure has changed a bit. We have standardized the response to follow the OpenAI response structure. Earlier, you were getting the response nested under a .data field; now it is not nested. Everything else stays the same for OpenAI saved models.
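To illustrate the migration, here is a minimal sketch of a reader that handles both shapes. The only detail confirmed above is the .data nesting; the inner choices/message fields are assumed from the OpenAI response structure and may differ.

```typescript
// Hypothetical response shapes, for illustration only.
// Old route: the OpenAI-style body was nested under a `data` field.
// New route: the body follows the OpenAI response structure directly.
interface Choice { message: { content: string } }
interface NewResponse { choices: Choice[] }
interface OldResponse { data: NewResponse }

// Extract the completion text from either shape.
function extractText(body: OldResponse | NewResponse): string {
  // Unwrap the old `.data` nesting if present; otherwise use the body as-is.
  const flat = "data" in body ? body.data : body;
  return flat.choices[0].message.content;
}
```

A reader like this lets you switch routes without a flag day: both the old nested payload and the new flat one yield the same text.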

One more important change introduced with the new prompt route: users now have to explicitly send stream: true/false in the call. Whatever is saved in the UI for stream is no longer taken into consideration. A sample call looks like this:

Plain Text
curl -X POST "https://api.portkey.ai/v1/prompts/$PROMPT_ID/completions" \
-H "Content-Type: application/json" \
-H "x-portkey-api-key: $PORTKEY_API_KEY" \
-d '{
    "variables": {},
    "stream": true
}'
@visarg another quick question: would you prefer us using the Node.js SDK or the REST API? Would there be any difference?
We would suggest using our SDKs because they abstract things away for the user. But performance-wise, the REST API and the SDK are the same.
Got it. Also, I could not find anywhere in the documentation that this "stream": true/false parameter is mandatory.
Ohh. Will quickly add that as a note.
So just to confirm, it is mandatory now?
Yes. It is mandatory to set stream: true if you want to enable streaming. By default, streaming is disabled for prompt completions.
No, we don't need streaming. Then, if I understand correctly, we don't need to send that parameter at all?
Yes. If you do not send stream: true in the SDK call, it will not stream.
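The streaming behavior described here can be summed up in a tiny helper (purely illustrative; the function name is made up): the UI setting is ignored, streaming is off by default, and only an explicit stream: true in the request turns it on.

```typescript
// Models the new prompt route's stream semantics as described above.
// The value saved in the UI is ignored; only the request flag counts.
function effectiveStream(requestStream?: boolean): boolean {
  // Streaming is disabled by default for prompt completions and is
  // enabled only when the caller explicitly sends stream: true.
  return requestStream === true;
}
```

So omitting the flag and sending stream: false behave identically; there is no need to send the parameter if you don't want streaming.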