The response structure has changed slightly. We have standardized responses to follow the OpenAI response structure. Previously, the response body was nested under a .data field; it is no longer nested. Everything else stays the same for OpenAI saved models.
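To illustrate the difference, here is a hedged sketch of the shape change. The field names inside the completion object follow the standard OpenAI completion format; the exact values shown are placeholders, not real output.

```json
// Before: completion nested under .data
{
  "data": {
    "id": "...",
    "choices": [ { "message": { "role": "assistant", "content": "..." } } ]
  }
}

// After: completion returned at the top level, OpenAI-style
{
  "id": "...",
  "object": "chat.completion",
  "choices": [ { "message": { "role": "assistant", "content": "..." } } ],
  "usage": { "prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0 }
}
```

If your client code reads response.data.choices, update it to read response.choices directly.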
One more important change introduced with the new prompt route: you must explicitly send stream: true or stream: false in each call. The stream setting saved in the UI is no longer taken into account. Here is what a sample call looks like:
curl -X POST "https://api.portkey.ai/v1/prompts/PROMPT_ID/completions" \
  -H "Content-Type: application/json" \
  -H "x-portkey-api-key: $PORTKEY_API_KEY" \
  -d '{
    "variables": {
      # variables to pass
    },
    "stream": true
  }'