Hey friends, loving Portkey so far, especially since it integrates an LLM gateway and an LLM observability solution into a single product! A few issues I've discovered:
- Integrating Sonnet 3.5, I've noticed that even though Portkey says it conforms to the OpenAI spec, streaming responses give me `end_turn` as the finish reason instead of `stop`, like OpenAI would.
- While the docs say I can attach the `user` property to the `/v1/chat/completions` request, the user property was only populated in the Portkey UI when I used the `_user` metadata key.
- Not a bug per se, more of a usability issue: it took me forever to figure out why the "Create" button for my new Guardrail was disabled. I hadn't given the Guardrail a name, but there was no indication in the UI that the name was missing.
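In case it helps anyone hitting the first issue, here's the client-side workaround I'm using for now: a small shim that maps Anthropic-style stop reasons onto the OpenAI finish reasons. The mapping is my own assumption from comparing the two providers' docs, so adjust it if Portkey already translates some of these.

```python
# Workaround sketch: normalize Anthropic-style stop reasons to the
# OpenAI finish_reason values. The mapping is an assumption based on
# my reading of both APIs' docs, not something Portkey ships.
ANTHROPIC_TO_OPENAI = {
    "end_turn": "stop",        # natural end of the assistant turn
    "stop_sequence": "stop",   # hit a user-supplied stop sequence
    "max_tokens": "length",    # ran out of output tokens
    "tool_use": "tool_calls",  # model wants to invoke a tool
}

def normalize_finish_reason(reason):
    """Return an OpenAI-spec finish_reason for a possibly Anthropic value."""
    if reason is None:
        # Intermediate streaming chunks carry no finish reason.
        return None
    return ANTHROPIC_TO_OPENAI.get(reason, reason)
```

I run every streamed chunk's `finish_reason` through this before handing it to code that expects the OpenAI values.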
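For the second issue, this is the request shape that finally got the user populated in the UI for me: sending the id through the metadata header's `_user` key rather than the OpenAI `user` body property. The header name and key are what I took from the Portkey docs; treat them as assumptions if your setup differs.

```python
import json

# Sketch of what worked for me: put the user id in Portkey's request
# metadata under the special "_user" key. Header name and key are my
# reading of the Portkey docs, not guaranteed to match every version.
def portkey_metadata_header(user_id, **extra):
    """Build the x-portkey-metadata header value carrying _user."""
    return json.dumps({"_user": user_id, **extra})

headers = {"x-portkey-metadata": portkey_metadata_header("user-123", env="prod")}
```

I attach `headers` to the `/v1/chat/completions` call (e.g. via my HTTP client's or SDK's default-headers option).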