Welcome to Portkey Forum

koishore
Joined January 27, 2025
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": null,
"refusal": "I'm sorry, I can't assist with that request."
},
"logprobs": null,
"finish_reason": "stop"
}
],

What are possible reasons for getting this refusal repeatedly on the same request?
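For context, here is roughly how the response is being checked on my side (a minimal sketch with the portkey_ai Python SDK; the API key, virtual key, and prompt are placeholders). Since content comes back null and finish_reason is still "stop", the refusal field has to be inspected explicitly:

from portkey_ai import Portkey

# Placeholders: substitute a real Portkey API key and virtual key.
client = Portkey(api_key="PORTKEY_API_KEY", virtual_key="OPENAI_VIRTUAL_KEY")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "..."}],  # same prompt every time
)

message = response.choices[0].message
# On a refusal, content is null and finish_reason is "stop",
# so check the refusal field rather than assuming an empty completion.
if getattr(message, "refusal", None):
    print("Model refused:", message.refusal)
else:
    print(message.content)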
3 comments
@Vrushank | Portkey @Siddharth | Portkey

There is a production issue following the latest update. Prompt functionality has started breaking: image URLs are no longer supported, and image inputs now only accept base64-encoded images. Additionally, the LLM we are using (GPT-4o) is hitting errors, and the dashboard shows generation errors after the UI refresh for the same example that was working before the changes. Can you get this checked ASAP?

cc: @Sabbyasachi @visarg
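For reference, a minimal repro sketch of the image payload that worked before the update (portkey_ai SDK; the URL and keys are placeholders). The same image_url block can alternatively carry a base64 data URI, which appears to be the only variant accepted now:

from portkey_ai import Portkey

client = Portkey(api_key="PORTKEY_API_KEY", virtual_key="OPENAI_VIRTUAL_KEY")

# Plain https URL: worked before the update.
image_part = {"type": "image_url", "image_url": {"url": "https://example.com/sample.png"}}
# Base64 alternative:
# image_part = {"type": "image_url", "image_url": {"url": "data:image/png;base64,<BASE64_DATA>"}}

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            image_part,
        ],
    }],
)
print(response.choices[0].message.content)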
4 comments
Is there a way to natively trigger a retry if this happens randomly, or should I just have a retry mechanism in the code itself?
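If it helps, Portkey's gateway does support automatic retries through a request config. A minimal sketch is below; the attempt count and status codes are illustrative, and the config field names should be verified against the current docs. One caveat: a refusal comes back with HTTP 200, so status-code-based retries would not catch that case, and it would still need handling in application code.

from portkey_ai import Portkey

# Sketch: attach a retry policy via the gateway config (values are illustrative).
client = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="OPENAI_VIRTUAL_KEY",
    config={
        "retry": {
            "attempts": 3,                                 # number of automatic retries
            "on_status_codes": [429, 500, 502, 503, 504],  # failures that trigger a retry
        }
    },
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)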
4 comments
Possible bug: When I choose a reasoning model (o1-mini, o3-mini, etc.) and set max_tokens, it shows an error saying the following (screenshot attached).
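In case it's useful while this is being looked at: OpenAI's reasoning models reject max_tokens and expect max_completion_tokens instead, so passing the latter may avoid the error. A minimal sketch, assuming the SDK passes the parameter through unchanged; the model name and token limit are placeholders:

from portkey_ai import Portkey

client = Portkey(api_key="PORTKEY_API_KEY", virtual_key="OPENAI_VIRTUAL_KEY")

response = client.chat.completions.create(
    model="o3-mini",
    messages=[{"role": "user", "content": "Summarize this ticket."}],
    max_completion_tokens=1024,  # use this instead of max_tokens for o1/o3 models
)
print(response.choices[0].message.content)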
1 comment
Is there any way to use Florence2 through Portkey's client? One possible route might be through Hugging Face, but are there any plans to support it natively or through something like Fal, RunPod, or Modal? If this already exists, can someone point me to the docs? Thanks!
2 comments