Welcome to Portkey Forum

deepanshu_11
Joined November 4, 2024
Search filters appear to have been down for many days now; I face this very often. Any ideas, team?
4 comments
Yes, have experienced something similar when using Caching: the data structure changes.
2 comments
Wanted to report a bug we saw: when we send text containing noisy special characters like "��&Adobed�" or "���Ic��a��+��" to the Portkey API, we do not see the error log on the Portkey UI side. The error returned was something like:
Plain Text
{"status":"failure","message":"Unexpected non-whitespace character after JSON at position 56568 (line 1 column 56569)"}
4 comments
Seeing an issue in cost calculation with the Assistants API, especially with Retrieve Run. I don't think Retrieve Run should cost a cent, but it's showing a $ amount. @visarg, can you check?
11 comments
Getting an error in the /completions API using the new version:
Plain Text
{"status":"failure","message":"Portkey Error: You need to pass prompt_id along with virtual_key. Only virtual keys are not allowed in /v1/prompts config."}

CC: @ayush-portkey @Vrushank | Portkey
26 comments
Some weird issue, possibly with a larger payload or something else: using a gpt4-1106 based model, the response is generated and shown in the Portkey log but does not get sent back. Any idea?
17 comments
@Vrushank | Portkey @ayush-portkey do we have a polling API for fetching responses? If we send a request and lose internet connection in that window, we don't want to regenerate the data. Is there something that can be used to poll for results?
5 comments
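The pattern asked about above can be sketched client-side. This is a minimal, hedged example of a generic polling loop; the `fetch` callable is a hypothetical stand-in for whatever result-retrieval endpoint Portkey might expose (the source does not confirm one exists), so only the loop structure is the point here.

```python
import time

def poll_for_result(fetch, retries=5, delay=2.0):
    """Generic client-side polling loop.

    `fetch` is any callable that returns the stored result once ready,
    or None while the response is still being generated (hypothetical
    helper; not a confirmed Portkey API).
    """
    for _ in range(retries):
        result = fetch()
        if result is not None:
            return result
        time.sleep(delay)
    raise TimeoutError("result not ready after polling")

# Stub that becomes ready on the third call, to show the flow:
calls = {"n": 0}
def stub_fetch():
    calls["n"] += 1
    return {"text": "cached response"} if calls["n"] >= 3 else None

print(poll_for_result(stub_fetch, retries=5, delay=0))  # {'text': 'cached response'}
```

The request would carry an idempotency/trace id so that reconnecting clients poll for the same generation instead of re-running it.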
Seems like a minor bug in filtering on the 'Feedback' Analytics tab; it looks like no filter there is working somehow. CC: @Vrushank | Portkey @ayush-portkey
7 comments
Is the issue with sending a large value in the Prompt playground fixed? Sending 104 KB of text data file content while testing a model still crashes the webpage somehow.
4 comments
Is it possible to view the request variables being sent to the Models somewhere? @Vrushank | Portkey
8 comments
Also getting this error:
Plain Text
Error: Invalid value for 'tool_choice': 'tool_choice' is only allowed when 'tools' are specified.

whereas we are not using any tools at all, just a plain Portkey model.
14 comments
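The error above reflects an OpenAI-style API constraint: `tool_choice` is only valid when `tools` is also present in the request, so a saved config or template that sets `tool_choice` without `tools` gets rejected upstream. A minimal sketch of the pairing rule (the `build_payload` helper and model name are illustrative, not a Portkey API):

```python
def build_payload(messages, tools=None, tool_choice=None):
    """Build a chat-completions payload, enforcing that tool_choice
    is only included when tools are specified."""
    payload = {"model": "gpt-4-1106-preview", "messages": messages}
    if tool_choice is not None and not tools:
        # Mirrors the upstream error this thread reports.
        raise ValueError("'tool_choice' is only allowed when 'tools' are specified")
    if tools:
        payload["tools"] = tools
        if tool_choice is not None:
            payload["tool_choice"] = tool_choice
    return payload
```

If the error appears with a "plain" model, it is worth checking whether a stale `tool_choice` lingers in the prompt template or gateway config even though no `tools` array is sent.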
Is there a limit on how much data we can put in a variable in the playground? Trying the gpt-4-1106 API, but the large input keeps crashing the Portkey app for me. In the OpenAI playground, it does work with the same-size input.
13 comments
Is it possible to send a prompt model from code and host it in Portkey? Basically, we want to use and interact with the model without the Portkey UI.
10 comments
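A hedged sketch of driving a saved Portkey prompt purely from code: this assumes a REST endpoint of the shape `POST /v1/prompts/{prompt_id}/completions` with an `x-portkey-api-key` header, which should be verified against the current Portkey docs. The prompt id and key below are placeholders; the helper only assembles the request so the shape is visible without a network call.

```python
import json

PORTKEY_BASE = "https://api.portkey.ai/v1"

def prompt_completion_request(prompt_id, variables, api_key):
    """Return (url, headers, body) for a prompt-completion call
    against the assumed Portkey prompts endpoint."""
    url = f"{PORTKEY_BASE}/prompts/{prompt_id}/completions"
    headers = {
        "x-portkey-api-key": api_key,
        "Content-Type": "application/json",
    }
    body = json.dumps({"variables": variables})
    return url, headers, body

# Placeholder prompt id and key, for illustration only:
url, headers, body = prompt_completion_request("pp-demo-123", {"topic": "caching"}, "PK_KEY")
```

The tuple can then be passed to any HTTP client, so the whole loop of rendering and calling the prompt happens without opening the Portkey UI.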
How does Portkey integrate with Llamaindex query engine? CC: @Vrushank | Portkey
10 comments
QQ: What is the maximum number of providers we can specify in the configurations?
1 comment
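For context on what "providers in the configurations" looks like, here is a hedged sketch of a gateway config with several provider targets, written as a Python dict. The `strategy`/`targets`/`virtual_key` field names follow the commonly documented Portkey config shape, but the exact schema, and whether there is a cap on the number of targets, should be confirmed in the Portkey config docs; the key names are placeholders.

```python
# Illustrative config: fallback across three provider targets.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "openai-key-1"},      # primary
        {"virtual_key": "azure-key-1"},       # first fallback
        {"virtual_key": "anthropic-key-1"},   # second fallback
    ],
}

print(len(config["targets"]))  # 3
```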
Feature Request: It would be really cool to know what change has been made to Portkey when it asks us to click Update and reload; maybe somewhere in the Portkey window we could see an update trace-log, if possible.
CC: @rohit @Vrushank | Portkey
1 comment
How does "Score Distribution" work? If we send two scores, one as a value and the other in metadata, is it adding the scores and showing the total? Would it be possible to have tooltips over some of these to explain what each one is considering?
4 comments
Is sending all the variables necessary when using Prompt Templates in models? For example, say the API created from the prompt has 3 variables and we send only 2 of them; will that break the API?
4 comments
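Whatever the server-side behavior turns out to be, a missing variable can be guarded against client-side. This is an illustrative helper (hypothetical, not a Portkey API) that fills any template variables you did not supply with a default before the prompt is called, so a partially filled template never reaches the API.

```python
def fill_variables(required, supplied, default=""):
    """Return a complete variables dict, defaulting any missing keys."""
    return {name: supplied.get(name, default) for name in required}

# Template expects 3 variables, caller supplies 2:
vars_sent = fill_variables(["name", "tone", "topic"], {"name": "Ada", "topic": "LLMs"})
print(vars_sent)  # {'name': 'Ada', 'tone': '', 'topic': 'LLMs'}
```

A variant could raise on missing keys instead of defaulting, which surfaces the mistake at the call site rather than as an odd model completion.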
Hello Team, loved the new Status feature, it's very cool. Quick feedback: I think it's not reflecting statuses other than "Cache Miss" correctly, such as fallback/load balance/retry.
18 comments
Actually, we need to see the logs that received particular scores; on the Analytics tab, we can't see that at the moment.
7 comments