Portkey error in the /completions API using new version

Getting error in the /completions API using new version
Plain Text
"{\"status\":\"failure\",\"message\":\"Portkey Error: You need to pass prompt_id along with virtual_key. Only virtual keys are not allowed in /v1/prompts config.\"}"

CC: @ayush-portkey @Vrushank | Portkey
can you please share the config you are using?
@ayush-portkey here's the config

Plain Text
{
    "cache": {
        "mode": "semantic"
    },
    "retry": {
        "attempts": 3
    },
    "strategy": {
        "mode": "single"
    },
    "targets": [
        {
            "provider": "openai",
            "virtual_key": "open-xyz",
            "override_params": {
                "model": "gpt-4-1106-preview",
                "response_format": {
                    "type": "json_object"
                }
            }
        }
    ]
}
Thanks. And you are using the https://api.portkey.ai/v1/completions endpoint, right?
Yeah, that's right. Saw in the Config console and the Dec migration document that it would be auto-migrated.
It was working fine till 27th Dec and is now breaking.
This is already the migrated new format
Hey @deepanshu_11 , are you using prompt completions route?
Yes @visarg @ayush-portkey
Can you add the base prompt_id to all the blocks which have virtual_key in them?
Plain Text
{
    "cache": {
        "mode": "semantic"
    },
    "retry": {
        "attempts": 3
    },
    "strategy": {
        "mode": "single"
    },
    "targets": [
        {
            "provider": "openai",
            "prompt_id": "prompt_id",
            "virtual_key": "open-xyz",
            "override_params": {
                "model": "gpt-4-1106-preview",
                "response_format": {
                    "type": "json_object"
                }
            }
        }
    ]
}
Something like this
Base prompt id = the prompt id that you send in the URL.
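For illustration, a rough sketch of that mapping; the prompt id value below is a hypothetical placeholder, not one taken from this thread:

Python
# Minimal sketch: the prompt_id added to each target block should be the
# same base prompt id that appears in the /v1/prompts/... request URL.
prompt_id = "pp-my-prompt-123"  # hypothetical placeholder

# The prompt completions URL that the call is made against:
url = f"https://api.portkey.ai/v1/prompts/{prompt_id}/completions"

# The config target repeats that base prompt id alongside the virtual key:
target = {
    "provider": "openai",
    "prompt_id": prompt_id,   # base prompt id from the URL
    "virtual_key": "open-xyz",
}
print(url)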
Okay, so for example: if I have multiple prompts that are using the same config, how do I add all the prompt_id(s)?
The current way is to create multiple configs. But let me fix that behaviour, as it becomes an overhead for you. I will prefill configs on the fly from the request URL.
Quick update on this: We have finalized the changes. It should be pushed in an hour
Awesome, thanks for the update
This is done. Can you please check and let us know?
Sure lemme check it soon and get back
What should I expect though?
Don't see any prompt ids automatically added to the config
Nothing changes on the UI. The config will stay as it is, and you can use it with multiple prompts. What we did is add the prompt_id from the URL on the fly. So if you do not add any prompt_id in the config, we will pick it up from the URL that you use to make the call.

https://api.portkey.ai/v1/prompts/PROMPT_ID/completions
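
For example, a minimal sketch of a call against that route using a config with no prompt_id in its target. The header names, request body shape, and all ids/keys below are assumptions for illustration, not confirmed in this thread:

Python
import json

import requests

PORTKEY_API_KEY = "YOUR_PORTKEY_API_KEY"  # placeholder
PROMPT_ID = "YOUR_PROMPT_ID"              # placeholder; the id used in the URL

# The shared config, with no prompt_id in the target: after this change,
# Portkey picks the prompt id up from the request URL on the fly.
config = {
    "cache": {"mode": "semantic"},
    "retry": {"attempts": 3},
    "strategy": {"mode": "single"},
    "targets": [
        {
            "provider": "openai",
            "virtual_key": "open-xyz",
            "override_params": {
                "model": "gpt-4-1106-preview",
                "response_format": {"type": "json_object"},
            },
        }
    ],
}

response = requests.post(
    f"https://api.portkey.ai/v1/prompts/{PROMPT_ID}/completions",
    headers={
        "x-portkey-api-key": PORTKEY_API_KEY,    # assumed auth header name
        "x-portkey-config": json.dumps(config),  # assumed config header name
        "Content-Type": "application/json",
    },
    json={"variables": {"user_input": "hello"}},  # assumed body shape
)
print(response.status_code, response.text)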
But you should not get this error anymore:
Plain Text
"{\"status\":\"failure\",\"message\":\"Portkey Error: You need to pass prompt_id along with virtual_key. Only virtual keys are not allowed in /v1/prompts config.\"}"
Amazing, great user experience, loved this!