Welcome to Portkey Forum


Defining primary and secondary models for prompt completion and fallback

Hi guys, is there any way to define my primary model in the UI (directly in the Portkey web app) and the secondary/fallback models in my code?
I am using prompt completions with fallback at the LLM level.
3 comments
Hey, the current way of doing this is to use configs with the prompt completions API. You can structure your config like this:
{
    "strategy": {
        "mode": "fallback"
    },
    "targets": [
        {
            "prompt_id": "prompt_id_1"
        },
        {
            "prompt_id": "prompt_id_1",
            "override_params": {
                "model": "model-2"
            }
        }
    ]
}


Here,
  • prompt_id_1 can be the same prompt for which you are making the prompt completions request.
  • The first target does no overrides, so it uses all the saved prompt values, including the model.
  • If it fails, the second target is used, which overrides the model via override_params.
You can attach this config to your prompt completions request. Here is an example: https://portkey.ai/docs/guides/use-cases/smart-fallback-with-model-optimized-prompts#fallback-configs-using-prompt-templates
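As a sketch, here is how that config could be built and attached in Python. The prompt IDs, model name, and API key are placeholders, and the commented SDK usage assumes the portkey_ai client's config parameter, which also accepts a saved config ID from the Portkey UI:

```python
import json

# Fallback config: the first target uses the prompt's saved model;
# the second retries the same prompt with an overridden model.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"prompt_id": "prompt_id_1"},
        {"prompt_id": "prompt_id_1", "override_params": {"model": "model-2"}},
    ],
}

# Hedged sketch of attaching the config (assumed SDK usage, not run here):
#
#   from portkey_ai import Portkey
#   client = Portkey(api_key="PORTKEY_API_KEY", config=fallback_config)
#   completion = client.prompts.completions.create(
#       prompt_id="prompt_id_1",
#       variables={"user_input": "hello"},
#   )

# The config can also be serialized for use in a request header or saved in the UI.
print(json.dumps(fallback_config["strategy"]))  # {"mode": "fallback"}
```

The key design point is that a target with no override_params inherits everything saved on the prompt template, so the primary model stays editable in the Portkey UI while the fallback model lives in code.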
Oh, thanks. This way also lets me change the main model directly in the Portkey UI.