Hey, the current way to do this is with configs in the prompt completions API. You can structure your config like this:
```json
{
  "strategy": {
    "mode": "fallback"
  },
  "targets": [
    {
      "prompt_id": "prompt_id_1"
    },
    {
      "prompt_id": "prompt_id_1",
      "override_params": {
        "model": "model-2"
      }
    }
  ]
}
```
Here, `prompt_id_1` can be the same prompt for which you are making the prompt completions request.
- The first target doesn't override anything, so it uses all the saved prompt values, including the model.
- If it fails, the second target is used, which overrides the model via `override_params`.
You can attach this config to your prompt completions request. Here is an example:
https://portkey.ai/docs/guides/use-cases/smart-fallback-with-model-optimized-prompts#fallback-configs-using-prompt-templates
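To make the fallback behavior concrete, here's a minimal Python sketch that builds the config above as a dict; the commented-out SDK call is a hypothetical illustration (it assumes the `portkey_ai` client, and the prompt ID, API key, and variable names are placeholders — check the docs link above for the exact request shape):

```python
import json

# The fallback config from above, built as a Python dict.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        # First target: no overrides, so the saved prompt values
        # (including the model) are used as-is.
        {"prompt_id": "prompt_id_1"},
        # Second target: same prompt, but the model is overridden.
        {"prompt_id": "prompt_id_1", "override_params": {"model": "model-2"}},
    ],
}

# Hypothetical sketch of attaching the config to a prompt completions
# request (placeholder names; see the docs for the exact SDK signature):
#
# from portkey_ai import Portkey
# client = Portkey(api_key="YOUR_API_KEY", config=fallback_config)
# completion = client.prompts.completions.create(
#     prompt_id="prompt_id_1",
#     variables={"user_input": "Hello"},
# )

print(json.dumps(fallback_config, indent=2))
```

The key point is that both targets reference the same prompt, so only the second one diverges from the saved prompt settings.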