Understanding the Use of Override Params in API Calls

I am trying to understand the use of override params in the following snippet:
Plain Text
{
  "strategy": {
      "mode": "fallback",
  },
  "targets": [
    {
      "virtualKey": "openai-virtual-key",
    },
    {
      "virtualKey": "anthropic-virtual-key",
      "override_params": {
          "model": "claude-1"
      }
    }
  ]
}

Here is how I understood this, please correct me:

  1. Ideally, the API call is made as a default call: it is instantiated with a virtual_key, and the model (assume gpt-3.5) is specified during the chat completion call.
  2. Now, when #1 tries and fails, the above strategy will be applied, meaning the call will be routed to target #1 above: openai-virtual-key + the model (assume gpt-3.5).
  3. Finally, if #2 fails, the same call is sent through anthropic-virtual-key. Since the model in the chat completions call is still gpt-3.5 at that point, it's important to override the model to claude.
3 comments
1 - If the Config is present in your request, any provider/model set elsewhere is ignored. So, the first call itself will be routed to 'openai-virtual-key' directly

2 - When #1 fails, second target, i.e. 'anthropic-virtual-key' is triggered. Portkey has default models set for all providers, which is why just having virtual_key in a target is sufficient sometimes. For example, the default model for OpenAI is gpt-3.5-turbo and for Anthropic, it is claude-2

3 - If you want to route to claude-2, you can get away with not setting the model through override_params here. Otherwise, you will need to set it.
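
For completeness, here is a minimal sketch of how a config like the one above can be attached to a single chat completions request. This is only an illustration: it assumes Python's requests library, the hosted gateway endpoint, and that the config is passed per-request via the x-portkey-config header, so verify the exact endpoint and header names against the docs.
Python
import json
import requests

# Fallback config: try OpenAI first, then Anthropic. override_params on the
# Anthropic target replaces the model so the gpt-3.5 name from the request
# body is not forwarded to it.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtualKey": "openai-virtual-key"},
        {
            "virtualKey": "anthropic-virtual-key",
            "override_params": {"model": "claude-2"},
        },
    ],
}

response = requests.post(
    "https://api.portkey.ai/v1/chat/completions",      # assumed gateway endpoint
    headers={
        "Content-Type": "application/json",
        "x-portkey-api-key": "YOUR_PORTKEY_API_KEY",   # placeholder
        "x-portkey-config": json.dumps(config),        # config sent with the request
    },
    json={
        # This model applies to the first (OpenAI) target; if the fallback
        # triggers, the Anthropic target uses claude-2 from override_params.
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(response.json())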
Thanks for the explanation, Vrushank

I have some docs-related requests:
  • A section that lists the default models as a reference in the docs. Please ignore if that's already available, I might've missed it.
  • A note on the #1 point you mentioned.
Makes sense! Will fix both. Thanks for pointing it out!