LLM model router implementation for query routing

Is there any LLM model router implementation that folks use for deciding which LLM to route a user's query to? Open-source preferred.
Hey @deepanshu_11, I have not seen much success with non-deterministic routers. Portkey lets you conditionally route your requests based on metadata attached to a user query.

A simple use case for this would be:
  • all free users of the app use GPT-4o
  • all paid users use the o1 model
You can send this metadata with Portkey's API; a rough sketch is below.
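
For illustration, here's roughly what such a config could look like (the metadata key `user_plan`, the target names, and the virtual keys are placeholders I made up; treat the exact schema in the docs linked below as authoritative):

```python
# Sketch of a Portkey conditional-routing config, built as a plain dict.
# NOTE: "user_plan", the target names, and the virtual keys are placeholders.
routing_config = {
    "strategy": {
        "mode": "conditional",
        "conditions": [
            # Paid users -> o1
            {"query": {"metadata.user_plan": {"$eq": "paid"}}, "then": "o1-target"},
            # Free users -> GPT-4o
            {"query": {"metadata.user_plan": {"$eq": "free"}}, "then": "gpt4o-target"},
        ],
        # Fallback target when no condition matches
        "default": "gpt4o-target",
    },
    "targets": [
        {
            "name": "gpt4o-target",
            "virtual_key": "openai-virtual-key",  # placeholder
            "override_params": {"model": "gpt-4o"},
        },
        {
            "name": "o1-target",
            "virtual_key": "openai-virtual-key",  # placeholder
            "override_params": {"model": "o1"},
        },
    ],
}
```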

Here is the link to Portkey's conditional routing docs: https://portkey.ai/docs/product/ai-gateway/conditional-routing
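
And a minimal sketch of attaching that metadata from the Python SDK (assuming the `portkey-ai` package; `with_options` and the exact parameter names are from memory, so double-check them against the docs above):

```python
from portkey_ai import Portkey

# Assumed setup: the conditional-routing config from the sketch above is
# passed inline (a saved config ID would also work here).
client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    config=routing_config,
)

# The metadata attached to the request is what the gateway's conditions
# match against; "user_plan" is the placeholder key from the config sketch.
response = client.with_options(
    metadata={"user_plan": "paid"}  # this request should route to the o1 target
).chat.completions.create(
    messages=[{"role": "user", "content": "Summarize this ticket for me."}],
)

print(response.choices[0].message.content)
```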