Welcome to Portkey Forum


Prompting with Google file bucket URLs on other LLMs

Hi there, I'm trying to figure out how the fallback strategy works with vision on Gemini and other LLMs. I've posted a GitHub issue but got no response (it seems the issues are not very actively monitored):

https://github.com/Portkey-AI/gateway/issues/721

I think your config system is fantastic, but fallback, loadbalance, or really any strategy that uses multiple LLMs is incompatible with mixing Gemini and other providers (say Gemini + OpenAI).

The reason is that Gemini requires image URLs to point to a Google storage bucket, so you provide gs://... URLs (or use Google's internal file manager, which has the same issue). In either case those URLs are not accessible by other LLMs, since they are not public links, just internal Google bucket links.

So, how could we make one call that passes Gemini-compatible URLs plus public URLs for the other LLMs? Right now this seems impossible. We would need a way to pass both kinds of URLs in a special way in the "image_url" param.
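To make the mismatch concrete: in OpenAI-style chat requests the image rides in an `image_url` content part, while Gemini's native `generateContent` takes a `file_data` part holding a `gs://` (or File API) URI. A rough sketch of the two shapes, with illustrative URLs and field names taken from each provider's public API docs:

```python
# The same logical request, expressed in the two providers' native shapes.
# URLs are illustrative; neither payload is valid for the other provider.

# OpenAI-style chat message: a public https image URL inside an image_url part.
openai_message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
    ],
}

# Gemini generateContent content: a file_data part with a gs:// / File API URI
# that only Google's backend can resolve.
gemini_content = {
    "role": "user",
    "parts": [
        {"text": "Describe this image."},
        {"file_data": {"mime_type": "image/png", "file_uri": "gs://my-bucket/cat.png"}},
    ],
}
```

A single request body can only carry one of these image references, which is exactly why a multi-provider strategy breaks here.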

We can partly work around it if we only send to one LLM and don't use any config (which is sad, since configs are a key selling point of Portkey): we know which LLM we are sending to, so we can prepare the data correctly.

However, if we want to use configs, the problem gets worse: with a config ID we don't even know whether the Portkey backend will send to non-Google LLMs. We really don't know anything, since that is handled completely transparently by Portkey, and only admin-level clients can even see the config settings.
19 comments
Hey @Cookie Monster, the gateway can’t store the image in the Google bucket for you.
The requirement is that you send different URLs.
Gemini also supports public URLs, so there is no way the gateway could know beforehand what the user’s intention is.
You can use metadata-based conditional routing if that suits you better.
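If the client knows at request time which kind of URL it is sending, metadata-based conditional routing could pick the matching provider. A rough sketch, assuming Portkey's conditional-routing config shape (`mode: "conditional"` with `query`/`then` conditions over request metadata); the metadata key, virtual keys, and target names are all placeholders:

```python
# Hypothetical conditional-routing config: route on a client-supplied
# metadata field that says which kind of image URL the request carries.
conditional_config = {
    "strategy": {
        "mode": "conditional",
        "conditions": [
            # Requests tagged image_host=gcs (gs:// links) go to the Gemini target.
            {"query": {"metadata.image_host": {"$eq": "gcs"}}, "then": "gemini-target"},
            # Requests tagged image_host=public (https links) go to the OpenAI target.
            {"query": {"metadata.image_host": {"$eq": "public"}}, "then": "openai-target"},
        ],
        "default": "openai-target",
    },
    "targets": [
        {"name": "gemini-target", "virtual_key": "google-vk-xxx"},
        {"name": "openai-target", "virtual_key": "openai-vk-xxx"},
    ],
}
```

Note this routes one request to one provider based on a tag the caller supplies; it does not solve fallback across providers, since the single request body still carries only one URL.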
No, I think you misunderstand me.

I already have a Google bucket URL. However, Google and other LLMs take different image_urls.

How can the fallback or load balance strategy work across different LLMs in this case?
Google takes gs:// URLs, while other LLMs take https URLs.
How can I fall back from OpenAI to Gemini if they accept different URL schemes?
Gemini does NOT support public URLs.
Try sending an https-linked image to Gemini and it will reject you.
It only accepts URLs that are either:
  1. From the free Google file manager (which provides a Google-internal link and gives users 20 GB of free upload). Once an image is uploaded there, it is not publicly accessible.
  2. Google storage bucket URLs.
It doesn't accept arbitrary https image URLs the way OpenAI does.
Do you have a corresponding http(s) URL for the asset in the Google bucket?
You can’t expect the same URL to work for both.
You can consider using a conditional router and sending a config object with every request.
Use override_params inside the config object and use it to override the image_url param in your request for a specific target.
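A sketch of what such a config might look like, assuming Portkey's per-target `override_params` field replaces matching request parameters for that target; the virtual keys, bucket path, and message content here are placeholders, not a tested config:

```python
# Hypothetical fallback config: OpenAI primary, Gemini fallback, with the
# Gemini target's messages overridden to reference the gs:// copy of the image.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {
            # Primary: OpenAI receives the request body as sent, with the
            # public https image URL intact.
            "virtual_key": "openai-vk-xxx",
        },
        {
            # Fallback: for Gemini, override the messages param so the image
            # part points at the private gs:// object instead.
            "virtual_key": "google-vk-xxx",
            "override_params": {
                "messages": [
                    {
                        "role": "user",
                        "content": [
                            {"type": "text", "text": "Describe this image."},
                            {
                                "type": "image_url",
                                "image_url": {"url": "gs://my-bucket/cat.png"},
                            },
                        ],
                    }
                ]
            },
        },
    ],
}
```

The catch is that the override is static per config, so the client would build and attach this config object per request, filling in the request-specific gs:// URL.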
> Use override params inside the config object and use it to override the image_url param in your request for a specific target

I see, this could work. I didn't really see documentation on overriding params per provider; could you send me the link, or a quick example?
There are a few examples at the bottom too.
We can get on a quick 5-minute call if you need help.