Any ETA for this support?

Any ETA for this support?
Not at the moment. But we'd love to have a community contribution for this.
OK, I'll check it. One question about this, based on what I can see in the code: the payload format for embeddings in most of the providers is something like model and input, where input is a list of strings to embed. In the case of Vertex, the payload is a bit different: it is an instances object with a list of objects containing the different texts and the task type. Is it OK to adapt the config for this provider, or do I need to follow the same model/input format for the PR to be approved? I know you can transform it, but I expect people to just follow the same payload format as in the Vertex documentation when they submit requests.
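For reference, here is a rough sketch of the two payload shapes being compared. Field names follow the public OpenAI and Vertex AI embeddings docs; the model names and values are illustrative, not taken from the PR.

```python
# Sketch of the two embedding request payloads being discussed.
# Field names follow the public OpenAI and Vertex AI docs; values are illustrative.

# OpenAI-style payload (what most gateway providers expect):
# a model plus a flat list of strings to embed.
openai_style_payload = {
    "model": "text-embedding-3-small",
    "input": ["first text to embed", "second text to embed"],
}

# Vertex AI-style payload: an "instances" list of objects, each carrying the text
# and an optional task_type (e.g. RETRIEVAL_QUERY, RETRIEVAL_DOCUMENT, ...).
vertex_style_payload = {
    "instances": [
        {"content": "first text to embed", "task_type": "RETRIEVAL_DOCUMENT"},
        {"content": "second text to embed", "task_type": "RETRIEVAL_DOCUMENT"},
    ],
}
```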
This is spectacular. Thank you so much @elentaure.!! Really appreciate it
Glad it worked out!
@sega will review & merge it shortly
Added a few comments @elentaure.
Regarding this: the gateway wouldn't be able to support the task_type parameter, since doing so would mean diverging from the OpenAI format. Because task_type is optional, it shouldn't be an issue.
I understand that, but as mentioned, transforming this would mean losing functionality from the Vertex API. The format is not the same as OpenAI's because it's a different product. Why shouldn't the consumer be able to leverage the API as Google intended? Even if the field is optional, we would be limiting the capabilities of Vertex embeddings, and the task type would always be RETRIEVAL_QUERY.
The intention of my implementation is to be able to send as input the payload that Vertex requires in its API. That way, checking the official documentation, for example, should be enough to generate a valid payload.
Is there a reason to support only the OpenAI format, regardless of the provider and the model?
OpenAI-supported parameters cover the vast majority (>95%) of the essential functionality required for interfacing with AI models.
since it's an optional param, it shouldn't affect how the embeddings are created
@elentaure. Do you plan to use this exclusively through the API or through the SDK also?
Let me ask internally. Would diverging from the OpenAI format break the functionality in the SDK?
We could add support for this in the SDK and the gateway. So far we haven't diverged from the OpenAI format because we didn't find any edge case compelling enough to do so. If passing task type is essential for you, we can discuss it internally and implement it.
Sounds good, let me ask and I'll let you know. Some of my colleagues are already in contact via email as well.
Thanks for the support
Hey @elentaure., checked with the team: we can support task_type. In fact, Portkey supports additional params for a lot of providers; added the comment here:
https://github.com/Portkey-AI/gateway/pull/622#discussion_r1778200276
And this works with the SDK as well (extra params are passed as **kwargs)
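For anyone landing here later, a minimal sketch of what passing the extra param through the Python SDK could look like, assuming the Portkey SDK's OpenAI-compatible embeddings call. The key names, model name, and response shape here are illustrative assumptions, not documented SDK behavior.

```python
# Minimal sketch: forwarding task_type as an extra kwarg through the Portkey Python SDK.
# Assumes an OpenAI-compatible embeddings.create call; keys and model name are illustrative.
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",         # placeholder
    virtual_key="VERTEX_VIRTUAL_KEY",  # placeholder pointing at a Vertex AI provider
)

response = portkey.embeddings.create(
    model="text-embedding-004",        # illustrative Vertex embedding model name
    input=["first text to embed"],
    task_type="RETRIEVAL_DOCUMENT",    # extra provider param, passed through as a kwarg
)

# Assuming an OpenAI-compatible response shape:
print(response.data[0].embedding[:8])
```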
Cool, I'll check it.