Welcome to Portkey Forum

ruu8010
I'm trying to create a virtual key to connect to my models deployed in Azure AI Foundry. When I use Azure Inference as the provider, I can only select a model from a default list, and I don't see my actual deployed models there. How can I make this work? For example, I have Llama 3.3 deployed in Azure AI Foundry; how can I create a virtual key to connect to that model?
8 comments
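For context, a minimal sketch of what calling such a deployment through Portkey could look like, assuming a virtual key has already been created for the Azure AI Inference provider; the virtual key ID and deployment name below are placeholders, not confirmed values:

```python
# Minimal sketch: calling an Azure AI Foundry deployment through a Portkey
# virtual key. "azure-llama-vkey" and the model/deployment name are
# placeholders for whatever was configured in the Portkey dashboard.
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",       # your Portkey API key
    virtual_key="azure-llama-vkey",  # virtual key for the Azure AI Inference provider
)

response = portkey.chat.completions.create(
    # Pass your actual deployment name here rather than picking from the
    # default model list.
    model="Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Hello from Portkey"}],
)
print(response.choices[0].message.content)
```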
Can someone explain how I can call a prompt endpoint from my OpenWebUI client using a function? Just changing the endpoint in the plugin doesn't seem to work and returns a 500 error.
2 comments
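For reference, a minimal sketch of calling Portkey's prompt completions endpoint directly; an OpenWebUI function (pipe) would wrap a request like this rather than only swapping the base URL, since the prompts route uses a different path and payload than /chat/completions. The prompt ID and variable name below are placeholders:

```python
# Minimal sketch: POSTing to a Portkey prompt endpoint with requests.
# "pp-my-prompt-123" and the "user_input" variable are placeholders for
# your own prompt ID and prompt variables.
import requests

PORTKEY_API_KEY = "PORTKEY_API_KEY"
PROMPT_ID = "pp-my-prompt-123"

resp = requests.post(
    f"https://api.portkey.ai/v1/prompts/{PROMPT_ID}/completions",
    headers={
        "x-portkey-api-key": PORTKEY_API_KEY,
        "Content-Type": "application/json",
    },
    # Prompt templates take their inputs via "variables", not "messages".
    json={"variables": {"user_input": "Hello"}},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```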