Customizing Models with Portkey

How can I support custom models w/ Portkey?
Hmm. So I need to host it, and protect it w/ an API key, then route it all through you. Is that right?
Portkey doesn't do any inference itself.
We work as a middleware gateway + observability layer.
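In practice you keep calling your provider as usual and just point your SDK at Portkey instead of the provider's endpoint. A rough sketch with the OpenAI SDK is below; the base URL and header names are what I recall from the docs, so treat them as assumptions and double-check against the current documentation:

```typescript
import OpenAI from "openai";

// Point the standard OpenAI SDK at Portkey's hosted gateway instead of
// api.openai.com. Portkey forwards the call to the upstream provider and
// records logs/metrics along the way -- it never runs the model itself.
// NOTE: base URL and header names below are assumptions from the docs.
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,            // your provider key, unchanged
  baseURL: "https://api.portkey.ai/v1",          // hosted gateway endpoint (assumed)
  defaultHeaders: {
    "x-portkey-api-key": process.env.PORTKEY_API_KEY!, // your Portkey account key
    "x-portkey-provider": "openai",              // which upstream to route to
  },
});

const completion = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello through the gateway" }],
});
console.log(completion.choices[0].message.content);
```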
What if I don't want to make the model inference available through a public gateway?
You can run the Portkey Gateway locally with npx @portkey-ai/gateway and route to your inference API without exposing it publicly.
The caveat is that this is the open-source version; it doesn't come with observability or a dashboard yet.
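For example, something like this should work once the gateway is running locally. The default port (8787) and the header names, the custom-host header in particular, are my best guess from the gateway README, and the private inference URL and model name are just placeholders, so verify before relying on them:

```typescript
import OpenAI from "openai";

// The gateway started by `npx @portkey-ai/gateway` listens on localhost
// (port 8787 by default, as far as I know), so traffic never leaves your
// network. The custom-host header is how I understand you point the gateway
// at an arbitrary OpenAI-compatible inference server -- treat it as an
// assumption and check the gateway docs.
const client = new OpenAI({
  apiKey: "sk-local-placeholder",                  // whatever your own server expects
  baseURL: "http://localhost:8787/v1",             // local Portkey Gateway (assumed port)
  defaultHeaders: {
    "x-portkey-provider": "openai",                // treat the target as OpenAI-compatible
    "x-portkey-custom-host": "http://10.0.0.5:8000/v1", // hypothetical private inference server
  },
});

const reply = await client.chat.completions.create({
  model: "my-custom-model",                        // hypothetical model name on your server
  messages: [{ role: "user", content: "ping" }],
});
console.log(reply.choices[0].message.content);
```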
Okay, so self-host?
But not the full Portkey experience, just the Gateway.
We are working on a way to let you run Portkey locally and still plug into our hosted dashboard + observability suite.
This is coming fairly soon!