Customizing Models with Portkey
ferpaderpa
3 days ago
How can I support custom models w/ Portkey?
Vrushank | Portkey
edited 3 days ago
Hey! Does this do the job?
https://portkey.ai/docs/integrations/llms/byollm
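A minimal sketch of what that BYO LLM setup can look like, using the OpenAI Node SDK (TypeScript) pointed at Portkey's hosted gateway. The x-portkey-* header names and the example URLs/keys are assumptions drawn from that doc page, not verified code, so check the linked doc for the exact parameters.

// Sketch only: route requests through Portkey's hosted gateway to a
// self-hosted, OpenAI-compatible model endpoint. The x-portkey-* header
// names below are assumptions based on the BYO LLM doc linked above.
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.MY_MODEL_API_KEY ?? "",        // the key protecting your own endpoint
  baseURL: "https://api.portkey.ai/v1",              // Portkey's hosted gateway
  defaultHeaders: {
    "x-portkey-api-key": process.env.PORTKEY_API_KEY ?? "",
    "x-portkey-provider": "openai",                  // your server speaks the OpenAI API shape
    "x-portkey-custom-host": "https://models.example.com/v1", // hypothetical self-hosted URL
  },
});

async function main() {
  const res = await client.chat.completions.create({
    model: "my-custom-model",                        // whatever model name your server expects
    messages: [{ role: "user", content: "Hello via Portkey" }],
  });
  console.log(res.choices[0].message.content);
}
main();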
ferpaderpa
3 days ago
Hmm. So I need to host it, and protect it w/ an API key, then route it all through you. Is that right?
Vrushank | Portkey
3 days ago
Correct!
Vrushank | Portkey
3 days ago
Portkey doesn't do any inferencing itself
Vrushank | Portkey
3 days ago
We work as a middleware gateway + observability layer
ferpaderpa
3 days ago
What if I don't want to make the model inference available through a public gateway?
Vrushank | Portkey
3 days ago
You can run the Portkey Gateway locally with npx @portkey-ai/gateway and route to your inference API without exposing it publicly
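A rough sketch of that local flow, assuming the gateway started by npx @portkey-ai/gateway listens on http://localhost:8787/v1 (check what the CLI prints on startup) and accepts the same x-portkey-* headers as the hosted gateway; the internal model URL below is hypothetical.

// Start the gateway first:  npx @portkey-ai/gateway
// Then point the client at it; nothing here is exposed publicly.
import OpenAI from "openai";

const local = new OpenAI({
  apiKey: process.env.MY_MODEL_API_KEY ?? "",        // key for your private inference API
  baseURL: "http://localhost:8787/v1",               // assumed default local gateway address
  defaultHeaders: {
    "x-portkey-provider": "openai",                  // your endpoint speaks the OpenAI API shape
    "x-portkey-custom-host": "http://10.0.0.5:8000/v1", // internal-only endpoint (hypothetical)
  },
});

async function main() {
  const res = await local.chat.completions.create({
    model: "my-custom-model",
    messages: [{ role: "user", content: "ping" }],
  });
  console.log(res.choices[0].message.content);
}
main();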
Vrushank | Portkey
3 days ago
The caveat is that this is the open-source version; it doesn't come with observability or a dashboard yet.
ferpaderpa
3 days ago
Okay, so self-host?
Vrushank | Portkey
3 days ago
Yes!
Vrushank | Portkey
3 days ago
But not the full Portkey experience. Just the Gateway
ferpaderpa
3 days ago
Hmm, okay
Vrushank | Portkey
3 days ago
We are working on a way to let you run Portkey locally and still plug into our hosted dashboard + observability suite
Vrushank | Portkey
3 days ago
This is coming fairly soon!