Monitor logs to Llama3 (locally)

Trying to do: monitor logs for requests to Llama3 (through Ollama) using Portkey

Problem:
I set up ngrok with Llama3 running locally. When I try to run a chat completions call using Portkey, I see the following response:
Plain Text
bun ollama-llama3.js
reaching llama3
{
  provider: "ollama",
  getHeaders: [Function: getHeaders],
}


Code
Plain Text
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  provider: 'ollama',
  // ngrok tunnel to the local Ollama server (no trailing space in the URL)
  customHost: 'https://6b73-165-1-160-105.ngrok-free.app',
  traceID: 'ollama-llama3'
});

console.log('reaching llama3');

const chatCompletion = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'llama3'
});

console.log(chatCompletion);


Expecting:
To see some kind of completions response saying it's a test.
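
For reference, a successful call would print an OpenAI-compatible completion object, roughly shaped like the illustration below (field values are made up, not actual output):
Plain Text
{
  id: 'chatcmpl-...',
  object: 'chat.completion',
  model: 'llama3',
  choices: [
    {
      index: 0,
      message: { role: 'assistant', content: 'This is a test.' },
      finish_reason: 'stop'
    }
  ]
}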
10 comments
Additional observation: there is no activity visible in the Portkey UI.
Is it working directly without using Portkey?
Can you please also try restarting your Ollama server?
I tried restarting the Ollama server.

Yes, Llama3 works without Portkey (I can chat with it in the terminal).
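
Chatting in the terminal exercises the local Ollama server but not the ngrok tunnel, so it may also be worth hitting the tunnel directly, bypassing Portkey. A minimal sketch, assuming the ngrok URL from the snippet above and Ollama's standard /api/chat endpoint:
Plain Text
// check-ollama.js: bypass Portkey and call Ollama through the ngrok tunnel.
// The URL is the ngrok address from the snippet above; substitute your own.
const response = await fetch('https://6b73-165-1-160-105.ngrok-free.app/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: false
  })
});

console.log(await response.json());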
Confirming that this should work. 🧐 Maybe an issue binding to the correct URL, or some auth issue?
Let me DM you 🙂
This looks like an SDK issue. The API call works fine, but the response doesn't come through when using the SDK. Checking this.
Found a bug in the SDK: customHost is absent from the headers when the request is sent to the AI gateway. (Fix PR is on the way.)
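
Until the fix lands, one possible workaround is to send the custom host header to the gateway yourself over the REST API. A hedged sketch, assuming the gateway honors the x-portkey-custom-host header alongside x-portkey-api-key and x-portkey-provider:
Plain Text
// workaround.js: call the Portkey gateway REST API directly,
// supplying the custom host header that the SDK was dropping.
// The x-portkey-custom-host header name is an assumption based on
// Portkey's x-portkey-* header convention.
const response = await fetch('https://api.portkey.ai/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-portkey-api-key': process.env.PORTKEY_API_KEY,
    'x-portkey-provider': 'ollama',
    'x-portkey-custom-host': 'https://6b73-165-1-160-105.ngrok-free.app'
  },
  body: JSON.stringify({
    model: 'llama3',
    messages: [{ role: 'user', content: 'Say this is a test' }]
  })
});

console.log(await response.json());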
This was merged and released. Thanks for the PR!