Trying to do: Log Llama3 calls (through Ollama) using Portkey
Problem:
I set up ngrok and have llama3 running locally. When I run a chat completions call through Portkey, I see the following response:
bun ollama-llama3.js
reaching llama3
{
provider: "ollama",
getHeaders: [Function: getHeaders],
}
Code:
const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  provider: 'ollama',
  customHost: 'https://6b73-165-1-160-105.ngrok-free.app ',
  traceID: 'ollama-llama3'
});

console.log('reaching llama3');

const chatCompletion = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'llama3'
});

console.log(chatCompletion);
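One thing I noticed (I'm not sure whether it is the cause, or whether the SDK normalizes it): the customHost string above ends with a trailing space. If that value were joined with the request path as-is, the resulting URL would be malformed. A quick sketch of the difference, using a hypothetical path just for illustration:

```javascript
// The customHost value as written above, with its trailing space.
const customHost = 'https://6b73-165-1-160-105.ngrok-free.app ';
const path = '/api/chat'; // hypothetical path, only to illustrate the join

// Naive concatenation keeps the stray space inside the URL:
console.log(JSON.stringify(customHost + path));
// → "https://6b73-165-1-160-105.ngrok-free.app /api/chat"

// Trimming first yields a clean URL:
console.log(JSON.stringify(customHost.trim() + path));
// → "https://6b73-165-1-160-105.ngrok-free.app/api/chat"
```

Removing the trailing space from the customHost value might be worth trying before anything else.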
Expecting:
A chat completions response saying it's a test.