Welcome to Portkey Forum

Updated 4 weeks ago

virtual key creation

Not able to create virtual keys
41 comments
hey @s44002! Can you provide more details? For example, for which provider?
i tried anthropic
and Google - ai.google.dev not vertexai
all 3 have working tested keys
that get rejected on creation of virtual keys
I am assuming you used Portkey dashboard for creating these?
to create the virtual keys yes
the API keys, i had them earlier bcs of other work stuff
checking this on priority
we found the issue. Pushing a fix. Thanks for notifying us about this
hey @s44002 this should be fixed. can you refresh your cache and try again?
is the Portkey framework absolutely stable for production use?
hey @s44002
The part that handles all your actual LLM traffic (the Gateway) is totally solid. It's the backbone that's been running smoothly for ages now and has served more than 2 billion API requests.
What you ran into was just a UI hiccup in our dashboard where you set things up. None of your production stuff or Virtual Keys were affected since the Gateway runs completely separate from the dashboard.
The Gateway gets super rigorous testing before any updates, since we know how critical it is for your production workloads. While the UI might hit some weird edge cases as we keep building it out, we designed it specifically so that it can't mess with your actual API calls, by keeping the Gateway stateless.
Bottom line: your production stuff is 100% safe and stable. The UI thing was just surface-level.
Please let me know if that makes sense.
PS: we openly display our service status at https://status.portkey.ai/
Context: I have had enough problems with OpenRouter not sending a response altogether, or sending a very malformed response.
the Logs part shows the actual LLM calls that go through the Gateway
it does make sense
we try to solve any issues as soon as possible and will definitely take feedback positively. If you are OK to share, what exactly is the issue you faced with OpenRouter?
when I sent a request where tools were called in parallel and I sent their responses back accordingly, I received no response from their API at all
and I was stuck hanging for quite a while
once, the tool-use ID parameter came back so long that it ate up the entire context window, like LegitidKQKQKQKQKQKQKKQKQKQKQKQKQ and so on
I have had many cases where the content generated was None. I used it through the OpenAI SDK ofc, so idk if that was the problem
and my application requires streaming to be off
hmm. You can try using the OpenAI SDK with the base URL changed to the Portkey endpoint and check if you find any difference between the two responses. For context, we support OpenRouter, but for us it acts as a provider
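For context on what that swap involves: routing through Portkey amounts to changing the base URL and sending two extra headers. A minimal sketch of those headers, written out by hand (Portkey's `createHeaders()` helper from the `portkey-ai` package builds roughly this for you; the key value is a placeholder, not a real credential):

```javascript
// Portkey's OpenAI-compatible Gateway endpoint.
const PORTKEY_GATEWAY_URL = 'https://api.portkey.ai/v1';

// Sketch of the headers the Gateway expects: your Portkey API key,
// plus the provider to route to (here OpenRouter, to compare responses).
function portkeyHeaders(provider, portkeyApiKey) {
  return {
    'x-portkey-api-key': portkeyApiKey, // placeholder — use your real Portkey key
    'x-portkey-provider': provider,     // e.g. 'openrouter'
  };
}

// The OpenAI SDK call itself stays unchanged — only the client config differs:
// const openai = new OpenAI({
//   baseURL: PORTKEY_GATEWAY_URL,
//   defaultHeaders: portkeyHeaders('openrouter', 'PORTKEY_API_KEY'),
// });
```

This makes the comparison easy: same SDK, same request body, with and without the Gateway in between.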
what is the exact difference
other than the imports
between using the portkey lib
and openai sdk
It's just a convenient way to use Portkey without importing our lib and without changing any of your existing code that uses the OpenAI SDK. Let us know if you need any help regarding this
but except for the change in imports
does anything else in the code need to change to move to portkey?
No major changes in code. You need to add two parameters:
the Portkey API key and the Portkey base URL
import OpenAI from 'openai';

import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'
const openai = new OpenAI({
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    provider: "openai",
    apiKey: "PORTKEY_API_KEY" // defaults to process.env["PORTKEY_API_KEY"]
  })
});

async function main() {
  const stream = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

main();

you can find more examples here https://portkey.ai/docs/integrations/llms/openai
also, tool calling is supported on Portkey for Anthropic, right?
@s44002 yes. we do
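To illustrate how that looks in practice: with Anthropic as the provider, you keep the OpenAI-style `tools` schema and the Gateway translates it. A hedged sketch — the tool name, model string, and parameters below are illustrative, not from the thread:

```javascript
// OpenAI-format tool definition; Portkey maps this to Anthropic's tool-use
// format when the provider is set to "anthropic".
const tools = [
  {
    type: 'function',
    function: {
      name: 'get_weather', // hypothetical tool for illustration
      description: 'Get the current weather for a city',
      parameters: {
        type: 'object',
        properties: { city: { type: 'string' } },
        required: ['city'],
      },
    },
  },
];

// Passed to the same chat.completions.create call as in the earlier example,
// just with provider: "anthropic" in createHeaders and an Anthropic model:
// const res = await openai.chat.completions.create({
//   model: 'claude-3-5-sonnet-20240620', // example model name
//   messages: [{ role: 'user', content: 'Weather in Paris?' }],
//   tools,
// });
```

The point is that moving between providers doesn't require rewriting the tool schema in each provider's native format.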