Welcome to Portkey Forum

Updated 8 months ago

Navigating the Portkey Gateway with Instructor JS: Addressing the Max_Tokens Requirement

At a glance
Unable to use the Portkey gateway with Instructor JS because max_tokens seems to be mandatory; code is in the thread.
Plain Text
const Instructor = require("@instructor-ai/instructor");
const OpenAI = require("openai");
const { z } = require("zod");

const { createHeaders } = require('portkey-ai');
// Run the Portkey gateway locally first: npx @portkey-ai/gateway
const PORTKEY_GATEWAY_URL = "http://localhost:8787/v1";
console.log(PORTKEY_GATEWAY_URL);
const portkeyClient = new OpenAI({
  apiKey: process.env.ANTHROPIC_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    provider: "anthropic",
    apiKey: "PORTKEY_API_KEY",
    model: "claude-3-sonnet-20240229",
    max_tokens: 512
  })
});

const oai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY ?? undefined,
  organization: process.env.OPENAI_ORG_ID ?? undefined
});

const client = Instructor.default({
  client: portkeyClient,
  mode: "TOOLS"
});

const UserSchema = z.object({
  // Description will be used in the prompt
  age: z.number().describe("The age of the user"),
  name: z.string()
})


// User will be of type z.infer<typeof UserSchema>
async function main() {
  const user = await client.chat.completions.create({
    messages: [{ role: "user", content: "Jason Liu is 30 years old" }],
    model: "gpt-3.5-turbo",
    response_model: {
      schema: UserSchema,
      name: "User"
    }
  });

  console.log(user);
}
// { age: 30, name: "Jason Liu" }
main()
Bad flag on my part: I could provide max_tokens in the completions API call, though it still fails.
Attachments
Screenshot_2024-05-09_at_5.03.54_PM.png
Screenshot_2024-05-09_at_5.02.23_PM.png
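For reference, supplying max_tokens per request (instead of in the gateway headers) would look roughly like the sketch below. This is a hypothetical illustration of the request body only; no network call is made, and `buildRequest` is a helper invented for this example, with field names following the OpenAI-compatible schema.

```javascript
// Hypothetical sketch: building the OpenAI-compatible request body with
// max_tokens supplied per request. Anthropic requires max_tokens on every
// request, so it has to end up in the body one way or another.
function buildRequest(messages, overrides = {}) {
  return {
    model: "claude-3-sonnet-20240229",
    max_tokens: 512, // required by Anthropic's API
    messages,
    ...overrides,
  };
}

const body = buildRequest([{ role: "user", content: "Jason Liu is 30 years old" }]);
console.log(body.max_tokens); // 512
```

The same object could be spread into `client.chat.completions.create(...)`, which is where the screenshots above suggest Instructor JS drops or mishandles the field.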
Tried this out; it looks to be an Instructor JS issue. Anthropic is supported only through the llm client library and not natively. Need to check whether it works with that.
Confirming that Portkey works with Instructor Python - Anthropic
@Harsh Gupta lmk if this helps!
Really appreciate the quick support πŸ™‚
Two things:
  • max_tokens shouldn't be a mandatory parameter; it isn't in the native clients, and max_tokens appears to be mandatory or optional across providers in an inconsistent way.
  • Might be worth making a PR to the Instructor library. The better Portkey works with the ecosystem, the better it is.
max_tokens - yeah, this is a deliberate choice. Only Anthropic expects max_tokens to be mandatory. Hence, we don't enforce it at our level, but for Anthropic you'd need to provide it (whether you use Portkey or Anthropic directly).

Our basic philosophy here is not to modify LLM/provider behaviour in any way, but to serve it as is.
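That pass-through philosophy can be sketched as follows. This is an illustrative toy, not Portkey source code; the provider set and `validateRequest` helper are assumptions made up for the example.

```javascript
// Illustrative toy, not Portkey source: a pass-through gateway surfaces each
// provider's own rules instead of normalizing them away.
const REQUIRES_MAX_TOKENS = new Set(["anthropic"]);

function validateRequest(provider, body) {
  if (REQUIRES_MAX_TOKENS.has(provider) && body.max_tokens == null) {
    throw new Error(provider + " requires max_tokens on every request");
  }
  return body; // forwarded unmodified
}

validateRequest("openai", { messages: [] }); // fine: max_tokens is optional for OpenAI
try {
  validateRequest("anthropic", { messages: [] });
} catch (e) {
  console.log(e.message); // anthropic requires max_tokens on every request
}
```

The point of the design is that the error you see through the gateway is the same one you would see calling Anthropic directly.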
Instructor - Absolutely! Really excited about Instructor cloud; thinking that an integration with that would be simpler and better!