Welcome to Portkey Forum

Harold Senzei
Offline, last seen 23 hours ago
Joined November 21, 2024
Hey, when using prompt render with response_mode json set in the template, I get a new parameter in the response body:
Plain Text
  "response_format": {
      "type": "json_object"
    },
but when using the Python SDK, the PromptRenderData object does not have that field. Is that expected?
Plain Text
class PromptRenderData(BaseModel):
    messages: Optional[List[ChatCompletionMessage]] = None
    prompt: Optional[str] = None
    model: Optional[str] = None
    suffix: Optional[str] = None
    max_tokens: Optional[int] = None
    temperature: Optional[float] = None
    top_k: Optional[int] = None
    top_p: Optional[float] = None
    n: Optional[int] = None
    stop_sequences: Optional[List[str]] = None
    timeout: Union[float, None] = None
    functions: Optional[List[Function]] = None
    function_call: Optional[Union[None, str, Function]] = None
    logprobs: Optional[bool] = None
    top_logprobs: Optional[int] = None
    echo: Optional[bool] = None
    stop: Optional[Union[str, List[str]]] = None
    presence_penalty: Optional[int] = None
    frequency_penalty: Optional[int] = None
    best_of: Optional[int] = None
    logit_bias: Optional[Dict[str, int]] = None
    user: Optional[str] = None
    organization: Optional[str] = None
    tool_choice: Optional[Union[None, str]] = None
    tools: Optional[List[Tool]] = None

How do I access it?
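One workaround, as a sketch rather than Portkey-specific guidance: if the typed model drops the unknown key, the raw JSON body of the render response still carries it, so you can read response_format from the parsed dict yourself. The body below is a hypothetical render response, made up for illustration:

```python
import json

# Hypothetical raw render-response body (field names taken from the post,
# values invented for illustration).
raw_body = """
{
  "messages": [{"role": "user", "content": "hi"}],
  "model": "gpt-4o",
  "response_format": {"type": "json_object"}
}
"""

data = json.loads(raw_body)
# The typed PromptRenderData model has no response_format field, but the
# raw JSON still carries it, so read it straight from the dict.
response_format = data.get("response_format")
print(response_format)  # {'type': 'json_object'}
```

Whether you can reach the raw HTTP body from the SDK is an assumption here; if not, the field may be easier to get from a direct call to the render endpoint.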
3 comments
When I am using JSON mode for the prompt library and I have an empty variable, the render endpoint returns a 502.

Example:

Create a prompt template with this:

Plain Text
[{
  "content": [
    {
      "type": "text",
      "text": "You are a helpful AI assistant. My name is {{name}}"
    }
  ],
  "role": "system"
},{{history_messages}},{
  "content": [
    {
      "type": "text",
      "text": "{{input}}"
    }
  ],
  "role": "user"
}]


If you call the render endpoint with
Plain Text
variables= {"name" : "portkey"}

you get a 500 error.


If you call with
Plain Text
variables= {"name" : "portkey", "history_messages" : []}

you get a 502 error.

So in this case, how do I call it with an empty history_messages?
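For what it's worth, here is a guess at the failure mode, assuming the gateway renders templates by plain string substitution before parsing the JSON (an assumption, not confirmed behavior): a missing variable leaves the literal {{history_messages}} placeholder behind, which is invalid JSON, while passing [] splices in a nested empty array rather than zero messages. A minimal sketch:

```python
import json

# Simplified version of the template from the post.
template = (
    '[{"role": "system", "content": "sys"},'
    '{{history_messages}},'
    '{"role": "user", "content": "hi"}]'
)

def render(tpl, variables):
    # Naive mustache-style substitution -- an assumption about how the
    # gateway renders templates, shown only to illustrate the failure mode.
    out = tpl
    for key, value in variables.items():
        replacement = value if isinstance(value, str) else json.dumps(value)
        out = out.replace("{{%s}}" % key, replacement)
    return out

# Empty list: renders as ...},[],{... -- valid JSON, but the "history"
# becomes a nested empty array instead of disappearing.
rendered = render(template, {"history_messages": []})
print(rendered)

# Missing variable: the literal {{history_messages}} stays in place,
# so the result is not parseable JSON at all.
try:
    json.loads(render(template, {}))
except json.JSONDecodeError:
    print("missing variable leaves invalid JSON")
```

If that is what is happening, neither omitting the variable nor passing [] can produce a well-formed message list, which would be consistent with the 500/502 responses you are seeing.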
7 comments
Is there any example of using the prompt library with tools and forced tool use with OpenAI?
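Not an official example, but in the OpenAI-compatible request format, forced tool use is expressed by setting tool_choice to a specific function. A minimal payload sketch (get_weather and its schema are made up for illustration; this shows the request shape only, not the prompt-library UI):

```python
# A function tool definition in the OpenAI tools format.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Weather in Paris?"}],
    "tools": tools,
    # Forcing the model to call get_weather instead of replying in text:
    "tool_choice": {"type": "function", "function": {"name": "get_weather"}},
}
```

In the prompt library you would presumably attach the same tools/tool_choice fields to the template's parameters, but I have not verified that end to end.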
2 comments
Hey, I cannot seem to get Portkey to work with LangChain for Google's models.
This is a sample of what I am using:
Plain Text
from langchain_openai import ChatOpenAI
from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL

PORTKEY_API_KEY = "..."
VIRTUAL_KEY = "..."  # Virtual Key I created

portkey_headers = createHeaders(api_key=PORTKEY_API_KEY, virtual_key=VIRTUAL_KEY)

llm = ChatOpenAI(api_key="x", base_url=PORTKEY_GATEWAY_URL, default_headers=portkey_headers, model="gemini-1.5-pro")

llm.invoke("What is the meaning of life, universe and everything?")


I have attached the traceback.
I am on the latest version of all the packages, as I ran pip install -U langchain-core portkey_ai langchain-openai before starting.
2 comments