Welcome to Portkey Forum

Updated 3 weeks ago

Smart Fallback with Model-Optimized Prom...

And do you know how easy this would be to implement with my llama-index RAG system? I want to pass the context from the retriever as a variable to the prompt.
I have this snippet to get the chat engine:
Plain Text
def get_chat_engine():
    system_prompt = SYSTEM_PROMPT

    index = get_index()

    if index is None:
        raise HTTPException(
            status_code=500,
            detail=(
                "StorageContext is empty - call 'poetry run generate' to generate the storage first"
            ),
        )

    return index.as_chat_engine(
        similarity_top_k=app_config.TOP_K,
        system_prompt=system_prompt,
        chat_mode="context",
        response_mode="tree_summarize",
        vector_store_query_mode=app_config.VECTOR_STORE_QUERY_MODE,
    )
My initialization detects the environment variable "provider mode": if it's set to "portkey", it initializes Portkey; otherwise it initializes llama-index with the standard OpenAI config.
So it would be nice to either set SYSTEM_PROMPT from a constant if Portkey is not in the env variables, or rely on Portkey's prompt if it is set in the envs.
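For what it's worth, the selection described above could be sketched roughly like this. This is only a hypothetical illustration: `SYSTEM_PROMPT`, `PROVIDER_MODE`, and `fetch_portkey_prompt()` are stand-in names, not anything from the actual codebase or SDK.

```python
import os

# Placeholder for the local constant used when Portkey is not configured.
SYSTEM_PROMPT = "You are a helpful assistant."

def fetch_portkey_prompt() -> str:
    # Stand-in for fetching the managed prompt from Portkey,
    # e.g. via portkey_client.prompts.render(...).
    return "portkey-managed prompt"

def resolve_system_prompt() -> str:
    """Use the Portkey-managed prompt when PROVIDER_MODE=portkey,
    otherwise fall back to the local constant."""
    if os.environ.get("PROVIDER_MODE") == "portkey":
        return fetch_portkey_prompt()
    return SYSTEM_PROMPT
```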
Just tested it and I get this error:
Plain Text
 File "/home/user/.cache/pypoetry/virtualenvs/impulsumgpt-TwrvyiC3-py3.11/lib/python3.11/site-packages/openai/_base_client.py", line 1584, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'status': 'failure', 'message': 'You cannot pass config with prompt id in /v1/prompts route'}
Here is the full trace:
My latest approach:


Plain Text
def get_prompt_template() -> str:
    portkey_client = Portkey(
        api_key=app_config.PORTKEY_API_KEY,
    )
    render_response = portkey_client.prompts.render(
        prompt_id=app_config.GATEWAY_PROMPT_ID,
        variables={"context": "context data below"},
        stream=False,
    )

    return render_response.data


It prints this error when I try to use the function. I pass in variables, but in reality my prompt doesn't have any variables, since I'm just fetching the prompt to assign it to SYSTEM_PROMPT for use downstream.
I tried without passing variables, but variables are required. If I give dummy variables, it gives this error. Even if I add the variable "context" to my prompt, it still gives this error:

Plain Text
line 176, in __init__
    self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 2 validation errors for PromptRender
data.messages.0.content
  Input should be a valid string [type=string_type, input_value=[{'type': 'text', 'text':...context data below\n "}], input_type=list]
    For further information visit https://errors.pydantic.dev/2.7/v/string_type
data.messages.1.content
  Input should be a valid string [type=string_type, input_value=[], input_type=list]
    For further information visit https://errors.pydantic.dev/2.7/v/string_type
Looking forward to seeing what you think @Vrushank | Portkey , maybe I can help fix this somehow? πŸ™‚
Hey @0xDrTuna yes this was a Pydantic validation error we were getting earlier, but afaik, it's fixed in the latest build. Just check once if you're on the latest Python package, and that may fix it. (cc @chandypaaji to take a look still)

As for not using variables and still being able to make prompts.render request, tagging @visarg who can share if this should be possible.

Either way, we'll get back to you with a solution. Sorry for the delay here!
Hey @0xDrTuna

The issue has been identified. The fix is in the types for prompts.render.

The change will be made live by today. After that, you can update the SDK and it should work fine.

Thank you for bringing this issue up.
Hey @0xDrTuna - We have pushed a fix for the pydantic errors in Python SDK v1.5.1. Please try and let us know if you face any issues.

Regarding variables, if you do not have any variables in your prompt, you can pass variables={} in the prompt render call.
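Putting that advice together with the earlier snippet, a minimal sketch of the render call with no variables might look like this. It assumes a Portkey-style client exposing `prompts.render(prompt_id=..., variables=...)` with a `.data` field on the response, as in the snippet above; the `render_prompt` wrapper is just an illustration.

```python
def render_prompt(client, prompt_id: str):
    """Fetch a managed prompt that has no template variables by
    passing an empty variables dict, as suggested above."""
    response = client.prompts.render(prompt_id=prompt_id, variables={})
    # response.data holds the rendered prompt payload.
    return response.data

# Usage sketch (assuming the portkey_ai package and a valid API key):
# from portkey_ai import Portkey
# client = Portkey(api_key=app_config.PORTKEY_API_KEY)
# rendered = render_prompt(client, app_config.GATEWAY_PROMPT_ID)
```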