LibreChat

Thanks so much. I tried it out and wasn't able to get it to work. Below is code within my librechat.yaml file. I tried out my creds using a curl and it worked. I even tried to hardcode the keys into the file, but it wouldn't get the Portkey endpoint to show up on my end. I may be getting the model name wrong, but am not sure. I am trying to integrate it with a virtual key that has access to AWS Bedrock.

Let me know if you have any suggestions on what else I should try.

custom:
  - name: "Portkey"
    baseURL: "${PORTKEY_GATEWAY_URL}"
    headers:
      x-portkey-api-key: "${PORTKEY_API_KEY}"
      x-portkey-virtual-key: "${PORTKEY_VIRTUAL_KEY}"
    models:
      default: ["anthropic.claude-v2:1"]
      fetch: true
    titleConvo: true
    titleModel: "current_model"
    summarize: false
    summaryModel: "current_model"
    forcePrompt: false
    modelDisplayLabel: "Portkey"
Attachment
image.png
34 comments
@Tim sorry for this! Let me check and get back quickly
@Tim I've updated our documentation now, and it should work.

Key things:
  1. The PORTKEY_BASE_URL in the .env file should be PORTKEY_GATEWAY_URL
  2. In librechat.yaml, you need to pass the apiKey param with a dummy value.
With these 2 changes, it should start working!

Updated docs: https://docs.portkey.ai/docs/integrations/libraries/librechat
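For reference, a minimal sketch of those two changes (placeholder values; swap in your own keys):

Plain Text
# .env: define the gateway URL under this name
PORTKEY_GATEWAY_URL=https://api.portkey.ai/v1
PORTKEY_API_KEY=<your Portkey API key>
PORTKEY_VIRTUAL_KEY=<your virtual key>

# librechat.yaml: apiKey must be present, a dummy value is fine
custom:
  - name: "Portkey"
    apiKey: "dummy"
    baseURL: "${PORTKEY_GATEWAY_URL}"
    headers:
      x-portkey-api-key: "${PORTKEY_API_KEY}"
      x-portkey-virtual-key: "${PORTKEY_VIRTUAL_KEY}"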
Thanks so much @Vrushank | Portkey ! It's working better now. It works with Meta's Llama models from AWS Bedrock, but it gives errors when using Anthropic Claude models. I got this error, which may be familiar to you:

LibreChat | 2024-10-04 14:39:39 error: [handleAbortError] AI response error; aborting request: 400 bedrock error: Malformed input request: #: subject must not be valid against schema {"required":["messages"]}#: required key [max_tokens] not found#: extraneous key [metadata] is not permitted, please reformat your input and try again.

I'll try to continue debugging as well. Feel free to let me know if you have any suggestions for me to test.
Attachment
image.png
@Tim max_tokens is a required parameter for Anthropic models. You can use a Portkey config to always pass it, like this: https://docs.portkey.ai/docs/api-reference/config-object#passing-model-and-hyperparameters-with-override-option
Optionally, I think there should also be a way to do the same from inside LibreChat's settings.
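For example, a rough sketch of such a config (the override_params shape follows the linked docs; the specific max_tokens value and the virtual_key name are just placeholders):

Plain Text
{
  "virtual_key": "your-bedrock-virtual-key",
  "override_params": {
    "max_tokens": 1024
  }
}

You could save this as a config in the Portkey app and attach it to your requests (for example via the x-portkey-config header), so max_tokens is always sent for the Claude models.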
Thank you @sega, I'll try that out. How would you advise configuring the app so that whenever a user logs into LibreChat, their token usage is accounted for individually?

Currently when different users log into LibreChat, their usage gets reported as a single unnamed user in the Portkey admin dashboard.

LibreChat references one virtual key. Do I have to create individual virtual keys for each user?
Attachment
image.png
Do you have a centralized server running librechat through which users access their accounts?
I don't think creating separate virtual keys for each user would help, since you wouldn't be able to reference them in the librechat.yaml file
you'd need to figure out a way to send the metadata header with each request
something like
Plain Text
  custom:
    - name: "Portkey"
      apiKey: 'DUMMY'
      baseURL: '${PORTKEY_GATEWAY_URL}'
      headers:
        x-portkey-api-key: '${PORTKEY_API_KEY}'
        x-portkey-virtual-key: 'openai-asd-3dsaeb'
        x-portkey-metadata: '{"_user": "Tim"}'
...
Sorry for the late reply. Thanks I'll look into this
Just chiming in to say, @Tim - I've seen some customers use this to communicate user IDs back to Portkey
Will investigate and write detailed docs for how you can do this / if it's possible!
We'd love to figure out how to make this happen in any case. This would be a very crucial feature of Portkey x LibreChat integration
Much appreciated, @Vrushank | Portkey . I finally got it to work last night. I had to edit how LibreChat passed the headers to Portkey, as well as the Docker Compose YAML.

I’ll post the code either today or tomorrow.
That is great @Tim , would love to see the implementation!
Made some rough notes: https://github.com/timmanik/librechat-for-portkey. Let me know if you all have any questions.
damn, this is so cool!
@Tim I'll link this repo in the docs, people will find it useful
I just submitted a PR to LibreChat. Hopefully it goes through!

https://github.com/danny-avila/LibreChat/pull/4488
Using the same configuration as mentioned in both the LibreChat and Portkey docs, we're unable to see Portkey as a provider in LibreChat.

The Librechat-API container logs show:
Plain Text
2024-11-13 21:08:53 info: Custom config file loaded:
2024-11-13 21:08:53 info: {
  "version": "1.1.7",
  "cache": true,
  "endpoints": {
    "custom": [
      {
        "name": "Portkey",
        "apikey": "dummy",
        "baseURL": "https://api.portkey.ai/v1",
        "headers": {
          "x-portkey-api-key": "....",
          "x-portkey-virtual-key": "...."
        },
        "models": {
          "default": [
            "gemini-1.5-pro"
          ],
          "fetch": true
        },
        "titleConvo": true,
        "titleModel": "current_model",
        "summarize": false,
        "summaryModel": "current_model",
        "forcePrompt": false,
        "modelDisplayLabel": "Portkey:OpenAI"
      }
    ]
  }
}


But LibreChat shows:
Attachment
image.png
The error log within the Librechat-API container is also empty.
your config looks correct
are you unable to add only Portkey, or can you not add any other provider at all?
I'd need more details if you need help
>are you not able to add portkey only or any other provider at all?
I haven't authenticated any other provider, as I wanted to use providers through Portkey. The issue is that Portkey isn't showing up at all.
>I'd need more details if you need help
Sure, let me know what details you need. Thanks!
Here are the deployment files:
And I'm running it with:
Plain Text
sudo /usr/local/bin/docker-compose -f ./deploy-compose.yml -f ./deploy-compose.override.yml up -d
And here's the .env file with secrets removed.
I think you should try setting up another provider just to make sure your setup isn't the problem. The setup described in the steps is correct; I'm able to get it to work.
The process of adding Portkey as a provider is no different from adding any other provider in the YAML.
Please fix the documentation on your end as well, as this page has the incorrect case for apiKey: https://portkey.ai/docs/integrations/libraries/librechat#librechat