Welcome to Portkey Forum

Updated 10 months ago

Hi @Vrushank | Portkey
There is an issue with setting the output length (max tokens) for models; it behaves very unreliably. Sometimes it says the max output length is 2K, sometimes 8K, for the same model. Could you have a look?

E.g., this picture is for the Together AI Llama-70B model. On the Together website the max limit is 8K, but here it is 32K.

Also, when I change the model to, say, the Llama-7B model, it still stays at 32K.
Attachment: image.png
4 comments
Thanks for pointing this out! Confirming that the max output length should indeed differ per model.

cc @Sabbyasachi who can take a look
Checking the inconsistency
The issue was that even though the foundation model was changed, and the newer model has a lower max-token limit than the previously selected value, that value was never reset.
I have pushed changes to address this; they will be reflected in the new app version.
Please update here if there is any concern.
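For readers following along, here is a minimal sketch of the behaviour the fix describes: when the selected foundation model changes, the previously chosen max-token value is clamped to the new model's limit instead of carrying over. The model IDs, limits, and function names below are illustrative assumptions, not Portkey's actual implementation.

```ts
// Minimal sketch only; model IDs, limits, and names are assumptions.
// Hypothetical per-model max output token limits (example values).
const MAX_OUTPUT_TOKENS: Record<string, number> = {
  "together/llama-2-70b-chat": 8192,
  "together/llama-2-7b-chat": 4096,
};

interface PlaygroundState {
  model: string;
  maxTokens: number; // value currently shown on the output-length control
}

// When the foundation model changes, clamp the previously selected
// max-token value so it never exceeds the new model's limit.
function onModelChange(state: PlaygroundState, newModel: string): PlaygroundState {
  const limit = MAX_OUTPUT_TOKENS[newModel] ?? 2048; // assumed fallback
  return {
    model: newModel,
    maxTokens: Math.min(state.maxTokens, limit),
  };
}

// Example: switching from the 70B model (8K limit) to the 7B model (4K limit)
// drops a stale 8192 setting to 4096 instead of leaving it unchanged.
const before: PlaygroundState = { model: "together/llama-2-70b-chat", maxTokens: 8192 };
const after = onModelChange(before, "together/llama-2-7b-chat");
console.log(after.maxTokens); // 4096
```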