Hi @Vrushank | Portkey There is an issue with setting the maximum output length for models. It behaves very unreliably: sometimes it reports a max output length of 2K, sometimes 8K, for the same model. Could you have a look?
E.g., this picture is for the Together AI Llama-70 model. On the Together website the max limit is 8K, but here it shows 32K.
Also, when I change the model to, say, the Llama-7b model, it still stays at 32K.
The issue was that even when the foundation model was changed and the newer model had a smaller max token limit than the previously selected value, the value was never reset. I have pushed changes to address this issue; it will reflect in the new app version. Please update here if there is any concern.
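A minimal sketch of the kind of fix described above: clamping the selected max-token value to the new model's limit whenever the model changes. The model names, limits, and function names here are illustrative assumptions, not the actual Portkey implementation.

```python
# Hypothetical per-model token limits; real values would come from
# the provider's model catalog, not a hardcoded table.
MODEL_MAX_TOKENS = {
    "llama-70b": 8192,
    "llama-7b": 4096,
}

def on_model_change(new_model: str, current_max_tokens: int) -> int:
    """Return the max-token setting after switching models.

    If the previously selected value exceeds the new model's limit,
    it is clamped down; otherwise it is kept as-is.
    """
    limit = MODEL_MAX_TOKENS.get(new_model, 2048)  # assumed safe default
    return min(current_max_tokens, limit)

# Switching from a model with an 8K limit to one with a 4K limit
# resets the stale 8192 selection down to 4096.
print(on_model_change("llama-7b", 8192))
```

The key design point is that the clamp runs on every model change, so a stale value from a previously selected model can never survive the switch.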