I found a very weird bug while trying out the Google Gemini models.
In my current prompts (GPT-specific), I have a system prompt followed by user prompts. But when I changed the provider, the system prompt just disappeared (maybe because Gemini doesn't support system prompts). I think we should not delete the system prompt.
Thanks for sharing. @visarg @Sabbyasachi thoughts? We could change the System message to a User message, and then if the user switches back, keep it as a user message? Not sure.
@Siddharth Bulia generally, we do not apply any explicit transformations to the message body beyond what the provider already supports. In this case, since the Gemini API itself does not have a system message, we omit it as an extension of that philosophy.
I actually wanted to compare the responses of different LLMs, so for me keeping the messages the same is very important. Not sure what the best UI for this is.
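For reference, here is a minimal sketch of the System → User fallback being discussed, assuming OpenAI-style message dicts with `role`/`content` fields. The function name and message shape are illustrative only, not the actual implementation:

```python
def adapt_messages(messages, provider_supports_system=True):
    """Return a copy of `messages` adjusted for the target provider.

    If the provider has no system role, the system prompt is prepended
    to the first user message instead of being dropped, so switching
    providers (e.g. GPT -> Gemini) keeps the prompt content intact.
    """
    if provider_supports_system:
        return list(messages)

    # Collect system content and keep the remaining messages in order.
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    remaining = [m for m in messages if m["role"] != "system"]

    if not system_parts:
        return remaining

    system_text = "\n\n".join(system_parts)
    adapted, merged = [], False
    for m in remaining:
        if not merged and m["role"] == "user":
            # Prepend the system prompt to the first user turn.
            adapted.append({"role": "user",
                            "content": f"{system_text}\n\n{m['content']}"})
            merged = True
        else:
            adapted.append(m)

    if not merged:
        # No user message present: send the system prompt as its own user turn.
        adapted.insert(0, {"role": "user", "content": system_text})

    return adapted


# Example: the same conversation sent to a provider without a system role.
messages = [
    {"role": "system", "content": "You are a terse assistant."},
    {"role": "user", "content": "Summarize this article."},
]
gemini_messages = adapt_messages(messages, provider_supports_system=False)
# -> [{"role": "user",
#      "content": "You are a terse assistant.\n\nSummarize this article."}]
```

Merging rather than omitting would keep the prompt content identical across providers, which matters for the comparison use case above; whether that transformation should be automatic or opt-in is the open question.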