Welcome to Portkey Forum

Updated 11 months ago

Controlling the likelihood of tokens in generated responses

You can pass a map where each key is a token ID and the value is a bias that controls the likelihood of that token appearing in the generated response. Example:
Plain Text
{
    19045: -10,
    58234: 10
}


For example, here 19045 is the token ID for "good" and 58234 is the token ID for "better". The above logit_bias reduces the chance of the model generating the word "good" in the completion, since its bias is -10, and increases the chance of it generating "better", since its bias is +10.
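As a concrete sketch of how this map is used, here is what an OpenAI-style chat completion request body could look like with the example biases above. The model name and prompt are illustrative, and note that when the request is serialized as JSON, the token IDs become string keys:

```python
# Sketch: a chat completion request body using the logit_bias map from
# the example above (19045 -> "good", 58234 -> "better").
# Bias values range from -100 (effectively ban) to 100 (effectively force).
payload = {
    "model": "gpt-3.5-turbo",          # illustrative model name
    "messages": [
        {"role": "user", "content": "Was the movie good?"},
    ],
    # JSON object keys must be strings, so the token IDs are quoted here.
    "logit_bias": {
        "19045": -10,   # discourage "good"
        "58234": 10,    # encourage "better"
    },
}
```

This payload would then be POSTed to the chat completions endpoint (or passed as keyword arguments to an SDK client).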

Here is a short article that explains it well: https://help.openai.com/en/articles/5247780-using-logit-bias-to-define-token-probability

You can use this tokenizer to generate token IDs for words (for OpenAI models): https://platform.openai.com/tokenizer
18 comments
Hello @visarg, what would have more priority, logit bias or instructions? Basically the issue is: in our prompt instructions we have told the LLM not to use certain words, but it still uses them. Would you recommend that this problem can be solved using logit bias?
yes, logit_bias is the way to solve this
Rohit, can you give an example? Let's say I don't want words like keen, synergy etc. How can I add these to logit bias?
Each of these words is two token IDs, so I'm not sure how to pass them to the logit_bias parameter
synergy and keen are single tokens; you can then pass the token for nergy as well, which should take care of most cases
'nergy' I think would remove any reference to energy too, right?
Also, if I have to remove 'insights', should I remove 'insight'? Would that remove 'insights' too?
yes, it would..
no other way to suppress this. You could write it in the prompt, but there are no guarantees
Yeah, it's been there in the prompt but it still fails 5/10 times
could be because as the length of the prompt increases, result quality decreases
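To tie the thread together, a small sketch of banning multi-token words: give every sub-token of each banned word a bias of -100, which the OpenAI docs describe as effectively banning the token. The token IDs below are hypothetical placeholders; look up the real IDs for your model with the OpenAI tokenizer linked above.

```python
# Sketch: build a logit_bias map that effectively bans a list of words,
# covering every sub-token of multi-token words.
# The token IDs here are HYPOTHETICAL placeholders, not real IDs.
def build_ban_map(token_ids_per_word):
    """Map every sub-token of every banned word to a bias of -100."""
    ban_map = {}
    for word, token_ids in token_ids_per_word.items():
        for tid in token_ids:
            ban_map[tid] = -100  # -100 effectively bans the token
    return ban_map

# Hypothetical tokenizations: "keen" as one token, "synergy" as two.
banned = build_ban_map({
    "keen": [27989],
    "synergy": [93140, 12345],
})
```

As noted above in the thread, banning a shared sub-token (like 'nergy') also suppresses unrelated words that contain it (like 'energy'), so this is a blunt instrument.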
Is this the correct way:

Plain Text
{
    27989: -100,
    93140: -100
}
were you able to try this out?
Not that great a solution
hmm, will think about it some more