Description
What happened?
I'm having trouble accessing logprobs in my chat completion response using Qwen Coder 2.5. The logprobs are being returned from the server successfully, because I can access them under the original_response attribute.
acompletion(
    custom_llm_provider="text-completion-openai",
    model="qwen2p5-coder-7b",
    api_base=YOUR_API_BASE,
    messages=[{"content": "hello", "role": "user"}],
    stream=False,
    logprobs=1,
)
Does not work (logprobs does not exist):
suggestion.choices[0].logprobs.token_logprobs[0]
Does work:
suggestion._hidden_params['original_response']['choices'][0]['logprobs'].token_logprobs[0]
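Until the parsed field is populated, one possible workaround is a small helper that prefers the top-level accessor and falls back to the raw provider payload. This is a minimal sketch: the fallback's dict shape is an assumption based on the accessors shown above, not a guaranteed LiteLLM schema.

```python
def first_token_logprob(suggestion):
    """Return the logprob of the first token from a completion response.

    Prefers the parsed `choices[0].logprobs` field; if it is missing
    (as in this report), falls back to the unparsed provider response
    kept in `_hidden_params["original_response"]`.
    """
    logprobs = suggestion.choices[0].logprobs
    if logprobs is not None:
        return logprobs.token_logprobs[0]
    # Assumed shape of the raw payload; adjust to what your provider returns.
    raw = suggestion._hidden_params["original_response"]
    return raw["choices"][0]["logprobs"]["token_logprobs"][0]
```

This keeps call sites unchanged if the upstream bug is fixed, since the parsed field is tried first.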
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.55.9
Twitter / LinkedIn details
No response