Description
What happened?
We host Qwen3-Omni via vLLM. When the /audio/transcriptions endpoint is called through LiteLLM, the request goes through and the model produces the transcription; the vLLM log shows `"POST /audio/transcriptions HTTP/1.0" 200 OK`.
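Roughly how we call the endpoint (a minimal sketch using the OpenAI SDK pointed at the LiteLLM proxy; the proxy URL, API key, model alias, and file name below are placeholders, not our actual values):

```python
from openai import OpenAI

# Point the OpenAI client at the LiteLLM proxy (placeholder URL and key).
client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-1234")

# Send an audio file to the transcription endpoint; "qwen3-omni" stands in
# for whatever alias the proxy config maps to the vLLM deployment.
with open("sample.wav", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="qwen3-omni",
        file=audio_file,
    )

print(transcript.text)
```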
Something then fails inside LiteLLM after the POST succeeds, and the client gets back:
`500: litellm.BadRequestError: Hosted_vllmException - The model does not support Transcriptions API`
This is very frustrating, especially when you can see the model completing the transcription.
"mode": "audio_transcription" is also set in model_info.
Relevant log output
Are you an ML Ops team?
No
What LiteLLM version are you on?
v1.80.0
Twitter / LinkedIn details
No response