Description
What happened?
I'm using litellm_proxy with a custom base_url. The request looks like this:
```python
response = litellm.completion(
    api_key=os.environ.get("LITELLM_API_KEY"),
    base_url=MY_URL,
    model='litellm_proxy/xxx/claude-3-7-sonnet-20250219',
    messages=xxx,
    temperature=0.0,
    thinking={
        "type": "enabled",
        "budget_tokens": 200
    }
)
```

I got an UnsupportedParamsError:
```
litellm.exceptions.UnsupportedParamsError: litellm.UnsupportedParamsError: litellm_proxy does not support parameters: {'thinking': {'type': 'enabled', 'budget_tokens': 200}}, for model=xxx/claude-3-7-sonnet-20250219. To drop these, set `litellm.drop_params=True` or for proxy:
`litellm_settings:
   drop_params: true`
```
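As the error message itself suggests, one workaround is to drop unsupported parameters instead of erroring: set `litellm.drop_params = True` in the SDK, or the equivalent setting in the proxy config. A minimal sketch of the proxy-side config (assuming a standard `config.yaml`):

```yaml
# config.yaml for the LiteLLM proxy (sketch; file name/placement is an assumption)
litellm_settings:
  drop_params: true  # silently drop params the route does not recognize, e.g. `thinking`
```

Note that this only suppresses the error: the `thinking` parameter is dropped rather than forwarded, so extended thinking would not actually be enabled for the Claude model behind the proxy.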
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.61.20
Twitter / LinkedIn details
No response