
[Feature]: Invoke Claude Thinking when using OpenAI API #9022

@mwl38765

Description

The Feature

When invoking a reasoning model through the OpenAI API, the client supplies a "reasoning_effort" parameter. Claude 3.7 Sonnet, however, expects a "thinking" parameter instead.

I understand that it's possible to configure LiteLLM to enable thinking on every prompt via the model's litellm_params.
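
For reference, here is a minimal sketch of that workaround using the LiteLLM Python SDK; the model name and budget_tokens value are assumptions for illustration, and a proxy user would place the same "thinking" value under litellm_params in config.yaml instead:

```python
import litellm

# Workaround today: pass an Anthropic-style "thinking" block on every request,
# whether or not the caller actually asked for extended reasoning.
# The model name and budget_tokens value below are illustrative assumptions.
response = litellm.completion(
    model="anthropic/claude-3-7-sonnet-20250219",
    messages=[{"role": "user", "content": "Plan a three-step rollout for this feature."}],
    thinking={"type": "enabled", "budget_tokens": 1024},
)
print(response.choices[0].message.content)
```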

But it would be helpful if the OpenAI "reasoning_effort" parameter were translated into a "thinking" value before the request is sent to a service hosting Claude 3.7 Sonnet, along the lines of the sketch below.
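
A rough sketch of the kind of translation I have in mind; the effort-to-budget mapping and function name are purely illustrative assumptions, not an actual LiteLLM implementation:

```python
# Hypothetical mapping from OpenAI "reasoning_effort" levels to Anthropic
# "thinking" blocks. The budget_tokens values are assumptions for illustration.
EFFORT_TO_THINKING = {
    "low": {"type": "enabled", "budget_tokens": 1024},
    "medium": {"type": "enabled", "budget_tokens": 2048},
    "high": {"type": "enabled", "budget_tokens": 4096},
}

def translate_params(openai_params: dict) -> dict:
    """Replace reasoning_effort with an equivalent thinking block for Claude 3.7 Sonnet."""
    params = dict(openai_params)
    effort = params.pop("reasoning_effort", None)
    if effort is not None:
        params["thinking"] = EFFORT_TO_THINKING.get(
            effort, {"type": "enabled", "budget_tokens": 2048}
        )
    return params

# Example: a request arriving with reasoning_effort="high" would be forwarded to the
# Anthropic-hosted model with thinking={"type": "enabled", "budget_tokens": 4096}.
```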

Motivation, pitch

Having this feature would give those accessing Claude Sonnet through LiteLLM's OpenAI-compatible API the flexibility to use the thinking feature on demand. The current workaround of enabling thinking for every prompt adds extra cost whenever thinking is not needed.

Are you an ML Ops Team?

No

Twitter / LinkedIn details

No response
