[Feature]: Support logprobs for Vertex AI gemini Models #9091

@TravisGibbs

Description

The Feature

Vertex AI supports `response_logprobs` and `logprobs` as parameters when generating text via its API. I confirmed this works for the Gemini 2.0 Flash model using the `vertexai` library. LiteLLM does not currently support `logprobs` (an integer setting for how many top logprobs to return, i.e. the equivalent of OpenAI's `top_logprobs`) when mapping OpenAI parameters over to Vertex AI. I made a change locally to allow and map the parameter in `VertexGeminiConfig` and got correct results (all of the logic to transform these in the response exists already). I would be happy to create a PR if that would be welcomed.
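For context, here is a minimal sketch of the mapping I have in mind. The function name and signature below follow LiteLLM's general `map_openai_params` config pattern but are illustrative, not the exact upstream API:

```python
# Illustrative sketch of the proposed mapping in VertexGeminiConfig.
# The signature here is an assumption based on LiteLLM's usual config
# pattern, not the exact upstream method.
def map_openai_params(non_default_params: dict, optional_params: dict) -> dict:
    for param, value in non_default_params.items():
        if param == "logprobs":
            # OpenAI's boolean `logprobs` -> Vertex AI's `response_logprobs`
            optional_params["response_logprobs"] = value
        elif param == "top_logprobs":
            # OpenAI's integer `top_logprobs` -> Vertex AI's integer `logprobs`
            optional_params["logprobs"] = value
    return optional_params
```

With that in place, callers could request logprobs through the usual OpenAI-style parameters and let LiteLLM translate them for Vertex AI (hypothetical usage; model name is illustrative):

```python
import litellm

# Hypothetical call once the mapping lands: OpenAI-style logprobs params
# are translated to Vertex AI's response_logprobs / logprobs.
response = litellm.completion(
    model="vertex_ai/gemini-2.0-flash",
    messages=[{"role": "user", "content": "Hello"}],
    logprobs=True,
    top_logprobs=5,
)
print(response.choices[0].logprobs)
```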

Motivation, pitch

I am working on a tool at Gem that currently uses the logprobs from OpenAI models to validate LLM outputs. Being able to do the same with Gemini models would let us load-balance across, or swap over to, Gemini using LiteLLM.

Are you a ML Ops Team?

No

Twitter / LinkedIn details

https://www.linkedin.com/in/-travis-gibbs-/
