[FEATURE_REQUEST] llama.cpp Text Completion top_n_sigma #3960

@Beinsezii

Description

Have you searched for similar requests?

No

Is your feature request related to a problem? If so, please describe.

The llama.cpp server supports top_n_sigma as of today, since PR ggml-org/llama.cpp#13264 was merged, so it needs to be enabled in ST.

tl;dr:

New default sampler ordering:
[ "penalties", "dry", "top_n_sigma", "top_k", "typ_p", "top_p", "min_p", "xtc", "temperature" ]

llama.cpp short-circuits (disables the sampler) when top_n_sigma < 0.0f, so I suppose the UI's "off" value needs to be -1:
https://github.com/ggml-org/llama.cpp/blob/15a28ec8c705b188ebe178170966d1dcc36fe151/src/llama-sampling.cpp#L1753
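For intuition, the filter keeps only tokens whose logit falls within n standard deviations of the maximum logit, and a negative n disables it entirely (the short-circuit linked above). A minimal Python sketch of that behavior, assuming the llama.cpp semantics described in the PR (names here are illustrative, not the actual API):

```python
import math

def top_n_sigma(logits, n_sigma):
    """Keep tokens whose logit is within n_sigma standard deviations
    of the maximum logit; n_sigma < 0 disables the filter entirely
    (mirroring the < 0.0f short-circuit in llama-sampling.cpp)."""
    if n_sigma < 0.0:
        return list(logits)
    mx = max(logits)
    mean = sum(logits) / len(logits)
    std = math.sqrt(sum((x - mean) ** 2 for x in logits) / len(logits))
    # Tokens below (max_logit - n_sigma * std) are masked out.
    return [x for x in logits if x >= mx - n_sigma * std]
```

With a widely spread distribution and n_sigma = 1.0, only logits near the maximum survive, while n_sigma = -1.0 passes everything through unchanged.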

Describe the solution you'd like

I think it's slightly beyond a quick copy-paste job, because it needs new entries in the sampler-ordering UI and related settings.
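For reference, once wired up, a llama.cpp server completion request could carry the parameter roughly like this (a sketch: the parameter name follows the merged PR, the samplers array follows the new default ordering above, and -1 would be the "off" value per the short-circuit check):

```json
{
  "prompt": "...",
  "top_n_sigma": 1.0,
  "samplers": ["penalties", "dry", "top_n_sigma", "top_k", "typ_p", "top_p", "min_p", "xtc", "temperature"]
}
```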

Describe alternatives you've considered

No response

Additional context

No response

Priority

Low (Nice-to-have)

Are you willing to test this on staging/unstable branch if this is implemented?

None

Metadata

Assignees

No one assigned

    Labels

    ✅ Done (staging): [ISSUE][🎯 Auto-applied] The issue/feature is fixed or done and integrated on staging
    🦄 Feature Request: [ISSUE] Suggestion for new feature, update or change
