
Conversation

@taronaeo (Collaborator) commented Nov 9, 2025

ref: #17088

This PR introduces the ability to set sampler parameters from GGUF KV metadata, allowing model creators to embed recommended sampler settings that apply unless explicitly overridden with CLI flags.

Handy for users who do not want to tinker with the settings but still want the recommended settings applied.

Priority of Sampler Parameters

  1. User flags (e.g., setting --temp 0.6)
  2. Model-embedded recommendation (e.g., general.sampler.temp = 0.6)
  3. Default hardcoded values in common_params_sampling
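The resolution order above can be sketched as a simple merge. This is a hypothetical Python helper for illustration only; the actual logic lives in llama.cpp's C++ common code, and the names and default values here are stand-ins, not the merged implementation:

```python
# Sketch of the sampler-parameter priority: CLI flags > GGUF metadata > defaults.
# All names and default values here are illustrative assumptions.

DEFAULTS = {"temp": 0.8, "top_k": 40, "top_p": 0.95}  # stand-ins for common_params_sampling defaults

def resolve_sampler_params(cli_flags: dict, gguf_kv: dict) -> dict:
    """Merge sampler settings with priority: CLI flags > GGUF KV metadata > defaults."""
    params = dict(DEFAULTS)
    prefix = "general.sampler."
    # Model-embedded recommendations override the hardcoded defaults.
    for key, value in gguf_kv.items():
        if key.startswith(prefix):
            params[key[len(prefix):]] = value
    # Explicit user flags win over everything.
    params.update(cli_flags)
    return params

# A model recommends temp=0.6; the user passes --temp 0.9 on the CLI.
print(resolve_sampler_params({"temp": 0.9}, {"general.sampler.temp": 0.6}))
# → {'temp': 0.9, 'top_k': 40, 'top_p': 0.95}
```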

Introduced Metadata

  • general.sampler.sequence
  • general.sampler.top_k
  • general.sampler.top_p
  • general.sampler.min_p
  • general.sampler.xtc_probability
  • general.sampler.xtc_threshold
  • general.sampler.temp
  • general.sampler.penalty_last_n
  • general.sampler.penalty_repeat
  • general.sampler.mirostat
  • general.sampler.mirostat_tau
  • general.sampler.mirostat_eta

Please let me know if we should introduce more sampling parameters.

Embedding From Safetensors into GGUF

By default, the conversion script will attempt to find generation_config.json within the model directory and automatically add the recommended sampler parameters to the GGUF metadata. If a sampling parameter is not available in that file, users can also specify --metadata metadata.json.

Note that --metadata metadata.json takes precedence over generation_config.json and will overwrite metadata if duplicate keys are found.
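That precedence rule amounts to a dictionary merge where the --metadata entries are applied last. A minimal sketch (the helper name is made up; the real merge happens inside convert_hf_to_gguf.py):

```python
# Illustrative sketch of the precedence rule: --metadata entries overwrite
# duplicate general.sampler.* keys coming from generation_config.json.
# Hypothetical helper name, not the actual converter code.

def merge_sampler_metadata(from_generation_config: dict, from_metadata_flag: dict) -> dict:
    merged = dict(from_generation_config)
    merged.update(from_metadata_flag)  # --metadata wins on duplicate keys
    return merged

merged = merge_sampler_metadata(
    {"general.sampler.temp": 0.7, "general.sampler.top_p": 0.95},
    {"general.sampler.temp": 0.6},  # from --metadata metadata.json
)
print(merged["general.sampler.temp"])  # → 0.6
```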

$ cat > metadata.json << EOF 
{
    "general.sampler.temp": 0.6
}
EOF

$ python3 convert_hf_to_gguf.py --outfile deepseek-r1-distill-qwen-1.5b.gguf --metadata metadata.json deepseek-r1-distill-qwen-1.5b/

$ ./build/bin/llama-cli -m deepseek-r1-distill-qwen-1.5b.gguf -p "Write me a dog walking business idea 1. " -no-cnv -n 1 -t 10 2>&1 | grep "temp"    
llama_model_loader: - kv   2:                       general.sampler.temp f32              = 0.600000
llama_model_loader: - kv  27:                    tokenizer.chat_template str              = {% if not add_generation_prompt is de...
        top_k = 40, top_p = 0.950, min_p = 0.050, xtc_probability = 0.000, xtc_threshold = 0.100, typical_p = 1.000, top_n_sigma = -1.000, temp = 0.600
sampler chain: logits -> logit-bias -> penalties -> dry -> top-n-sigma -> top-k -> typical -> top-p -> min-p -> xtc -> temp-ext -> dist

Signed-off-by: Aaron Teo <[email protected]>
@github-actions github-actions bot added the python python script changes label Nov 9, 2025
@CISC (Collaborator) commented Nov 9, 2025

$ cat > metadata.json << EOF 
{
    "general.sampler.temp": 0.6
}
EOF

So, you're suggesting that parameters should be added manually before conversion? How likely is that to happen?

AFAIK most models come with recommended (though, some are likely to just be copy-pasted from somewhere) settings in generation_config.json, so perhaps a better idea to get them from there?

Edit: or is that automatically added to metadata?

@taronaeo (Collaborator, Author) commented Nov 9, 2025

$ cat > metadata.json << EOF 
{
    "general.sampler.temp": 0.6
}
EOF

So, you're suggesting that parameters should be added manually before conversion? How likely is that to happen?

AFAIK most models come with recommended (though, some are likely to just be copy-pasted from somewhere) settings in generation_config.json, so perhaps a better idea to get them from there?

You're right, I didn't spot that. I'll rework the code so that it pulls generation_config.json from the model directory and maps it to general.sampler.*, and then we can skip the --metadata flag.
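The rework described above could look roughly like this. The general.sampler.* key names follow this PR, but the exact mapping from Hugging Face generation_config.json fields is an assumption for illustration, not the merged implementation:

```python
import json
from pathlib import Path

# Hypothetical mapping from Hugging Face generation_config.json fields to the
# proposed general.sampler.* GGUF keys. The GGUF key names follow this PR;
# the HF field names and this helper are illustrative assumptions.
HF_TO_GGUF = {
    "temperature": "general.sampler.temp",
    "top_k": "general.sampler.top_k",
    "top_p": "general.sampler.top_p",
    "min_p": "general.sampler.min_p",
    "repetition_penalty": "general.sampler.penalty_repeat",
}

def sampler_kv_from_generation_config(model_dir: str) -> dict:
    """Pull recommended sampler settings from <model_dir>/generation_config.json."""
    path = Path(model_dir) / "generation_config.json"
    if not path.is_file():
        return {}
    cfg = json.loads(path.read_text())
    return {gguf_key: cfg[hf_key] for hf_key, gguf_key in HF_TO_GGUF.items() if hf_key in cfg}
```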

@Green-Sky (Collaborator) commented:

I think the sampling sequence is important too. Also, I personally only really tend to use min-p and XTC (not in your proposal).

@taronaeo (Collaborator, Author) commented Nov 9, 2025

@Green-Sky Will include general.sampler.xtc_probability and general.sampler.xtc_threshold first, then --samplers SEQUENCE.

@CISC RE generation_config.json vs. the custom --metadata file, I've realised that generation_config.json does not actually document (non-standard) support for parameters such as mirostat. In this case, we'll still need support for --metadata metadata.json to cover these parameters, unless there is a better way of handling this.

@CISC (Collaborator) commented Nov 9, 2025

@CISC RE generation_config.json vs. the custom --metadata file, I've realised that generation_config.json does not actually document (non-standard) support for parameters such as mirostat. In this case, we'll still need support for --metadata metadata.json to cover these parameters, unless there is a better way of handling this.

Does transformers even have this parameter?

@taronaeo (Collaborator, Author) commented Nov 9, 2025

@CISC RE generation_config.json vs. the custom --metadata file, I've realised that generation_config.json does not actually document (non-standard) support for parameters such as mirostat. In this case, we'll still need support for --metadata metadata.json to cover these parameters, unless there is a better way of handling this.

Does transformers even have this parameter?

Doesn't look like it. Followed some of Ollama's supported parameters: https://ollama.readthedocs.io/en/modelfile/#parameter
