Hi,

As per the title: the Transformers backend supports passing `think_end_token` as an `int` (a token id):
https://github.com/EleutherAI/lm-evaluation-harness/blob/29a0765a4d13ff88797743347aa57b230b27bd98/lm_eval/models/huggingface.py#L105

But the vLLM backend does not support this; it only handles a `str`:
https://github.com/EleutherAI/lm-evaluation-harness/blob/29a0765a4d13ff88797743347aa57b230b27bd98/lm_eval/models/vllm_causallms.py#L148

Thanks!
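For illustration, one way the vLLM backend could gain parity is to normalize the value up front, decoding an `int` token id to its string form via the tokenizer. This is only a hedged sketch, not the harness's actual code; the helper name `normalize_think_end_token` is hypothetical:

```python
def normalize_think_end_token(think_end_token, tokenizer):
    """Return think_end_token as a string.

    Hypothetical helper: if an int token id is given, decode it with the
    tokenizer (as the Transformers backend effectively allows); strings
    pass through unchanged.
    """
    if isinstance(think_end_token, int):
        # Decode a single token id back to its surface string.
        return tokenizer.decode([think_end_token])
    return think_end_token
```

With such a normalization step, the rest of the vLLM code path could keep assuming a `str`, matching the Transformers backend's behavior.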