When passing an existing model to HFLM, it always emits the warning `WARNING Failed to get model SHA for <model_name> ...`.
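A minimal reproduction sketch, assuming a model object loaded with `transformers` is handed to HFLM directly (the `gpt2` repo id is just illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from lm_eval.models.huggingface import HFLM

# Load the model ourselves instead of letting HFLM load it from a repo id.
model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# self.pretrained is now a PreTrainedModel, not a string, so the later
# get_model_sha() call fails and logs the warning on every run.
lm = HFLM(pretrained=model, tokenizer=tokenizer)
```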
The warning comes from `lm-evaluation-harness/lm_eval/models/huggingface.py`, lines 1555 to 1570 at `7ddb2b1`:

```python
def get_model_sha(pretrained: str, revision: str) -> str:
    try:
        model_info = HfApi().model_info(repo_id=pretrained, revision=revision)
        return model_info.sha
    except Exception as e:
        eval_logger.debug(
            f"Failed to get model SHA for {pretrained} at revision {revision}. Error: {e}"
        )
        return ""

model_info = {
    "model_num_parameters": get_model_num_params(self._model),
    "model_dtype": get_model_dtype(self._model),
    "model_revision": self.revision,
    "model_sha": get_model_sha(self.pretrained, self.revision),
}
```
Because `HfApi().model_info` only accepts a string `repo_id`, this fails whenever `self.pretrained` is a `transformers.PreTrainedModel` rather than a repo id.
It could be resolved by checking whether `self.pretrained` is a string before attempting the lookup, or by not fetching the SHA at all when a `PreTrainedModel` is passed.
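One way the guard might look, as a minimal sketch (the `isinstance` check and the plain `logging` setup are assumptions about the fix, not the project's actual patch):

```python
import logging

from huggingface_hub import HfApi

eval_logger = logging.getLogger(__name__)


def get_model_sha(pretrained, revision: str) -> str:
    # A string is a Hub repo id; anything else (e.g. a PreTrainedModel
    # passed directly to HFLM) has no repo to query, so skip the lookup
    # instead of handing a model object to HfApi().model_info().
    if not isinstance(pretrained, str):
        return ""
    try:
        model_info = HfApi().model_info(repo_id=pretrained, revision=revision)
        return model_info.sha
    except Exception as e:
        eval_logger.debug(
            f"Failed to get model SHA for {pretrained} at revision {revision}. Error: {e}"
        )
        return ""
```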