Getting the model SHA always fails when passing an existing model to HFLM #3431

@wogns3623

Description

When passing an already-instantiated model to HFLM, it always logs the warning "WARNING Failed to get model SHA for <model_name> ...":

def get_model_sha(pretrained: str, revision: str) -> str:
    try:
        model_info = HfApi().model_info(repo_id=pretrained, revision=revision)
        return model_info.sha
    except Exception as e:
        eval_logger.debug(
            f"Failed to get model SHA for {pretrained} at revision {revision}. Error: {e}"
        )
        return ""

model_info = {
    "model_num_parameters": get_model_num_params(self._model),
    "model_dtype": get_model_dtype(self._model),
    "model_revision": self.revision,
    "model_sha": get_model_sha(self.pretrained, self.revision),
}

Because HfApi().model_info only accepts a string repo ID, the lookup raises whenever self.pretrained is a transformers.PreTrainedModel instance rather than a model name. It could be resolved by checking whether self.pretrained is a PreTrainedModel and skipping the lookup in that case, or by not fetching the SHA at all.
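A minimal sketch of the first option, assuming the guard is added inside get_model_sha itself; eval_logger is stubbed with the standard logging module here, and the huggingface_hub import is done lazily purely to keep the sketch self-contained:

```python
import logging

eval_logger = logging.getLogger("lm-eval")


def get_model_sha(pretrained, revision: str) -> str:
    """Return the Hub SHA for a repo ID, or "" when it cannot be resolved."""
    # HfApi().model_info only accepts a string repo ID, so bail out early
    # (without logging anything) when an already-instantiated model such as
    # a transformers.PreTrainedModel was passed in.
    if not isinstance(pretrained, str):
        return ""
    try:
        from huggingface_hub import HfApi  # lazy import, illustrative only

        model_info = HfApi().model_info(repo_id=pretrained, revision=revision)
        return model_info.sha
    except Exception as e:
        eval_logger.debug(
            f"Failed to get model SHA for {pretrained} at revision {revision}. Error: {e}"
        )
        return ""
```

With a guard like this, constructing HFLM from a preloaded model would simply record "model_sha": "" instead of emitting the spurious warning, while the string path is unchanged.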
