This repository has been archived by the owner on Oct 9, 2024. It is now read-only.

fix checkpoints file list to align with DeepSpeed #71

Open

wants to merge 1 commit into base: main

Conversation


@dc3671 commented on Mar 24, 2023

When using glob.glob(f"{self.model_path}/*.bin"), every path in the returned list contains the model_path prefix. Passing model_path as root_dir instead returns bare file names, which aligns with how DeepSpeed loads the checkpoints (replace_module.py):

sd = [
    torch.load(os.path.join(base_dir1, checkpoint[i]),
               map_location='cpu')
]

Here base_dir1 already supplies the model_path, so keeping the prefix in the checkpoint list would duplicate it in the joined path.
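
For illustration, a minimal sketch of the difference (assuming Python 3.10+, where glob.glob accepts the root_dir keyword; the model_path value below is a placeholder, not the repo's actual directory):

import glob
import os

model_path = "/models/bloom"  # placeholder checkpoint directory

# Without root_dir: every returned path carries the model_path prefix.
with_prefix = glob.glob(f"{model_path}/*.bin")
# e.g. ['/models/bloom/pytorch_model-00001.bin', ...]

# With root_dir (Python 3.10+): only the bare file names are returned.
names_only = glob.glob("*.bin", root_dir=model_path)
# e.g. ['pytorch_model-00001.bin', ...]

# DeepSpeed joins its base directory onto each checkpoint entry itself,
# so bare names avoid duplicating the prefix:
full_paths = [os.path.join(model_path, name) for name in names_only]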

Please help review, @mayank31398. Thanks!

@mayank31398
Collaborator

Hi, this repo is no longer maintained
