
fix: best checkpoint not found issue #622

Open

Mai0313 wants to merge 1 commit into main
Conversation


@Mai0313 commented Dec 5, 2023

What does this PR do?

This PR addresses the issue where Lightning saves the "best ckpt" with an absolute path in each checkpoint. These checkpoints are not copied during fine-tuning or continued training, which causes failures if the "best ckpt" has been deleted or moved.

With this change, the system falls back to the current weights for testing if the "best ckpt" is not found, so fine-tuning and continued-training runs can complete smoothly.

The changes made are as follows:

if ckpt_path == "":
    log.warning("Best ckpt not found! Using current weights for testing...")
    ckpt_path = None

to

if not ckpt_path:
    log.warning("Best ckpt not found! Using current weights for testing...")
    ckpt_path = None
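
For context, checking not ckpt_path treats both an empty string and None as "best checkpoint not found", whereas the original comparison only caught the empty string. Below is a minimal sketch of where such a check typically sits in the test stage of a lightning-hydra-template-style train.py; the run_test_stage wrapper and variable names are illustrative assumptions, not part of this PR.

import logging

log = logging.getLogger(__name__)

def run_test_stage(trainer, model, datamodule):
    # best_model_path is "" when no best checkpoint has been recorded;
    # the falsy check falls back to the current in-memory weights.
    ckpt_path = trainer.checkpoint_callback.best_model_path
    if not ckpt_path:
        log.warning("Best ckpt not found! Using current weights for testing...")
        ckpt_path = None
    trainer.test(model=model, datamodule=datamodule, ckpt_path=ckpt_path)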

Fixes #621 (Issue with "best ckpt" absolute paths during fine-tuning and continue training)

Before submitting

  • Did you make sure the title is self-explanatory and the description concisely explains the PR?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you test your PR locally with the pytest command?
  • Did you run the pre-commit hooks with the pre-commit run -a command?

Did you have fun?

Make sure you had fun coding 🙃
