Error when I add new tokens to the tokenizer #237
Comments
@charlesCXK Oh I think you'll have to add modules_to_save. I haven't yet fixed some parts, so hopefully I'll fix this by today! Sorry for the delay!
@danielhanchen Thanks for your reply! I think the core problem is not related to adding modules_to_save. We can see that the model is already saved ("…").
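For reference, the `modules_to_save` setting being discussed here is a `peft` `LoraConfig` option that saves full copies of the named modules alongside the adapter. A sketch, assuming a Llama-style module layout (the module names and hyperparameters below are illustrative, not from this thread):

```python
# Sketch only: assumes peft is installed and a Llama-style module layout.
from peft import LoraConfig

config = LoraConfig(
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    # Save the full (resized) embedding matrix and output head with the
    # adapter, so the extra token rows survive the save/load round trip.
    modules_to_save=["embed_tokens", "lm_head"],
)
```

Without this, only the LoRA weights are written out, and the resized embedding rows for the new tokens are lost.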
Dear author,
@charlesCXK Oh thanks!! So sorry again about the issue! I'll take a look at your PR - thanks so much again!
Hi, bumping this up again! I added a new token to the tokenizer. Now I want to load my LoRA checkpoint using …
@danielhanchen Would you mind reviewing charlesCXK's PR?
@charlesCXK @chtmp223 Whoops, I actually totally missed this, but now using …
Hi,
I want to add new tokens to the tokenizer through:
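The exact snippet was not captured in this scrape. The usual sequence with Hugging Face `transformers` is `tokenizer.add_tokens([...])` followed by `model.resize_token_embeddings(len(tokenizer))`; the stdlib-only mock below illustrates that semantics (all class names here are hypothetical stand-ins, not the real library):

```python
# Stdlib-only mock of the add-then-resize sequence. With transformers you
# would call tokenizer.add_tokens([...]) and then
# model.resize_token_embeddings(len(tokenizer)).
class MockTokenizer:
    def __init__(self, vocab_size):
        self.tokens = [f"<tok_{i}>" for i in range(vocab_size)]

    def add_tokens(self, new_tokens):
        added = [t for t in new_tokens if t not in self.tokens]
        self.tokens.extend(added)
        return len(added)  # transformers also returns the number added

    def __len__(self):
        return len(self.tokens)


class MockModel:
    def __init__(self, vocab_size, hidden=8):
        # One embedding row per vocabulary entry.
        self.embed = [[0.0] * hidden for _ in range(vocab_size)]

    def resize_token_embeddings(self, new_size):
        hidden = len(self.embed[0])
        while len(self.embed) < new_size:
            self.embed.append([0.0] * hidden)  # new rows freshly initialized
        self.embed = self.embed[:new_size]


tokenizer = MockTokenizer(32000)
model = MockModel(32000)

tokenizer.add_tokens(["<special_1>", "<special_2>"])
model.resize_token_embeddings(len(tokenizer))
print(len(tokenizer), len(model.embed))  # 32002 32002
```

After this, the model's embedding and output head have 32000 + n rows, which is exactly what makes the saved checkpoint incompatible with a freshly constructed 32000-row model.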
Then I save the model as LoRA adapters through:
When I load the model, the error occurs:
It seems that the saved checkpoint does not match the pre-defined model architecture (with 32000-d output). What should I do to solve this issue?
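That diagnosis matches the shapes: the checkpoint stores an embedding with 32000 + n rows, while the freshly constructed base model still allocates 32000, so a strict state-dict load rejects it. A minimal sketch of the mismatch (hypothetical row counts; the real check happens inside the framework's strict weight loading):

```python
# Why the load fails: checkpoint rows != model rows after adding tokens.
SAVED_ROWS = 32002   # embedding saved after adding 2 new tokens
MODEL_ROWS = 32000   # freshly constructed model, before resizing


def strict_load(saved_rows, expected_rows):
    """Mimic a strict shape check during weight loading."""
    if saved_rows != expected_rows:
        raise ValueError(
            f"size mismatch: checkpoint has {saved_rows} embedding rows, "
            f"model expects {expected_rows}"
        )


try:
    strict_load(SAVED_ROWS, MODEL_ROWS)
except ValueError as e:
    print(e)
```

The common fix is to resize the base model's embeddings to the new vocabulary size (and restore the saved embedding/head weights, e.g. via modules_to_save) before attaching the LoRA adapter.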
Thanks,