How do I resume training? #56
As the title says, in train.py I tried adding the following after pl_model = MyTransformer(**transformer_args):

from deep_training.nlp.models.petl.lora.petl_model import PetlModel
PetlModel.from_pretrained(pl_model.backbone, os.path.join(output_weight_dir, "last"))

but it raised an error.

Reply (chatglm2_finetuning/training/train_pl.py, line 115 at bf9d82a):
Take a look here. For LoRA, just fill in the corresponding path.

Reply:
@ssbuild Thanks, that solved it. I used pl_model.load_sft_weight('./best_ckpt/last', is_trainable=True) // without adapter_model.bin
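For readers hitting the same problem, the pattern behind the accepted answer (load the base weights, overlay the saved LoRA adapter weights, and keep only the adapter parameters trainable, which is what is_trainable=True requests) can be sketched as follows. This is a toy re-implementation over plain dicts for illustration only; the function name load_sft_weight is borrowed from the thread, and it is NOT the real deep_training API.

```python
# Toy sketch of resuming LoRA fine-tuning, NOT the real deep_training API:
# merge saved adapter weights over the base weights, and report which
# parameters should stay trainable so training can continue.

def load_sft_weight(base_weights, adapter_weights, is_trainable=True):
    """Overlay adapter weights onto base weights.

    Returns (merged, trainable): the merged parameter dict, and the set
    of adapter parameter names that should keep requires_grad=True when
    is_trainable is set (empty set for inference-only loading).
    """
    merged = dict(base_weights)
    merged.update(adapter_weights)  # adapter params win on name overlap
    trainable = set(adapter_weights) if is_trainable else set()
    return merged, trainable


if __name__ == "__main__":
    # Hypothetical parameter names for illustration.
    base = {"transformer.wte": 0.1, "lora_A.weight": 0.0}
    adapter = {"lora_A.weight": 0.5, "lora_B.weight": 0.3}
    merged, trainable = load_sft_weight(base, adapter, is_trainable=True)
    print(sorted(trainable))  # only the adapter params stay trainable
```

The design point mirrors the thread: pointing the loader at the checkpoint directory (e.g. './best_ckpt/last') with is_trainable=True restores the adapter state with gradients enabled, whereas loading without it would freeze everything for inference.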