
Is fine-tuning of the qwen1.5-72b-chat-int4 version supported? #226

Open
ArlanCooper opened this issue Mar 27, 2024 · 1 comment
@ArlanCooper
Does this project support fine-tuning the qwen1.5-72b-chat-int4 version?
I tried LoRA fine-tuning of qwen1.5-72b-chat-int4 and got an error.

Command run:

python train.py --train_args_file train_args/sft/lora/qwen1.5-72b-int4-sft-lora.json

Error message:


2024-03-27 10:44:15.676 | INFO     | __main__:load_tokenizer:211 - vocab_size of tokenizer: 151643
2024-03-27 10:44:15.676 | INFO     | __main__:load_model:220 - Loading model from base model: /data/share/rwq/Qwen1.5-72B-Chat-GPTQ-Int4
2024-03-27 10:44:15.676 | INFO     | __main__:load_model:221 - Train model with lora
Traceback (most recent call last):
  File "/home/powerop/work/rwq/Firefly/train.py", line 400, in <module>
    main()
  File "/home/powerop/work/rwq/Firefly/train.py", line 385, in main
    trainer = init_components(args, training_args)
  File "/home/powerop/work/rwq/Firefly/train.py", line 338, in init_components
    components = load_model(args, training_args)
  File "/home/powerop/work/rwq/Firefly/train.py", line 246, in load_model
    model = AutoModelForCausalLM.from_pretrained(args.model_name_or_path, **model_kwargs)
  File "/home/powerop/work/conda/envs/firefly/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/home/powerop/work/conda/envs/firefly/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1124, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "/home/powerop/work/conda/envs/firefly/lib/python3.10/site-packages/transformers/configuration_utils.py", line 792, in from_dict
    logger.info(f"Model config {config}")
  File "/home/powerop/work/conda/envs/firefly/lib/python3.10/site-packages/transformers/configuration_utils.py", line 824, in __repr__
    return f"{self.__class__.__name__} {self.to_json_string()}"
  File "/home/powerop/work/conda/envs/firefly/lib/python3.10/site-packages/transformers/configuration_utils.py", line 938, in to_json_string
    config_dict = self.to_diff_dict()
  File "/home/powerop/work/conda/envs/firefly/lib/python3.10/site-packages/transformers/configuration_utils.py", line 834, in to_diff_dict
    config_dict = self.to_dict()
  File "/home/powerop/work/conda/envs/firefly/lib/python3.10/site-packages/transformers/configuration_utils.py", line 913, in to_dict
    self.quantization_config.to_dict()
AttributeError: 'NoneType' object has no attribute 'to_dict'
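For reference, the crash pattern in the traceback can be reproduced without loading the model. The frame at `configuration_utils.py:913` shows that `to_dict` calls `self.quantization_config.to_dict()` whenever the attribute exists, with no `None` check, so a config whose `quantization_config` ends up as `None` (e.g. when the installed transformers/auto-gptq stack fails to parse the GPTQ section of `config.json` — an assumption about the root cause here, not confirmed) raises exactly this `AttributeError`. The classes below are a simplified mimic of that logic, not the real transformers classes:

```python
# Hypothetical minimal sketch mimicking transformers'
# configuration_utils.to_dict behavior; FakeQuantConfig and
# FakeModelConfig are illustrative stand-ins, not real library classes.

class FakeQuantConfig:
    def to_dict(self):
        return {"quant_method": "gptq", "bits": 4}

class FakeModelConfig:
    def __init__(self, quantization_config=None):
        # If the quantization section of config.json is not recognized,
        # the attribute can be left as None instead of a config object.
        self.quantization_config = quantization_config

    def to_dict(self):
        out = {"model_type": "qwen2"}
        if hasattr(self, "quantization_config"):
            # Like the traceback's frame: .to_dict() is called
            # unconditionally, with no None check.
            out["quantization_config"] = self.quantization_config.to_dict()
        return out

FakeModelConfig(FakeQuantConfig()).to_dict()  # works: quant config serialized

try:
    FakeModelConfig(None).to_dict()  # reproduces the crash
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'to_dict'
```

If this diagnosis applies, upgrading the transformers/optimum/auto-gptq versions so the GPTQ `quantization_config` is parsed into a proper object would be the direction to investigate, but that is an assumption rather than a confirmed fix for this issue.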
@DayDreamChaser
LLaMA Factory can do it.
