model merge_and_unload does not support layer_replication #1707
Comments
This is not easily possible. The reason is that those replicated layers share the underlying base weights between multiple layers. Therefore, we cannot merge LoRA weights, as different LoRA weights would be merged into the base weights, resulting in incorrect outputs.
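To illustrate why this produces wrong results, here is a toy sketch (purely illustrative, not PEFT code): two replicated layers share one base weight tensor, but each has its own LoRA delta, so no single merged tensor is correct for both.

```python
import torch

# One base weight tensor shared by two replicated layers.
W = torch.randn(4, 4)
delta_a = torch.randn(4, 4)  # LoRA delta learned for replica A
delta_b = torch.randn(4, 4)  # LoRA delta learned for replica B

# Merging both deltas into the single shared tensor:
W_merged = W + delta_a + delta_b

# Replica A should compute with (W + delta_a) and replica B with
# (W + delta_b); W_merged matches neither, so both layers' outputs
# would be incorrect after merging.
```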
What I mean is: when I create a LoRA with layer_replication to expand the blocks from 22 to 32, I merge and save it, but when I load the output model, I find there are only 22 blocks in the final model.
As mentioned, merging with layer replication isn't really possible. Also, when you load the model, make sure that you first load the base model, then the LoRA adapter using PeftModel.from_pretrained(...). This should restore the replicated layers.
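A minimal sketch of that loading order (the model and adapter paths are hypothetical placeholders):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# First load the original base model (22 blocks in this example).
base = AutoModelForCausalLM.from_pretrained("path/to/base-model")

# Loading the adapter re-applies layer_replication from the adapter's
# config, so the wrapped model has the expanded block structure again.
model = PeftModel.from_pretrained(base, "path/to/lora-adapter")
```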
Sorry, what I mean is that I want to load the base model and the LoRA model, then merge_and_unload, and get a new 1.5B model with 32 blocks, not the original 1B model with 22 blocks.
This is not really an option right now with PEFT. I guess what you could try is to create clones of the weights that are currently being shared, edit the model so that each replicated layer has its own copy, and then merge.
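A rough sketch of that workaround idea, not a tested recipe: the attribute path to the decoder blocks and the config attribute name are architecture-dependent assumptions (shown here for a Llama-style model).

```python
import copy

# Deep-copy every decoder block so replicated blocks stop sharing
# parameters; the path to the block list depends on the architecture.
layers = model.base_model.model.model.layers
for i in range(len(layers)):
    layers[i] = copy.deepcopy(layers[i])

# Each LoRA now merges into its own independent weight copy.
merged = model.merge_and_unload()

# The saved config may still report the original block count, so update
# it before saving (attribute name assumed for Llama-style configs).
merged.config.num_hidden_layers = len(layers)
merged.save_pretrained("path/to/expanded-merged-model")
```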
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
System Info
When you train a model with layer_replication in LoraConfig, you will find that it cannot be merged into the base model correctly.
Who can help?
No response
Information
Tasks
An officially supported task in the examples folder
Reproduction
Just set layer_replication in LoraConfig, train a sample LoRA, and merge it into the base model.
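A minimal reproduction sketch under assumptions (the model name, target modules, and replication ranges are illustrative, not from the report):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Hypothetical 1B base model with 22 decoder blocks.
base = AutoModelForCausalLM.from_pretrained("path/to/1b-base-model")

config = LoraConfig(
    r=8,
    target_modules=["q_proj", "v_proj"],
    # Expand 22 blocks to 32 by repeating ranges of existing blocks;
    # the replicated blocks share the underlying base weights.
    layer_replication=[[0, 16], [6, 22]],
)
model = get_peft_model(base, config)

# ... train the LoRA adapter ...

# Merging collapses back to the shared base weights, so the saved model
# ends up with the original 22 blocks instead of 32.
merged = model.merge_and_unload()
merged.save_pretrained("path/to/merged-model")
```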
Expected behavior
Generate a modeling_config.py script that can work properly with layer_replication.