Combining multiple LoRA adapters #429
Feature request

Is it possible to combine multiple LoRA adapters, similar to combining multiple style LoRAs with Stable Diffusion?

Motivation

I think we could get higher-quality model outputs by combining multiple LoRAs.

Your contribution

I'm not familiar with the codebase, but if this is architecturally possible and someone pointed me in the right direction, I'm happy to take a stab at it and create a PR.

Comments

Hi @winglian, this is possible today using adapter merging methods like TIES and DARE. You can also linearly combine adapters per request. Please take a look and let me know if you have any feedback!
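For readers landing on this thread, a minimal sketch of the merging the comment describes, assuming a Hugging Face PEFT setup (the thread doesn't name the library, and the model id, adapter paths, and adapter names below are placeholders): PEFT's `add_weighted_adapter` supports TIES and DARE combination types.

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Placeholder ids/paths -- substitute your own base model and LoRA adapters.
base = AutoModelForCausalLM.from_pretrained("base-model-id")
model = PeftModel.from_pretrained(base, "path/to/adapter_a", adapter_name="style_a")
model.load_adapter("path/to/adapter_b", adapter_name="style_b")

# TIES merge: low-magnitude weight deltas are pruned and sign conflicts
# between adapters are resolved before the weighted combination.
model.add_weighted_adapter(
    adapters=["style_a", "style_b"],
    weights=[0.7, 0.3],
    adapter_name="merged",
    combination_type="ties",  # "dare_ties", "dare_linear", "linear", ... also exist
    density=0.5,              # fraction of each adapter's weights to keep
)
model.set_adapter("merged")   # route inference through the merged adapter
```

Passing `combination_type="linear"` instead gives a plain weighted sum of the adapters, which is the simplest way to approximate the per-request linear combination the comment mentions: rebuild the merged adapter with new weights for each request.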