
Add Support for IA3 Adapters in add_weighted_adapter Method #1696

Closed

Abdullah-kwl opened this issue Apr 29, 2024 · 5 comments

@Abdullah-kwl

Feature request

I propose adding support for IA3 adapters in the add_weighted_adapter method of the PEFT library. IA3 adapters improve model adaptability with a minimal increase in trainable parameters, making them well suited to efficient fine-tuning across a range of tasks. Supporting them in add_weighted_adapter would let users combine IA3 adapters as seamlessly as LoRA adapters, expanding the library's capabilities.

Motivation

The motivation behind this proposal is to address a current limitation of the PEFT library: IA3 adapters themselves are supported, but merging them via the add_weighted_adapter method is not implemented. Attempting to merge IA3 adapters therefore fails with an error such as 'IA3Model' object has no attribute 'add_weighted_adapter'.

The ability to merge IA3 adapters is crucial for users who rely on the PEFT module for efficient model adaptation and fine-tuning. Without this functionality, users are unable to leverage the benefits of IA3 adapters, such as reduced memory footprint and customizable model tuning, within the PEFT architecture.

Adding add_weighted_adapter support for IA3 would let users combine IA3 adapters in their existing workflows, improving the adaptability and efficiency of their transformer models. This aligns with the broader goal of providing flexible, efficient tools for parameter-efficient fine-tuning.

Your contribution

I do not have a complete implementation in mind, but I suggest starting with an evaluation of the current add_weighted_adapter function to determine the necessary modifications for supporting IA3 adapters. Collaboration with researchers familiar with IA3 and PEFT could provide insights into feasible approaches.
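One plausible starting point, purely as a sketch and not PEFT's actual implementation: IA3 learns per-feature scaling vectors, so a weighted merge of several IA3 adapters could take an element-wise linear combination of those vectors, analogous to how add_weighted_adapter combines LoRA weights. The function name and plain-list representation below are illustrative assumptions only.

```python
def merge_ia3_vectors(adapters, weights):
    """Hypothetical sketch: combine IA3 scaling vectors element-wise.

    `adapters` is a list of same-length scaling vectors (plain lists here,
    standing in for the learned IA3 parameters); `weights` gives the
    mixing coefficient for each adapter.
    """
    if len(adapters) != len(weights):
        raise ValueError("adapters and weights must have the same length")
    merged = [0.0] * len(adapters[0])
    for vec, w in zip(adapters, weights):
        for i, v in enumerate(vec):
            merged[i] += w * v
    return merged

# Example: average two adapters' scaling vectors with equal weights.
print(merge_ia3_vectors([[1.0, 2.0], [3.0, 4.0]], [0.5, 0.5]))  # [2.0, 3.0]
```

Whether a simple linear combination is the right semantics for IA3 (whose vectors scale activations multiplicatively) is exactly the kind of question that would need input from people familiar with the method.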

@amyeroberts

cc @younesbelkada @pacman100

@younesbelkada
Collaborator

Thanks! This issue might be more appropriate to live in the PEFT repo. @amyeroberts, is it ok if I transfer the issue there?

@amyeroberts

@younesbelkada Of course - thanks for handling!

@younesbelkada younesbelkada transferred this issue from huggingface/transformers Apr 30, 2024
@younesbelkada
Collaborator

cc @BenjaminBossan who worked on IA3 🙏

@BenjaminBossan
Member

Duplicate of #1688.
