Code for the ACL 2022 paper "Continual Sequence Generation with Adaptive Compositional Modules"
CAMERO: Consistency Regularized Ensemble of Perturbed Language Models with Weight Sharing (ACL 2022)
Code and datasets for the paper "GPT Understands, Too", a novel method for tuning language models.
[arXiv] Cross-Modal Adapter for Text-Video Retrieval
Code for the EACL'23 paper "UDAPTER: Efficient Domain Adaptation Using Adapters"
Live training for open-source large models
CodeUp: A Multilingual Code Generation Llama2 Model with Parameter-Efficient Instruction-Tuning on a Single RTX 3090
This repository surveys papers on prompting and adapters for speech processing.
A plug-and-play library for parameter-efficient-tuning (Delta Tuning)
INTERSPEECH 2023 - Repurposing Whisper to recognize new tasks with adapters!
Research Trends in LLM-guided Multimodal Learning.
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
K-CAI NEURAL API - a Keras-based neural network API that lets you create parameter-efficient, memory-efficient, FLOPs-efficient multipath models with new layer types. Plenty of examples and documentation are included.
This repository contains the source code for the paper "Grouped Pointwise Convolutions Reduce Parameters in Convolutional Neural Networks".
A collection of parameter-efficient transfer learning papers focusing on computer vision and multimodal domains.
[CVPR2024] The code of "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"
On Transferability of Prompt Tuning for Natural Language Processing
Collection of tools and papers related to adapters / parameter-efficient transfer learning / fine-tuning
A Unified Library for Parameter-Efficient and Modular Transfer Learning
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.