Parameter-Efficient Transfer Learning
Phase 4, Session 12. Speaker: 蒋昊峻 (Jiang Haojun)
Outline
A Brief Introduction to Parameter-Efficient Transfer Learning
Adapter
- Houlsby, Neil, et al. “Parameter-efficient transfer learning for NLP.” ICML 2019.
- Mahabadi, Rabeeh Karimi, et al. “Compacter: Efficient low-rank hypercomplex adapter layers.” NeurIPS 2021.
- Pfeiffer, Jonas, et al. “AdapterFusion: Non-destructive task composition for transfer learning.” EACL 2021.
- Mahabadi, Rabeeh Karimi, et al. “Parameter-efficient multi-task fine-tuning for transformers via shared hypernetworks.” ACL 2021 Oral.
- Rücklé, Andreas, et al. “AdapterDrop: On the efficiency of adapters in transformers.” EMNLP 2021.
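
For orientation, the sketch below shows the kind of bottleneck adapter popularized by Houlsby et al.: a down-projection, a nonlinearity, an up-projection, and a residual connection, inserted into each Transformer layer while the pretrained weights stay frozen. Class and argument names here are illustrative, not taken from any official implementation.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Minimal Houlsby-style adapter (illustrative sketch):
    down-project -> nonlinearity -> up-project -> residual."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        # Near-identity initialization so training starts close to the frozen backbone.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))
```

In practice the backbone parameters are frozen (`requires_grad = False`) and only the adapters, layer norms, and the task head are updated, which is where the parameter efficiency comes from.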
Adapter in CV
- Chen, Hao, et al. “Conv-Adapter: Exploring Parameter Efficient Transfer Learning for ConvNets.” arXiv:2208.07463.
- Pan, Junting, et al. “ST-Adapter: Parameter-Efficient Image-to-Video Transfer Learning for Action Recognition.” arXiv:2206.13559.
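
The convolutional counterpart follows the same residual-bottleneck idea. The snippet below is only a rough analogue (1x1 down-projection, depthwise 3x3 convolution, 1x1 up-projection, residual), not the exact architecture of Conv-Adapter or ST-Adapter; all names and the reduction ratio are assumptions for illustration.

```python
import torch
import torch.nn as nn

class ConvAdapter(nn.Module):
    """Illustrative convolutional adapter: pointwise down-projection,
    depthwise 3x3 convolution, nonlinearity, pointwise up-projection, residual."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        mid = max(channels // reduction, 1)
        self.down = nn.Conv2d(channels, mid, kernel_size=1)
        self.dw = nn.Conv2d(mid, mid, kernel_size=3, padding=1, groups=mid)
        self.act = nn.GELU()
        self.up = nn.Conv2d(mid, channels, kernel_size=1)
        # Zero-init the up-projection so the adapter starts as an identity mapping.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.dw(self.down(x))))
```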
Prompt Tuning
- Li, Xiang Lisa, and Percy Liang. “Prefix-tuning: Optimizing continuous prompts for generation.” ACL 2021.
- Lester, Brian, Rami Al-Rfou, and Noah Constant. “The power of scale for parameter-efficient prompt tuning.” EMNLP 2021.
- Zhou, Kaiyang, et al. “Learning to prompt for vision-language models.” IJCV 2022.
- Jia, Menglin, et al. “Visual prompt tuning.” ECCV 2022.
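
As a minimal sketch of soft prompt tuning in the spirit of Lester et al.: a small matrix of learnable prompt vectors is prepended to the frozen model's input embeddings, and only those vectors are trained. The module name and the embedding interface below are assumptions for illustration.

```python
import torch
import torch.nn as nn

class SoftPrompt(nn.Module):
    """Illustrative soft prompt: learnable vectors prepended to input embeddings."""

    def __init__(self, num_prompt_tokens: int, embed_dim: int):
        super().__init__()
        # The only trainable parameters; the backbone stays frozen.
        self.prompt = nn.Parameter(torch.randn(num_prompt_tokens, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim) from the frozen embedding layer.
        batch = input_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prompt, input_embeds], dim=1)
```

The attention mask must be extended by `num_prompt_tokens` positions. Prefix-tuning differs in that it injects learned key/value prefixes into every attention layer rather than only at the input, and VPT applies the same idea to Vision Transformer patch tokens.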
Video & Slides
- Meeting recording & slides: https://meeting.tencent.com/v2/cloud-record/share?id=deabf758-ab6b-46b2-aa3f-2e02848c0970&from=3