Parameter-Efficient Transfer Learning with Diff Pruning

Parameter-efficient transfer learning via Diff Pruning: each task learns only a sparse, task-specific diff vector on top of the frozen pretrained parameters.
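
A minimal sketch of the idea for a single linear layer, assuming a PyTorch backbone: the pretrained weight is frozen and only a diff tensor is trained. The paper enforces sparsity with a relaxed L0 penalty (hard-concrete gates); the L1 penalty below is a simpler stand-in, and the class and method names are illustrative, not from the paper's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiffPrunedLinear(nn.Module):
    """Frozen pretrained linear layer plus a trainable, sparsity-regularized diff."""

    def __init__(self, base: nn.Linear):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the pretrained weights stay frozen
        self.diff = nn.Parameter(torch.zeros_like(base.weight))

    def forward(self, x):
        # effective weight = frozen pretrained weight + task-specific diff
        return F.linear(x, self.base.weight + self.diff, self.base.bias)

    def sparsity_penalty(self):
        # L1 stand-in for the paper's relaxed-L0 regularizer;
        # add lambda * this term to the task loss
        return self.diff.abs().sum()
```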

DensePose From WiFi

Dense human pose estimation from WiFi signals instead of cameras.
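
A very rough skeleton to orient the reader: WiFi channel state information (CSI) is first translated into an image-like feature map, which a DensePose-style head then decodes into dense body-surface predictions. All module names, shapes, and layer sizes below are illustrative placeholders, not the paper's configuration.

```python
import torch
import torch.nn as nn

class WiFiDensePoseSketch(nn.Module):
    """Rough two-stage skeleton: CSI -> image-like features -> dense pose head."""

    def __init__(self):
        super().__init__()
        # modality translation: flatten CSI (antennas x antennas x subcarriers x time,
        # sizes here are made up) into a small spatial feature map
        self.translate = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 3 * 30 * 5, 24 * 24),
            nn.Unflatten(1, (1, 24, 24)),
            nn.Conv2d(1, 64, 3, padding=1),
            nn.ReLU(),
        )
        # stand-in for a DensePose-RCNN-style head (24 body parts + background)
        self.densepose_head = nn.Conv2d(64, 25, 1)

    def forward(self, csi):
        return self.densepose_head(self.translate(csi))
```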

LoRA: Low-Rank Adaptation of Large Language Models

LoRA: low-rank adaptation of large language models; the pretrained weights are frozen and only small low-rank update matrices are trained.
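
A minimal sketch of the mechanism for one linear layer, assuming PyTorch; class and parameter names are illustrative, not the official loralib API. The output is W x plus a scaled low-rank correction B A x, with only A and B trainable.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen pretrained linear layer plus a trainable low-rank update."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # pretrained weights stay frozen
        # A is Gaussian-initialized, B starts at zero, so training begins
        # from a model identical to the pretrained one
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x):
        # W x + (alpha / r) * B A x
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)
```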

AdapterDrop: On the Efficiency of Adapters in Transformers

AdapterDrop: improving the efficiency of adapter modules in Transformers, chiefly by dropping the adapters of lower layers during training and inference.
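
A sketch of the two pieces, assuming PyTorch: a standard bottleneck adapter, and a loop that skips the adapters of the lowest layers. The attention/FFN sublayers that surround each adapter in a real Transformer are abstracted away here, and all names are illustrative.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""

    def __init__(self, d_model: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)

    def forward(self, h):
        return h + self.up(torch.relu(self.down(h)))

def run_adapters(hidden, adapters, drop_first_n=0):
    """AdapterDrop's core trick, sketched: skip the adapters in the lowest
    `drop_first_n` layers to save compute with little loss in accuracy."""
    for i, adapter in enumerate(adapters):
        if i >= drop_first_n:
            hidden = adapter(hidden)
    return hidden
```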

AdapterFusion: Non-Destructive Task Composition for Transfer Learning

AdapterFusion: non-destructive task composition for transfer learning, combining the knowledge of multiple trained task adapters through a learned attention mechanism.
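
A sketch of the fusion layer, assuming PyTorch: the layer's hidden state queries the outputs of several already-trained, frozen task adapters, and their softmax-weighted combination is folded back in. The names and the residual connection are illustrative simplifications of the paper's formulation.

```python
import torch
import torch.nn as nn

class AdapterFusion(nn.Module):
    """Attention over the outputs of several frozen task adapters;
    only the fusion projections are trained."""

    def __init__(self, d_model: int):
        super().__init__()
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)
        self.value = nn.Linear(d_model, d_model)

    def forward(self, hidden, adapter_outputs):
        # hidden: (batch, seq, d_model)
        # adapter_outputs: (n_adapters, batch, seq, d_model), stacked by the caller
        q = self.query(hidden)
        k = self.key(adapter_outputs)
        v = self.value(adapter_outputs)
        scores = (q.unsqueeze(0) * k).sum(-1)        # (n_adapters, batch, seq)
        weights = torch.softmax(scores, dim=0)       # attend over adapters
        fused = (weights.unsqueeze(-1) * v).sum(0)   # (batch, seq, d_model)
        return hidden + fused
```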

P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks

P-Tuning v2: prompt tuning can be comparable to fine-tuning universally across scales and tasks; the key change is inserting trainable prompts at every layer (deep prompt tuning) rather than only at the input.
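
A sketch of the parameter side of deep prompt tuning, assuming PyTorch: a trainable prefix per Transformer layer, split into keys and values to be prepended to that layer's attention while the backbone stays frozen. Wiring the returned tensors into an actual attention implementation is model-specific and omitted; all names are illustrative.

```python
import torch
import torch.nn as nn

class DeepPromptEncoder(nn.Module):
    """One trainable prefix table per layer, as in deep prompt tuning."""

    def __init__(self, n_layers: int, prompt_len: int, d_model: int):
        super().__init__()
        # per layer: (prompt_len, 2 * d_model), half keys and half values
        self.prompts = nn.Parameter(
            torch.randn(n_layers, prompt_len, 2 * d_model) * 0.02
        )

    def layer_prefix(self, layer: int, batch_size: int):
        kv = self.prompts[layer].expand(batch_size, -1, -1)
        k, v = kv.chunk(2, dim=-1)  # each (batch, prompt_len, d_model)
        # concatenate these before the layer's own K and V in attention
        return k, v
```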