AdapterFusion: Non-Destructive Task Composition for Transfer Learning

AdapterFusion: non-destructive task composition in transfer learning.
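
A minimal PyTorch sketch of the fusion step, assuming the per-task adapters are already trained and frozen; the class and dimension names are illustrative, not taken from the paper's code:

```python
import torch
from torch import nn

class AdapterFusion(nn.Module):
    """Attention over the outputs of several frozen task adapters, keyed by
    the layer's hidden state, so adapters are composed without being changed."""
    def __init__(self, d_model: int = 768):
        super().__init__()
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)
        self.value = nn.Linear(d_model, d_model)

    def forward(self, hidden: torch.Tensor, adapter_outs: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq, d); adapter_outs: (batch, seq, n_adapters, d)
        q = self.query(hidden).unsqueeze(2)          # (batch, seq, 1, d)
        k = self.key(adapter_outs)                   # (batch, seq, n_adapters, d)
        attn = (q * k).sum(-1).softmax(dim=-1)       # weights over adapters
        return (attn.unsqueeze(-1) * self.value(adapter_outs)).sum(dim=2)

fusion = AdapterFusion()
out = fusion(torch.randn(2, 5, 768), torch.randn(2, 5, 3, 768))
print(out.shape)  # torch.Size([2, 5, 768])
```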

P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks

P-Tuning v2: prompt tuning can match fine-tuning universally across scales and tasks.
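
A minimal PyTorch sketch of the deep-prompt idea (trainable prompts injected at every layer, not just the input); layer counts and sizes are illustrative assumptions:

```python
import torch
from torch import nn

class DeepPrompts(nn.Module):
    """Independent trainable key/value prompts for every Transformer layer,
    in the spirit of P-Tuning v2. Sizes are illustrative, not from the paper."""
    def __init__(self, n_layers=12, prompt_len=16, n_heads=12, d_head=64):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(n_layers, n_heads, prompt_len, d_head) * 0.02)
        self.values = nn.Parameter(torch.randn(n_layers, n_heads, prompt_len, d_head) * 0.02)

    def forward(self, batch_size: int, layer: int):
        # Expand this layer's prompts across the batch; a frozen model attends
        # to them alongside its own keys/values (e.g. via past_key_values).
        k = self.keys[layer].unsqueeze(0).expand(batch_size, -1, -1, -1)
        v = self.values[layer].unsqueeze(0).expand(batch_size, -1, -1, -1)
        return k, v
```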

GPT Understands, Too

P-Tuning: GPT can also excel at natural language understanding tasks.
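
A minimal PyTorch sketch of the P-Tuning prompt encoder, which maps pseudo-token embeddings through an LSTM and MLP to continuous prompt vectors; hyperparameters here are illustrative:

```python
import torch
from torch import nn

class PTuningEncoder(nn.Module):
    """P-Tuning style prompt encoder: pseudo-token embeddings are passed
    through a bidirectional LSTM and an MLP to produce continuous prompts."""
    def __init__(self, n_tokens=8, d_model=768, d_hidden=384):
        super().__init__()
        self.embed = nn.Embedding(n_tokens, d_model)
        self.lstm = nn.LSTM(d_model, d_hidden, bidirectional=True, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(2 * d_hidden, d_model), nn.ReLU(),
                                 nn.Linear(d_model, d_model))
        self.n_tokens = n_tokens

    def forward(self) -> torch.Tensor:
        idx = torch.arange(self.n_tokens).unsqueeze(0)   # (1, n_tokens)
        hidden, _ = self.lstm(self.embed(idx))           # (1, n_tokens, 2*d_hidden)
        return self.mlp(hidden).squeeze(0)               # (n_tokens, d_model) prompts
```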

The Power of Scale for Parameter-Efficient Prompt Tuning

Prompt Tuning: parameter-efficient prompt tuning that becomes competitive with full fine-tuning as model scale grows.
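
A minimal PyTorch sketch of soft prompt tuning, where only the prepended prompt vectors are trained and the backbone stays frozen; names and sizes are illustrative:

```python
import torch
from torch import nn

class SoftPrompt(nn.Module):
    """Trainable prompt vectors prepended to the (frozen) input embeddings."""
    def __init__(self, n_tokens: int = 20, d_model: int = 768):
        super().__init__()
        self.prompt = nn.Parameter(torch.randn(n_tokens, d_model) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, d_model) from a frozen embedding layer
        prompt = self.prompt.unsqueeze(0).expand(input_embeds.size(0), -1, -1)
        return torch.cat([prompt, input_embeds], dim=1)

soft_prompt = SoftPrompt()
print(soft_prompt(torch.randn(2, 10, 768)).shape)  # torch.Size([2, 30, 768])
```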

Prefix-Tuning: Optimizing Continuous Prompts for Generation

Prefix-Tuning: optimizing continuous prompts for generation.
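
A minimal PyTorch sketch of the reparameterized prefix, where a small MLP produces per-layer key/value activations; the module and dimensions are illustrative assumptions:

```python
import torch
from torch import nn

class PrefixEncoder(nn.Module):
    """Prefix-Tuning style reparameterization: prefix embeddings are mapped
    by an MLP to per-layer key/value activations. Sizes are illustrative."""
    def __init__(self, prefix_len=10, n_layers=12, d_model=768, d_hidden=512):
        super().__init__()
        self.embed = nn.Embedding(prefix_len, d_model)
        # 2 * n_layers * d_model covers one key and one value vector per layer.
        self.mlp = nn.Sequential(
            nn.Linear(d_model, d_hidden),
            nn.Tanh(),
            nn.Linear(d_hidden, 2 * n_layers * d_model),
        )
        self.prefix_len = prefix_len

    def forward(self, batch_size: int) -> torch.Tensor:
        idx = torch.arange(self.prefix_len)
        kv = self.mlp(self.embed(idx))      # (prefix_len, 2 * n_layers * d_model)
        # Reshaped downstream into per-layer past key/values for a frozen LM;
        # only this module's parameters are updated during training.
        return kv.unsqueeze(0).expand(batch_size, -1, -1)
```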

BitFit: Simple Parameter-efficient Fine-tuning for Transformer-based Masked Language-models

BitFit: simple parameter-efficient fine-tuning for Transformer-based masked language models.
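
A minimal PyTorch sketch of the BitFit recipe, freezing everything except bias terms; the demo model below is a toy stand-in for a pretrained masked LM:

```python
from torch import nn

def apply_bitfit(model: nn.Module) -> None:
    """Freeze every parameter whose name does not contain 'bias'."""
    for name, param in model.named_parameters():
        param.requires_grad = "bias" in name

# Toy stand-in; in practice `model` would be a pretrained masked LM such as BERT.
model = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
apply_bitfit(model)
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable} / {total}")
```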