Polarized Self-Attention: Towards High-quality Pixel-wise Regression

DMSANet: Dual Multi Scale Attention Network

SimAM: A Simple, Parameter-Free Attention Module for Convolutional Neural Networks

Residual Attention: A Simple but Effective Method for Multi-Label Recognition

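In its simplest variant, residual attention for multi-label recognition adds a max-pooled, class-specific spatial response on top of the usual average-pooled logit. A hedged sketch under that reading, assuming `scores` holds per-class spatial score maps flattened to `(B, num_classes, HW)` (the name `residual_attention` and the default `lam` are illustrative):

```python
import numpy as np

def residual_attention(scores, lam=0.2):
    """scores: (B, num_classes, HW) class-specific spatial score maps."""
    base = scores.mean(axis=2)   # global average pooling logit
    att = scores.max(axis=2)     # strongest spatial response per class
    return base + lam * att      # residual attention logit
```

The max term lets small, localized objects contribute to their class logit even when average pooling washes them out.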
Locality Preserving Projections (LPP).

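LPP finds a linear projection that preserves a neighborhood graph built on the data: it minimizes the weighted squared distance between projected neighbors, which reduces to the generalized eigenproblem X^T L X a = λ X^T D X a over the graph Laplacian L. A pure-NumPy sketch, assuming a heat-kernel k-NN graph (the function name, `k`, and `t` are illustrative choices):

```python
import numpy as np

def lpp(X, n_components=2, k=5, t=1.0):
    """Locality Preserving Projections of X (n_samples, n_features)."""
    n = X.shape[0]
    # pairwise squared distances and heat-kernel weights over k nearest neighbours
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]            # skip self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / t)
    W = np.maximum(W, W.T)                           # symmetrise the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                        # graph Laplacian
    # generalised eigenproblem  X^T L X a = lambda X^T D X a,
    # solved by Cholesky whitening of the right-hand matrix
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-9 * np.eye(X.shape[1])      # regularise for stability
    Lc = np.linalg.cholesky(B)
    Linv = np.linalg.inv(Lc)
    _, U = np.linalg.eigh(Linv @ A @ Linv.T)         # eigenvalues ascending
    vecs = Linv.T @ U
    # project onto the directions with the smallest eigenvalues
    return X @ vecs[:, :n_components]
```

Unlike nonlinear methods such as Laplacian eigenmaps, the learned map is linear, so new samples can be projected without recomputing the graph.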
Sluice networks: Learning what to share between loosely related tasks

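The core sharing mechanism in sluice networks is a set of learned α coefficients that linearly mix the hidden states of loosely related tasks at each layer. A minimal sketch of that mixing step for two tasks (the function name is illustrative, and the full model also learns subspace splits and skip-connection weights not shown here):

```python
import numpy as np

def sluice_mix(h_a, h_b, alpha):
    """Mix two tasks' hidden states with a learned 2x2 alpha matrix.

    Row i of alpha gives task i's new state as a weighted sum of
    both tasks' current states, so sharing strength is learned per layer.
    """
    new_a = alpha[0, 0] * h_a + alpha[0, 1] * h_b
    new_b = alpha[1, 0] * h_a + alpha[1, 1] * h_b
    return new_a, new_b
```

With `alpha` initialized near the identity, the tasks start almost independent and the training signal decides how much to share.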