SAU: Smooth activation function using convolution with approximate identities. SAU: constructs a smooth approximation of an activation function via the Dirac function (approximate identity).
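The idea behind SAU can be sketched numerically: convolving a non-smooth activation (here Leaky ReLU) with a narrow Gaussian, which serves as an approximate identity for the Dirac delta, yields a smooth function that converges back to the original as the kernel width shrinks. This is an illustrative sketch under my own assumptions (function names, grid, and the σ value are hypothetical), not the paper's implementation.

```python
import numpy as np

def leaky_relu(x, alpha=0.1):
    # Non-smooth activation to be smoothed.
    return np.where(x >= 0, x, alpha * x)

def smooth_activation(f, x, sigma=0.25):
    """Smooth f by convolving it with a Gaussian of width sigma.

    As sigma -> 0 the Gaussian tends to a Dirac delta (an approximate
    identity), so the smoothed function converges back to f.
    """
    t = np.linspace(-10.0, 10.0, 4001)          # integration grid
    kernel = np.exp(-t**2 / (2.0 * sigma**2))   # unnormalized Gaussian
    kernel /= kernel.sum()                      # discrete probability weights
    # (f * phi_sigma)(x) = integral of f(x - t) * phi_sigma(t) dt
    return np.array([np.sum(f(xi - t) * kernel) for xi in np.atleast_1d(x)])

y = smooth_activation(leaky_relu, [-5.0, 0.0, 5.0])
# Far from the kink at 0 the smoothed function matches Leaky ReLU;
# near 0 it rounds the corner smoothly.
```

Away from the origin the Gaussian mass sees only one linear branch, so the smoothed values agree with Leaky ReLU; at the kink the convolution produces a small positive offset of roughly (1 - alpha) * sigma / sqrt(2*pi).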
Orthogonal-Padé Activation Functions: Trainable activation functions for smooth and faster convergence in deep networks. OPAU: trainable activation functions based on orthogonal Padé approximation.