SMU: Smooth activation function for deep networks using smoothing maximum technique (a smooth activation built from a smooth approximation of the maximum function).
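As a rough illustration of the smoothing-maximum idea behind SMU: the maximum of two values can be approximated smoothly as max(a, b) ≈ ((a + b) + (a − b)·erf(μ(a − b)))/2, and applying this to Leaky ReLU (max(x, αx)) yields a smooth activation. This is a minimal sketch; the default values of `alpha` and `mu` below are illustrative (in the paper μ is a trainable parameter), not the paper's exact training setup.

```python
import math

def smu(x: float, alpha: float = 0.25, mu: float = 1.0) -> float:
    """Smooth approximation of Leaky ReLU via the smooth-maximum trick.

    Uses max(a, b) ~= ((a + b) + (a - b) * erf(mu * (a - b))) / 2
    with a = x and b = alpha * x, which simplifies to the form below.
    """
    return ((1 + alpha) * x + (1 - alpha) * x * math.erf(mu * (1 - alpha) * x)) / 2
```

For large positive inputs erf saturates at 1 and `smu(x)` approaches `x`; for large negative inputs it approaches `alpha * x`, matching the Leaky ReLU it smooths.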
SAU: Smooth activation function using convolution with approximate identities (constructs smooth approximations of activation functions by convolving them with Dirac-delta-like approximate identities).
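To make the approximate-identity idea concrete: convolving a non-smooth activation with a Gaussian (an approximate identity that tends to the Dirac delta as σ → 0) gives a smooth function, and for Leaky ReLU the convolution has a closed form. This is a sketch under that assumption; the `alpha` and `sigma` defaults are illustrative, not the paper's settings.

```python
import math

def sau(x: float, alpha: float = 0.15, sigma: float = 1.0) -> float:
    """Gaussian-smoothed Leaky ReLU (convolution with an approximate identity).

    Writes LeakyReLU_a(x) = a*x + (1 - a)*ReLU(x) and uses the closed form
    (ReLU * G_sigma)(x) = x * Phi(x / sigma) + sigma * phi(x / sigma),
    where Phi and phi are the standard normal CDF and PDF.
    """
    z = x / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return alpha * x + (1.0 - alpha) * (x * Phi + sigma * phi)
```

Away from the origin the smoothing is negligible (`sau(x)` tracks `x` for large positive inputs and `alpha * x` for large negative ones); near zero the Gaussian term rounds off the kink, which is what makes the result infinitely differentiable.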