SAU: Smooth activation function using convolution with approximate identities

SAU: constructs a smooth approximation of an activation function by convolving it with approximations of the Dirac delta function.
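
The sketch below illustrates the idea numerically: a base activation (Leaky ReLU here, as an assumption for illustration) is convolved with a narrow Gaussian, which plays the role of the approximate identity. It does not reproduce the paper's analytic closed form; the kernel width `sigma` and the integration grid are illustrative choices.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Base (non-smooth) activation to be smoothed."""
    return np.where(x >= 0, x, alpha * x)

def smoothed_activation(x, sigma=0.5, alpha=0.01, half_width=6.0, n=2001):
    """Convolve Leaky ReLU with a Gaussian approximate identity (Riemann sum).

    As sigma -> 0 the Gaussian tends to the Dirac delta and the result recovers
    Leaky ReLU itself; for sigma > 0 the kink at the origin is smoothed out.
    Illustrative sketch only; the SAU paper derives a closed-form expression.
    """
    t = np.linspace(-half_width * sigma, half_width * sigma, n)
    dt = t[1] - t[0]
    kernel = np.exp(-t**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    x = np.atleast_1d(np.asarray(x, dtype=float))
    # (f * phi)(x) = integral of f(x - t) * phi(t) dt, approximated on a grid
    return np.array([np.sum(leaky_relu(xi - t, alpha) * kernel) * dt for xi in x])

print(smoothed_activation([-2.0, 0.0, 2.0]))  # close to Leaky ReLU away from 0, smooth near 0
```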

Improving Deep Learning by Inverse Square Root Linear Units (ISRLUs)

ISRLU: improving deep learning with inverse square root linear units.
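
ISRLU is the identity for non-negative inputs and x / sqrt(1 + αx²) for negative inputs, giving an ELU-like shape without an exponential. A minimal NumPy sketch, with α treated as a fixed hyperparameter (the paper also considers learning it):

```python
import numpy as np

def isrlu(x, alpha=1.0):
    """Inverse Square Root Linear Unit.

    Identity for x >= 0; for x < 0 it saturates smoothly toward -1/sqrt(alpha),
    similar in shape to ELU but cheaper to evaluate (no exponential).
    """
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x, x / np.sqrt(1.0 + alpha * x * x))

print(isrlu(np.array([-3.0, -1.0, 0.0, 2.0])))
```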

ZerO Initialization: Initializing Neural Networks with only Zeros and Ones

ZerO initialization: initializing neural networks using only zeros and ones.
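
A heavily simplified sketch of the idea for a dense layer follows: square weight matrices start as the identity and rectangular ones as a partial identity, so every entry is either 0 or 1 and the layer initially acts like an (approximate) identity map. The actual ZerO scheme is more involved (for example, it handles dimension changes with Hadamard-based constructions), which this illustration omits.

```python
import numpy as np

def zero_style_init(fan_out, fan_in):
    """Hedged sketch of a zeros-and-ones (identity-like) initialization.

    Square layers start as the identity matrix; rectangular layers use a
    partial identity. Not the full ZerO construction from the paper.
    """
    w = np.zeros((fan_out, fan_in))
    for i in range(min(fan_out, fan_in)):
        w[i, i] = 1.0
    return w

print(zero_style_init(3, 3))   # identity
print(zero_style_init(2, 4))   # partial identity
```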

Dynamic ReLU

DY-ReLU: dynamic rectified linear units whose piecewise-linear coefficients are generated from the input.
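
A hedged NumPy sketch of the general shape of the method: a small hyper-network maps a per-sample context vector to K slopes and intercepts per channel, and the output is the maximum of the resulting linear pieces. The hyper-network layout, scaling constants, and variable names here are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def dyrelu_sketch(x, w1, w2, k=2):
    """Dynamic ReLU-style activation: y = max_k(a_k(x) * x + b_k(x)).

    x  : (batch, channels) input features (also used as the pooled context here)
    w1 : (channels, hidden), w2 : (hidden, 2*k*channels) hyper-network weights
         producing k slopes and k intercepts per channel, per sample.
    """
    batch, channels = x.shape
    h = np.maximum(x @ w1, 0.0)                       # hidden layer of the hyper-network
    coeffs = np.tanh(h @ w2).reshape(batch, channels, 2, k)
    a = 1.0 + coeffs[..., 0, :]                       # slopes, centered around 1
    b = 0.5 * coeffs[..., 1, :]                       # intercepts, centered around 0
    return np.max(a * x[..., None] + b, axis=-1)      # max over the k linear pieces

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w1 = rng.normal(scale=0.1, size=(8, 16))
w2 = rng.normal(scale=0.1, size=(16, 2 * 2 * 8))
print(dyrelu_sketch(x, w1, w2).shape)                 # (4, 8)
```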

Learning Activation Functions to Improve Deep Neural Networks

APL: adaptive piecewise linear units.
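
The APL unit adds S trainable hinge terms to a standard ReLU, f(x) = max(0, x) + Σ_s a_s · max(0, −x + b_s). A minimal NumPy sketch, with the per-unit parameters passed as plain arrays for illustration:

```python
import numpy as np

def apl(x, a, b):
    """Adaptive Piecewise Linear unit: ReLU plus S learned hinge terms."""
    x = np.asarray(x, dtype=float)
    hinges = np.maximum(0.0, -x[..., None] + b)   # shape (..., S)
    return np.maximum(0.0, x) + np.sum(a * hinges, axis=-1)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(apl(x, a=np.array([0.2, -0.1]), b=np.array([0.0, 1.0])))
```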

Orthogonal-Padé Activation Functions: Trainable Activation functions for smooth and faster convergence in deep networks

OPAU: trainable activation functions based on orthogonal Padé approximation.
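
A hedged sketch of the general idea: a "safe" rational (Padé-style) function of orthogonal-polynomial features with trainable numerator and denominator coefficients. The Chebyshev basis and the safe-denominator form P(x) / (1 + |Q(x)|) used here are illustrative assumptions; the paper evaluates several orthogonal bases and its exact parameterization may differ.

```python
import numpy as np

def chebyshev_basis(x, degree):
    """Chebyshev polynomials T_0..T_degree via the recurrence T_{n+1} = 2x T_n - T_{n-1}."""
    basis = [np.ones_like(x), x]
    for _ in range(2, degree + 1):
        basis.append(2.0 * x * basis[-1] - basis[-2])
    return np.stack(basis[:degree + 1], axis=-1)

def opau_sketch(x, c_num, c_den):
    """Padé-style trainable activation on an orthogonal basis: P(x) / (1 + |Q(x)|)."""
    x = np.asarray(x, dtype=float)
    num = chebyshev_basis(x, len(c_num) - 1) @ c_num
    den = 1.0 + np.abs(chebyshev_basis(x, len(c_den) - 1) @ c_den)
    return num / den

x = np.linspace(-3, 3, 7)
print(opau_sketch(x, c_num=np.array([0.0, 1.0, 0.3, 0.05]), c_den=np.array([0.0, 0.2, 0.1])))
```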