SMU: smooth activation function for deep networks using smoothing maximum technique

SMU: a smooth activation function based on the smoothing maximum technique.
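SMU builds a smooth stand-in for the maximum in Leaky ReLU, max(x, αx) = ((1+α)x + |(1−α)x|)/2, by replacing |z| with the smooth approximation z·erf(μz). A minimal sketch of this form (α and μ are fixed here for illustration; in the paper they are trainable parameters):

```python
import math

def smu(x, alpha=0.25, mu=1e6):
    # Leaky ReLU can be written as max(x, a*x) = ((1+a)*x + |(1-a)*x|) / 2.
    # Substituting the smooth surrogate z*erf(mu*z) for |z| yields SMU;
    # larger mu gives a tighter approximation to the original maximum.
    z = (1.0 - alpha) * x
    return ((1.0 + alpha) * x + z * math.erf(mu * z)) / 2.0
```

With a large μ this behaves like Leaky ReLU away from the origin while remaining differentiable everywhere, including at 0.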

Smoothing of Functions

Smooth functions and the smoothing technique.
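The core smoothing trick: a non-smooth function such as max(a, b) = (a + b + |a − b|)/2 becomes smooth once |·| is replaced by a smooth surrogate, e.g. |z| ≈ sqrt(z² + μ²). A minimal sketch (the value of μ here is just for illustration):

```python
import math

def smooth_max(a, b, mu=0.5):
    # max(a, b) = (a + b + |a - b|) / 2; substituting the smooth
    # surrogate sqrt(z^2 + mu^2) for |z| makes it differentiable.
    # As mu -> 0 this converges to the exact maximum (from above).
    return (a + b + math.sqrt((a - b) ** 2 + mu ** 2)) / 2.0
```

The surrogate always overestimates the true maximum by at most μ/2, so the approximation error is directly controlled by the smoothing parameter.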

MicroNet: Towards Image Recognition with Extremely Low FLOPs

MicroNet: image recognition with extremely low FLOPs.

GhostNet: More Features from Cheap Operations

GhostNet: generating more features from cheap operations.
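The Ghost module produces a few intrinsic feature maps with an ordinary convolution and then derives extra "ghost" maps from them with cheap per-channel operations. A toy NumPy sketch, with simplifying assumptions: a 1×1 channel mixing stands in for the primary convolution, and per-channel scaling stands in for the cheap operation (GhostNet itself uses 3×3 depthwise convolutions):

```python
import numpy as np

def ghost_module(x, primary_weights, cheap_scales):
    # x: input features of shape (C_in, H, W).
    # primary_weights: (m, C_in) channel-mixing matrix producing m intrinsic maps.
    intrinsic = np.tensordot(primary_weights, x, axes=([1], [0]))
    # Cheap op: one scalar per intrinsic map (toy stand-in for depthwise conv).
    ghosts = intrinsic * cheap_scales[:, None, None]
    # Output 2*m maps while paying full convolution cost for only m of them.
    return np.concatenate([intrinsic, ghosts], axis=0)
```

The point of the design is the cost split: the expensive operation runs on only half the output channels, and the rest are generated almost for free.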

SAU: Smooth activation function using convolution with approximate identities

SAU: constructing a smooth approximation of the activation function using the Dirac delta function.
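SAU smooths a non-differentiable activation by convolving it with an approximate identity: a narrow Gaussian that tends to the Dirac delta as σ → 0. A numerical sketch using Leaky ReLU (α and σ are chosen here for illustration; the paper derives a closed form rather than integrating numerically):

```python
import numpy as np

def sau_numeric(x, alpha=0.15, sigma=0.5):
    # Approximate identity: Gaussian of width sigma (-> Dirac delta as sigma -> 0).
    t = np.linspace(-6.0 * sigma, 6.0 * sigma, 2001)
    g = np.exp(-t**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))
    # Convolve Leaky ReLU with the Gaussian via a simple Riemann sum.
    lrelu = np.where(x - t >= 0.0, x - t, alpha * (x - t))
    return float(np.sum(lrelu * g) * (t[1] - t[0]))
```

Far from the origin the convolution leaves Leaky ReLU essentially unchanged; near 0 the kink is averaged out, yielding a smooth curve.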

Improving Deep Learning by Inverse Square Root Linear Units (ISRLUs)

Improving deep learning with inverse square root linear units (ISRLU).
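ISRLU keeps the identity for positive inputs and applies the inverse square root x/√(1 + αx²) to negative ones, giving an ELU-like saturating curve without an exponential. A minimal sketch:

```python
import math

def isrlu(x, alpha=1.0):
    # Identity for x >= 0; smooth saturation toward -1/sqrt(alpha) for x < 0.
    if x >= 0.0:
        return x
    return x / math.sqrt(1.0 + alpha * x * x)
```

The two branches meet smoothly at 0 (both value and first derivative agree), and the negative branch is bounded below by −1/√α.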