The Quest for the Golden Activation Function

ELiSH: searching for optimal activation functions with a genetic algorithm.
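As a quick reference, a minimal NumPy sketch of the ELiSH form reported in the paper (x·sigmoid(x) on the positive side, (eˣ − 1)·sigmoid(x) on the negative side); the function name and test values are illustrative.

```python
import numpy as np

def elish(x):
    # ELiSH: x * sigmoid(x) for x >= 0, (exp(x) - 1) * sigmoid(x) for x < 0
    sig = 1.0 / (1.0 + np.exp(-x))
    return np.where(x >= 0, x * sig, (np.exp(x) - 1.0) * sig)

print(elish(np.linspace(-3, 3, 7)))
```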

Self-Normalizing Neural Networks

SELU: a self-normalizing exponential linear unit.
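For reference, a NumPy sketch of SELU with the α and λ constants from the paper (values rounded from the published fixed point); the derivation of these constants is reproduced in the next note.

```python
import numpy as np

ALPHA = 1.6732632423543772    # fixed-point constants from the SELU paper
LAMBDA = 1.0507009873554805

def selu(x):
    # scaled ELU: pushes activations toward zero mean and unit variance
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

print(selu(np.array([-1.0, 0.0, 1.0])))
```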

Solving the SELU fixed-point equations with sympy.solve.
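A sketch of how the SELU constants can be reproduced with sympy.solve, assuming standard-normal pre-activations and requiring zero mean and unit second moment of the SELU output (the split at x = 0 mirrors the SELU definition; variable names are illustrative):

```python
import sympy as sp

x = sp.symbols('x', real=True)
alpha, lam = sp.symbols('alpha lambda', positive=True)
phi = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)          # standard normal pdf

# Pieces of E[selu(x)] and E[selu(x)^2] for x ~ N(0, 1), split at x = 0
m_pos = sp.integrate(x * phi, (x, 0, sp.oo))                     # positive branch, 1st moment
m_neg = sp.integrate((sp.exp(x) - 1) * phi, (x, -sp.oo, 0))      # negative branch, 1st moment
s_pos = sp.integrate(x**2 * phi, (x, 0, sp.oo))                  # positive branch, 2nd moment
s_neg = sp.integrate(sp.expand((sp.exp(x) - 1)**2) * phi, (x, -sp.oo, 0))

mean = lam * (m_pos + alpha * m_neg)            # E[selu(x)]
second = lam**2 * (s_pos + alpha**2 * s_neg)    # E[selu(x)^2]

# Zero mean fixes alpha; unit second moment (hence unit variance) then fixes lambda
alpha_sol = sp.solve(sp.Eq(mean, 0), alpha)[0]
lam_sol = [s for s in sp.solve(sp.Eq(second.subs(alpha, alpha_sol), 1), lam)
           if s.evalf() > 0][0]

print(alpha_sol.evalf())   # ~1.6733
print(lam_sol.evalf())     # ~1.0507
```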

Empirical Evaluation of Rectified Activations in Convolutional Network

RReLU: an empirical evaluation of rectified activations, including the randomized leaky ReLU.
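A NumPy sketch of the randomized leaky ReLU, assuming the commonly used slope range [1/8, 1/3] and the framework-style parameterization that samples the slope directly: random per element during training, fixed to the midpoint at test time (names are illustrative).

```python
import numpy as np

def rrelu(x, lower=1/8, upper=1/3, training=True, rng=np.random.default_rng()):
    # negative slope: random while training, deterministic average at test time
    if training:
        slope = rng.uniform(lower, upper, size=np.shape(x))
    else:
        slope = (lower + upper) / 2
    return np.where(x >= 0, x, slope * x)

print(rrelu(np.array([-2.0, -0.5, 1.0])))
print(rrelu(np.array([-2.0, -0.5, 1.0]), training=False))
```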

Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification

PReLU: surpassing human-level performance on image classification.
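The key difference from a fixed leaky slope is that PReLU learns the negative slope by backpropagation; a minimal NumPy sketch of the forward pass and the slope gradient (names are illustrative, and the paper initializes the slope at 0.25):

```python
import numpy as np

def prelu(x, a):
    # PReLU forward: identity for x >= 0, learned slope a for x < 0
    return np.where(x >= 0, x, a * x)

def prelu_grad_slope(x, grad_out):
    # d(prelu)/da = x on the negative side, 0 elsewhere; summed over the batch
    return np.sum(grad_out * np.where(x < 0, x, 0.0))

x = np.array([-2.0, -0.5, 1.0])
print(prelu(x, a=0.25))
print(prelu_grad_slope(x, grad_out=np.ones_like(x)))
```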

Rectifier Nonlinearities Improve Neural Network Acoustic Models

LeakyReLU: improving neural network acoustic models with rectifier nonlinearities.
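For completeness, the fixed-slope baseline that the two entries above generalize; a minimal NumPy sketch, assuming the small constant slope (e.g. 0.01) used in the paper.

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # small fixed negative slope keeps a nonzero gradient for x < 0
    return np.where(x >= 0, x, negative_slope * x)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))
```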