Zheng Zhijie's Personal Blog
Searching for Activation Functions
Swish: a self-gated activation function discovered by automated search.
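Swish is defined as f(x) = x · σ(βx); a minimal NumPy sketch with β fixed to 1 (its default, in which case Swish coincides with SiLU):

```python
import numpy as np

def swish(x, beta=1.0):
    """Swish activation: x * sigmoid(beta * x)."""
    return x * (1.0 / (1.0 + np.exp(-beta * x)))
```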
The Quest for the Golden Activation Function
ELiSH: searching for an optimal activation function with a genetic algorithm.
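ELiSH combines ELU and sigmoid: x · σ(x) for x ≥ 0 and (eˣ − 1) · σ(x) for x < 0. A minimal sketch of that definition:

```python
import numpy as np

def elish(x):
    """ELiSH: x * sigmoid(x) for x >= 0, (exp(x) - 1) * sigmoid(x) for x < 0."""
    sig = 1.0 / (1.0 + np.exp(-x))
    return np.where(x >= 0, x * sig, (np.exp(x) - 1.0) * sig)
```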
Self-Normalizing Neural Networks
SELU: a scaled exponential linear unit that makes networks self-normalizing.
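SELU is λ · (x if x > 0 else α(eˣ − 1)), where λ and α are fixed constants derived in the paper so that activations converge toward zero mean and unit variance. A minimal NumPy sketch:

```python
import numpy as np

# Constants from the paper, derived so that activations self-normalize.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """SELU: lambda * (x if x > 0 else alpha * (exp(x) - 1))."""
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))
```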
Solving Equations with sympy.solve
sympy.solve: solving equations symbolically.
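A minimal usage example of sympy.solve (the quadratic here is just an illustration):

```python
from sympy import symbols, solve, Eq

x = symbols('x')

# solve() accepts an expression assumed equal to zero ...
roots = solve(x**2 - 4, x)        # [-2, 2]

# ... or an explicit Eq object.
roots_eq = solve(Eq(x**2, 4), x)  # [-2, 2]

print(roots, roots_eq)
```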
Empirical Evaluation of Rectified Activations in Convolutional Network
RReLU: an empirical evaluation of rectified activation functions (randomized leaky ReLU).
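In RReLU the negative slope is sampled uniformly during training and fixed to the mean of the sampling range at test time. A minimal sketch, assuming the [1/8, 1/3] range used in the paper's experiments:

```python
import numpy as np

def rrelu(x, lower=1/8, upper=1/3, training=True):
    """Randomized leaky ReLU: the negative slope `a` is drawn from
    U(lower, upper) in training and fixed to their mean at test time."""
    if training:
        a = np.random.uniform(lower, upper, size=x.shape)
    else:
        a = (lower + upper) / 2.0
    return np.where(x >= 0, x, a * x)
```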
Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification
PReLU: surpassing human-level performance on ImageNet classification.
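PReLU generalizes leaky ReLU by making the negative slope a learnable parameter (one per channel in the paper). A minimal sketch of the forward pass, with `a` passed in as the learned coefficient:

```python
import numpy as np

def prelu(x, a):
    """PReLU: x for x >= 0, a * x otherwise; `a` is learned by backprop."""
    return np.where(x >= 0, x, a * x)
```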