Activation Function
The non-linear functions used in neural networks. An activation function computes the output of a neuron. Its argument is the input vector multiplied by the neuron's weights, plus the neuron's own bias.
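As a minimal sketch of the sentence above, a single neuron's output could be computed like this (the function and variable names here are illustrative, not from the source):

```python
import numpy as np

def neuron_output(x, w, b, activation):
    # weighted input plus bias, passed through the activation function
    return activation(np.dot(w, x) + b)

# example: a two-input neuron with a ReLU activation
x = np.array([1.0, 2.0])   # input vector
w = np.array([0.5, -0.5])  # the neuron's weights
b = 0.5                    # the neuron's bias
out = neuron_output(x, w, b, lambda z: np.maximum(0.0, z))
```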
Sigmoid Functions #
Or other non-linear functions #
ReLU #
- max(0, x)
Leaky ReLU: max(0.1x, x) #
Published in:
@inproceedings{maas2013rectifier,
title={Rectifier nonlinearities improve neural network acoustic models},
author={Maas, Andrew L and Hannun, Awni Y and Ng, Andrew Y and others},
booktitle={Proc. icml},
volume={30},
number={1},
pages={3},
year={2013},
organization={Atlanta, Georgia, USA}
}
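The three activations listed above can be sketched in NumPy as follows (a minimal illustration; the `alpha=0.1` slope matches the Leaky ReLU definition given above):

```python
import numpy as np

def sigmoid(z):
    # squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # max(0, x), applied elementwise
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.1):
    # max(alpha*x, x): small non-zero slope for negative inputs
    return np.maximum(alpha * z, z)
```

Unlike ReLU, the leaky variant keeps a small gradient for negative inputs, which is the motivation given in the cited paper.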