Scikit-learn lists these as the implemented activation functions for its multi-layer perceptron classifier:
- ‘identity’, no-op activation, useful to implement linear bottleneck, returns f(x) = x
- ‘logistic’, the logistic sigmoid function, returns f(x) = 1 / (1 + exp(-x))
- ‘tanh’, the hyperbolic tan function, returns f(x) = tanh(x)
- ‘relu’, the rectified linear unit function, returns f(x) = max(0, x)

Does anyone know if it is possible to implement a custom activation function? If not, can someone point me to a library where this is possible?
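For context, here is a minimal sketch of how the built-in activations above are selected, via the `activation` constructor parameter of `MLPClassifier` (the XOR toy data, hidden layer size, and iteration count below are arbitrary choices for illustration):

```python
from sklearn.neural_network import MLPClassifier

# Tiny XOR-style toy dataset, purely illustrative
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# The activation is chosen by name from the fixed set listed above;
# there is no constructor parameter that accepts a callable.
clf = MLPClassifier(activation='tanh', hidden_layer_sizes=(8,),
                    max_iter=2000, random_state=0)
clf.fit(X, y)
pred = clf.predict(X)
print(pred)
```

Note that `activation` only accepts one of the four string names, which is what makes a custom activation non-obvious to plug in.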