Theano HiddenLayer Activation Function

A.M. · Oct 22, 2014 · Viewed 8.9k times

Is there any way to use a Rectified Linear Unit (ReLU) as the activation function of the hidden layer, instead of tanh() or sigmoid(), in Theano? The implementation of the hidden layer is as follows, and as far as I have searched on the internet, ReLU is not implemented inside Theano.

class HiddenLayer(object):
  def __init__(self, rng, input, n_in, n_out, W=None, b=None, activation=T.tanh):
    pass
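
For reference, the body simply calls activation on the affine output of the layer; a simplified, tutorial-style sketch of the full class (the weight initialization here is a rough placeholder, not my exact code):

import numpy
import theano
import theano.tensor as T

class HiddenLayer(object):
    def __init__(self, rng, input, n_in, n_out, W=None, b=None, activation=T.tanh):
        if W is None:
            bound = numpy.sqrt(6. / (n_in + n_out))
            W_values = numpy.asarray(
                rng.uniform(low=-bound, high=bound, size=(n_in, n_out)),
                dtype=theano.config.floatX)
            W = theano.shared(value=W_values, name='W', borrow=True)
        if b is None:
            b = theano.shared(value=numpy.zeros((n_out,), dtype=theano.config.floatX),
                              name='b', borrow=True)
        self.W = W
        self.b = b
        lin_output = T.dot(input, self.W) + self.b
        # the activation argument is simply called on the affine output,
        # so any callable could be plugged in here
        self.output = lin_output if activation is None else activation(lin_output)
        self.params = [self.W, self.b]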

Answer

nouiz · Oct 22, 2014

relu is easy to do in Theano:

switch(x<0, 0, x)

To use it in your case, write a Python function that implements relu and pass it as the activation:

import theano.tensor

def relu(x):
    return theano.tensor.switch(x < 0, 0, x)

HiddenLayer(..., activation=relu)
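
As a quick standalone check (a minimal sketch, assuming only NumPy and Theano):

import numpy
import theano
import theano.tensor

def relu(x):
    return theano.tensor.switch(x < 0, 0, x)

x = theano.tensor.matrix('x')
f = theano.function([x], relu(x))
print(f(numpy.array([[-1.0, 0.0], [2.5, -3.0]], dtype=theano.config.floatX)))
# negative entries come out as 0, non-negative entries pass through unchanged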

Some people use this implementation: x * (x > 0)
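Wrapped as a function, it can be passed the same way (relu_mul is just an illustrative name; the forward results match, though the gradient at exactly x == 0 can differ between the two formulations):

def relu_mul(x):
    # zero out negative entries by multiplying with the (x > 0) mask
    return x * (x > 0)

HiddenLayer(..., activation=relu_mul)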

UPDATE: Newer Theano versions have theano.tensor.nnet.relu(x) available.
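
With such a version, the built-in can be passed directly (it also accepts an alpha argument for a leaky variant):

import theano.tensor as T

HiddenLayer(..., activation=T.nnet.relu)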