Does anyone know the default activation function used in the recurrent layers in Keras? https://keras.io/layers/recurrent/
The docs say the default activation function is linear, but what about the default recurrent activation function? Nothing is mentioned about that. Any help would be highly appreciated. Thanks in advance.
The Keras `Recurrent` class is an abstract base class for recurrent layers, so the defaults live on the concrete classes (`SimpleRNN`, `LSTM` and `GRU`). In Keras 2.0 all three default to `activation='tanh'`. The recurrent activation you are asking about only exists for `LSTM` and `GRU`: it is the `recurrent_activation` argument, and it defaults to `'hard_sigmoid'`. In previous (1.x) versions the same gate activation was called `inner_activation`, with the same `'hard_sigmoid'` default.
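If you want to verify this on your own install rather than trust the docs, you can construct each layer with its defaults and print the resulting activation functions. A minimal sketch, assuming Keras 2.x with a working backend installed (the layer size of 8 is an arbitrary example value):

```python
# Print the default activations of the Keras RNN layers.
from keras.layers import SimpleRNN, LSTM, GRU

for cls in (SimpleRNN, LSTM, GRU):
    layer = cls(units=8)  # constructed with all defaults
    print(cls.__name__, "activation:", layer.activation.__name__)
    # Only LSTM and GRU have a separate gate (recurrent) activation;
    # SimpleRNN applies a single activation to its full update.
    if hasattr(layer, "recurrent_activation"):
        print(cls.__name__, "recurrent_activation:",
              layer.recurrent_activation.__name__)
```

On Keras 2.0 this prints tanh for all three layers and hard_sigmoid for the LSTM/GRU recurrent activation (later tf.keras releases changed that default to plain sigmoid).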