Here is my code, which works if I use other activation layers such as tanh:
import keras.layers.advanced_activations  # needed for the fully qualified PReLU reference below
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import SGD

model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(Activation(act))
model.add(Dropout(0.15))
model.add(Dense(64, init='uniform'))
model.add(Activation('softplus'))
model.add(Dropout(0.15))
model.add(Dense(2, init='uniform'))
model.add(Activation('softmax'))
sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)
model.fit(X_train, y_train, nb_epoch=20, batch_size=16, show_accuracy=True, validation_split=0.2, verbose=2)
In this case, it doesn't work: it raises "TypeError: 'PReLU' object is not callable", and the error is raised at the model.compile line. Why is this the case? All of the non-advanced activation functions work, but none of the advanced activation functions, including this one, do.
The correct way to use an advanced activation like PReLU is to add it to the model as a layer with the add() method, not to wrap it in the Activation class. Activation expects an activation function (or the string name of one), whereas PReLU is a layer with its own trainable parameters, which is why the PReLU instance is rejected as "not callable". Example:
model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(act)
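
For completeness, here is a sketch of the full corrected model. It keeps the architecture and hyperparameters from the question and assumes the same old-style Keras API used there (init=, nb_epoch=, and show_accuracy= were renamed or removed in later Keras versions):

from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.layers.advanced_activations import PReLU
from keras.optimizers import SGD

model = Sequential()
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(PReLU(init='zero', weights=None))  # advanced activation added directly as a layer
model.add(Dropout(0.15))
model.add(Dense(64, init='uniform'))
model.add(Activation('softplus'))  # plain activation functions still go through Activation
model.add(Dropout(0.15))
model.add(Dense(2, init='uniform'))
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)
model.fit(X_train, y_train, nb_epoch=20, batch_size=16,
          show_accuracy=True, validation_split=0.2, verbose=2)

The same pattern applies to the other advanced activations (e.g. LeakyReLU): add them with add() rather than wrapping them in Activation.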