Top "Activation-function" questions

An activation function is a non-linear transformation, typically applied in a neural network to the output of a linear or convolutional layer.
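
A minimal NumPy sketch of where such a transformation sits, assuming a hypothetical dense layer with weights W and bias b:

```python
import numpy as np

x = np.random.randn(3)                     # input vector
W, b = np.random.randn(4, 3), np.zeros(4)  # hypothetical layer parameters
z = W @ x + b                              # linear layer output (pre-activation)
a = np.maximum(z, 0.0)                     # ReLU, a common non-linear activation
```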

What is the intuition behind using tanh in LSTM?

In an LSTM network (Understanding LSTMs), why do the input gate and output gate use tanh? What is the intuition behind this? It …

machine-learning deep-learning lstm recurrent-neural-network activation-function
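
A toy single-step sketch of the usual intuition, using scalar gates and hypothetical pre-activation values: the sigmoid gates scale contributions into (0, 1), while tanh keeps the candidate values and the exposed state bounded in (-1, 1):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical pre-activations for one time step.
z_i, z_f, z_o, z_g = 0.5, 1.2, -0.3, 2.0
c_prev = 0.8

i = sigmoid(z_i)        # input gate: how much new candidate to admit, in (0, 1)
f = sigmoid(z_f)        # forget gate: how much old state to keep, in (0, 1)
o = sigmoid(z_o)        # output gate: how much state to expose, in (0, 1)
g = np.tanh(z_g)        # candidate values: tanh bounds them in (-1, 1)

c = f * c_prev + i * g  # cell state stays bounded because g is bounded
h = o * np.tanh(c)      # tanh re-centres the state around 0 before output
```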
How to make a custom activation function with only Python in TensorFlow?

Suppose you need to make an activation function that is not possible using only pre-defined TensorFlow building blocks; what can you …

python tensorflow neural-network deep-learning activation-function
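
In TF 2.x, one common route is tf.custom_gradient, which lets you write both the forward pass and its derivative in plain Python over TF ops. A sketch with a hypothetical "spiky" activation:

```python
import tensorflow as tf

@tf.custom_gradient
def spiky(x):  # hypothetical activation, written in plain Python over TF ops
    y = x - tf.floor(x)  # forward pass
    def grad(dy):
        # hand-written derivative: d/dx (x - floor(x)) = 1 almost everywhere
        return dy * tf.ones_like(x)
    return y, grad

y = spiky(tf.constant([0.3, 1.7, 2.5]))
```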
Activation function for the output layer of regression models in neural networks

I have been experimenting with neural networks these days. I have come across a general question regarding the activation function …

neural-network regression activation-function
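
The standard answer is a linear (identity) output so predictions are unbounded; a minimal tf.keras sketch, assuming a 10-feature input:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1, activation="linear"),  # identity output for regression
])
model.compile(optimizer="adam", loss="mse")
```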
Tensorflow error: Using a `tf.Tensor` as a Python `bool` is not allowed

I am struggling to implement an activation function in TensorFlow in Python. The code is the following: def myfunc(x): …

tensorflow machine-learning deep-learning keras activation-function
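
This error typically comes from a Python `if` on a symbolic tensor. The question's myfunc is truncated, so the body below is an assumed leaky-ReLU-style branch used only to show the generic fix:

```python
import tensorflow as tf

def myfunc(x):
    # `if x > 0:` on a symbolic tensor raises the "tf.Tensor as a Python bool"
    # error; element-wise branching must go through TF ops such as tf.where.
    return tf.where(x > 0, x, 0.1 * x)

y = myfunc(tf.constant([-1.0, 2.0]))
```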
Why use softmax only in the output layer and not in hidden layers?

Most examples of neural networks for classification tasks I've seen use a softmax layer as the output activation function. Normally, …

machine-learning neural-network classification softmax activation-function
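
A short numeric check of why softmax fits the output layer: it turns arbitrary logits into a probability distribution over classes:

```python
import numpy as np

def softmax(z):
    z = z - np.max(z)  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
print(p, p.sum())      # non-negative values that sum to 1
```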
How to implement the derivative of Leaky ReLU in Python?

How would I implement the derivative of Leaky ReLU in Python without using TensorFlow? Is there a better way than …

python neural-network activation-function
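
A minimal NumPy sketch of the function and its derivative, assuming the common slope parameter alpha:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # derivative: 1 where x > 0, alpha elsewhere (the value at x == 0 is a convention)
    return np.where(x > 0, 1.0, alpha)
```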
Keras: how to use max_value in the ReLU activation function

The ReLU function as defined in keras/activation.py is: def relu(x, alpha=0., max_value=None): return K.relu(x, …

tensorflow keras activation-function
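
In modern tf.keras (assuming TF 2.x rather than the older keras backend shown in the question), max_value can be set either on the layer or via the activations module:

```python
import tensorflow as tf

# Layer form: a ReLU capped at 6 ("ReLU6"-style).
capped = tf.keras.layers.ReLU(max_value=6.0)

# Functional form via the activations module.
y = tf.keras.activations.relu(tf.constant([-1.0, 3.0, 10.0]), max_value=6.0)
```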
How do I implement leaky ReLU using NumPy functions?

I am trying to implement leaky ReLU; the problem is that I have to use four for loops for a 4-dimensional array …

numpy activation-function relu
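
The loops can be avoided entirely, since NumPy operations broadcast over arrays of any rank; a minimal vectorized sketch:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # np.where works element-wise on any shape, so no explicit loops
    # are needed even for a 4-dimensional array
    return np.where(x > 0, x, alpha * x)

y = leaky_relu(np.random.randn(2, 3, 4, 5))  # 4-D input, no for loops
```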