Top "Relu" questions

ReLU is an abbreviation for Rectified Linear Unit, an activation function used in neural networks.
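
Not part of the tag description itself, but as a quick illustration: ReLU is commonly defined as f(x) = max(0, x). A minimal NumPy sketch (the function name `relu` is illustrative):

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x)
    return np.maximum(0.0, x)

print(relu(np.array([-1.0, 0.0, 2.0])))  # [0. 0. 2.]
```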

ReLU derivative in backpropagation

I am about to implement backpropagation for a neural network that uses ReLU. In a previous project of mine, I did …

neural-network backpropagation sigmoid relu
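
For the question above on the ReLU derivative in backpropagation, here is a minimal NumPy sketch of the usual convention: the derivative is 1 for positive inputs and 0 otherwise (the value at exactly 0 is undefined mathematically, and taking it as 0 is a common choice). Names such as `relu_grad` and `upstream_grad` are illustrative, not from the question.

```python
import numpy as np

def relu(x):
    # Forward pass: element-wise max(0, x)
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 where x > 0, 0 otherwise.
    # The value at x == 0 is a convention; 0 is used here.
    return (x > 0).astype(x.dtype)

# Backprop through a ReLU layer: multiply the gradient arriving from the
# next layer by the local derivative evaluated at the pre-activation input.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
upstream_grad = np.ones_like(x)                 # gradient from the next layer
downstream_grad = upstream_grad * relu_grad(x)  # gradient passed to the previous layer
print(downstream_grad)                          # [0. 0. 0. 1. 1.]
```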