Top "Backpropagation" questions

Backpropagation is a method of gradient computation, often used in artificial neural networks to perform gradient descent.

ReLU derivative in backpropagation

I am about to implement backpropagation on a neural network that uses ReLU. In a previous project of mine, I did …

neural-network backpropagation sigmoid relu
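The usual answer to this question: ReLU's derivative is 1 for positive inputs and 0 otherwise, and during the backward pass it simply masks the upstream gradient. A minimal NumPy sketch (not from the question itself; the convention of using 0 at exactly x = 0 is an assumption, though a common one):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), applied element-wise.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 where x > 0, else 0.
    # (The derivative at exactly 0 is undefined; 0 is a common convention.)
    return (x > 0).astype(x.dtype)

# During backprop, the upstream gradient is masked by relu_grad:
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
upstream = np.ones_like(x)
downstream = upstream * relu_grad(x)
```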
How does keras handle multiple losses?

If I have something like: model = Model(inputs = input, outputs = [y1,y2]) l1 = 0.5 l2 = 0.3 model.compile(loss = [loss1,loss2], loss_…

deep-learning keras backpropagation loss-function
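For context on this question: when a Keras model has multiple outputs, the total training loss is the weighted sum of the per-output losses, with the weights taken from `loss_weights` (defaulting to 1.0 each). A plain-Python sketch of that combination rule (the loss values here are hypothetical, not from the question):

```python
def combined_loss(losses, loss_weights):
    # Keras-style combination: the total loss is the weighted sum
    # of the per-output losses (weights default to 1.0 each).
    return sum(w * l for w, l in zip(loss_weights, losses))

# e.g. two outputs compiled with loss_weights=[0.5, 0.3]
loss_y1 = 0.8   # hypothetical per-output loss values
loss_y2 = 1.2
total = combined_loss([loss_y1, loss_y2], [0.5, 0.3])
```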
What does the parameter retain_graph mean in the Variable's backward() method?

I'm going through the neural transfer pytorch tutorial and am confused about the use of retain_variable (deprecated, now referred …

neural-network conv-neural-network backpropagation pytorch automatic-differentiation
What is the difference between SGD and back-propagation?

Can you please tell me the difference between Stochastic Gradient Descent (SGD) and back-propagation?

machine-learning artificial-intelligence difference backpropagation gradient-descent
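The short answer to this question: backpropagation computes the gradient of the loss (via the chain rule), while SGD is the update rule that consumes that gradient. A minimal NumPy sketch of one step, using a linear model with squared-error loss (the model and numbers are illustrative assumptions, not from the question):

```python
import numpy as np

def sgd_step(w, x, y, lr=0.1):
    # Forward pass: prediction and squared-error loss for one sample.
    pred = w @ x
    # Backward pass (backpropagation): gradient of loss wrt w via the chain rule.
    # loss = (pred - y)^2  =>  dloss/dw = 2 * (pred - y) * x
    grad = 2.0 * (pred - y) * x
    # SGD update: take a step against the gradient.
    return w - lr * grad

w = np.array([0.0, 0.0])
x = np.array([1.0, 2.0])
y = 3.0
w_new = sgd_step(w, x, y)
```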
How does a back-propagation training algorithm work?

I've been trying to learn how back-propagation works with neural networks, but have yet to find a good explanation from a …

artificial-intelligence computer-science neural-network backpropagation
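The core of most answers to this question: run a forward pass, then apply the chain rule layer by layer from the output back to the inputs. A small NumPy sketch for a two-layer network with a sigmoid hidden layer and squared-error loss (the architecture and weights are illustrative assumptions, not taken from the question):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_backward(x, y, W1, W2):
    # Forward pass: two layers, sigmoid hidden activation, squared-error loss.
    h = sigmoid(W1 @ x)
    pred = W2 @ h
    loss = 0.5 * (pred - y) ** 2
    # Backward pass: chain rule, output to input.
    d_pred = pred - y               # dL/dpred
    dW2 = d_pred * h                # dL/dW2
    d_h = d_pred * W2               # dL/dh
    d_z = d_h * h * (1 - h)         # sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))
    dW1 = np.outer(d_z, x)          # dL/dW1
    return loss, dW1, dW2

x = np.array([1.0, -1.0])
y = 0.5
W1 = np.array([[0.1, 0.2], [0.3, -0.1]])
W2 = np.array([0.2, -0.3])
loss, dW1, dW2 = forward_backward(x, y, W1, W2)
```

A quick finite-difference check against the analytic gradient is a good way to validate an implementation like this.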
How to implement the Softmax derivative independently from any loss function?

For a neural networks library I implemented some activation functions and loss functions and their derivatives. They can be combined …

neural-network regression backpropagation derivative softmax
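For reference on this question: taken independently of any loss, the softmax derivative is a full Jacobian, d softmax_i / d z_j = s_i (δ_ij − s_j), i.e. diag(s) − s sᵀ. A NumPy sketch (not from the question; it can be validated with a finite-difference check):

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability; the result is unchanged.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def softmax_jacobian(z):
    # d softmax_i / d z_j = s_i * (delta_ij - s_j), i.e. diag(s) - s s^T.
    s = softmax(z)
    return np.diag(s) - np.outer(s, s)
```

When softmax is combined with cross-entropy, this Jacobian collapses to the familiar `s - y`, which is why many implementations never materialize it.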
Why do sigmoid functions work in Neural Nets?

I have just started programming for neural networks. I am currently working on understanding how a Backpropagation (BP) neural net …

neural-network backpropagation
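Part of the usual answer here: sigmoid is smooth, bounded in (0, 1), and its derivative can be computed from the forward output alone, which makes the backward pass cheap. A short NumPy sketch of that property (illustrative, not from the question):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad_from_output(s):
    # One reason sigmoid is convenient in backprop: its derivative
    # is expressible from the forward output alone: s * (1 - s).
    return s * (1.0 - s)
```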
How to build a multiple-input graph with TensorFlow?

Is it possible to define a TensorFlow graph with more than one input? For instance, I want to give the …

neural-network tensorflow backpropagation
Understanding Neural Network Backpropagation

Update: a better formulation of the issue. I'm trying to understand the backpropagation algorithm with an XOR neural network as …

computer-science machine-learning neural-network backpropagation
Extremely small or NaN values appear in training neural network

I'm trying to implement a neural network architecture in Haskell, and use it on MNIST. I'm using the hmatrix package …

algorithm haskell neural-network backpropagation
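A common culprit behind NaNs like these (in any language, not just Haskell) is floating-point overflow in `exp` or taking `log(0)`. A NumPy sketch of the standard fixes, max-shifted softmax and clamped log (illustrative; the question's Haskell/hmatrix code is not shown):

```python
import numpy as np

def softmax_naive(z):
    e = np.exp(z)          # overflows to inf for large z -> nan after division
    return e / e.sum()

def softmax_stable(z):
    e = np.exp(z - np.max(z))   # shifting by the max avoids overflow
    return e / e.sum()

def safe_log(p, eps=1e-12):
    # Clamping avoids log(0) -> -inf in a cross-entropy loss.
    return np.log(np.clip(p, eps, None))
```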