What are forward and backward passes in neural networks?

Shridhar R Kulkarni · Apr 20, 2016 · Viewed 27.3k times · Source

What is the meaning of forward pass and backward pass in neural networks?

Everybody mentions these expressions when talking about backpropagation and epochs.

I understood that forward pass and backward pass together form an epoch.

Answer

Kornel Dylski · Jan 18, 2018

The "forward pass" refers to the calculation process: computing the values of the output layer from the input data. It traverses all the neurons from the first layer to the last.

A loss is then calculated from the output values.

The "backward pass" then refers to the process of computing the changes to the weights (the actual learning), using the gradient descent algorithm (or similar). Computation proceeds from the last layer backward to the first layer.
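The two passes can be sketched for a tiny one-hidden-layer network. This is a minimal illustration, not code from the answer; all names (`W1`, `W2`, `lr`, the sigmoid activation, the squared-error loss, and the toy data) are assumptions chosen for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, W2):
    # Forward pass: traverse from the first layer to the last.
    h = sigmoid(X @ W1)    # hidden-layer activations
    out = sigmoid(h @ W2)  # output-layer values
    return h, out

# Toy data (illustrative): 4 samples, 2 features, 1 target each (logical OR).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [1.]])

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))  # input -> hidden weights
W2 = rng.normal(size=(4, 1))  # hidden -> output weights
lr = 0.5                      # learning rate

loss0 = np.mean((forward(X, W1, W2)[1] - y) ** 2)  # loss before training

for step in range(1000):
    # --- forward pass ---
    h, out = forward(X, W1, W2)

    # --- loss from the output values ---
    loss = np.mean((out - y) ** 2)

    # --- backward pass: from the last layer back to the first ---
    d_out = 2 * (out - y) / len(X) * out * (1 - out)  # gradient at output pre-activation
    d_W2 = h.T @ d_out                                # gradient for W2
    d_h = d_out @ W2.T * h * (1 - h)                  # propagate back to hidden layer
    d_W1 = X.T @ d_h                                  # gradient for W1

    # gradient descent update (the weight changes computed by the backward pass)
    W2 -= lr * d_W2
    W1 -= lr * d_W1
```

After the loop, the loss is lower than `loss0` — each iteration's backward pass moved the weights a little way down the loss surface.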

Together, one forward pass and one backward pass make one "iteration".

During one iteration, you usually pass a subset of the data set, which is called a "mini-batch" (if you pass all the data at once, it is called a "batch").

"Epoch" means passing the entire data set once. One epoch contains number_of_items / batch_size iterations.
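The arithmetic can be made concrete with illustrative numbers (not from the answer): a data set of 1000 items trained with mini-batches of 50.

```python
# Hypothetical sizes, chosen only for the example.
number_of_items = 1000
batch_size = 50

# One epoch = one full pass over the data set:
iterations_per_epoch = number_of_items // batch_size  # 20 iterations per epoch

# Training for 5 epochs passes the whole data set 5 times:
total_iterations = 5 * iterations_per_epoch           # 100 iterations
```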