How to interpret increase in both loss and accuracy

Nitin · Dec 1, 2016 · Viewed 19.1k times

I have run deep learning models (CNNs) using TensorFlow. Many times during an epoch, I have observed that loss and accuracy both increased, or both decreased. My understanding was that the two are always inversely related. In what scenario could both increase or decrease simultaneously?
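For reference, here is a minimal sketch of the kind of setup I mean (tf.keras, with MNIST standing in for my actual data), which records both curves per epoch:

```python
import tensorflow as tf

# Small self-contained example: MNIST stands in for the real dataset.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0  # add channel dim, scale to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# history.history holds the per-epoch curves: "loss", "accuracy",
# "val_loss", "val_accuracy" -- these are the values I am comparing.
history = model.fit(x_train, y_train, epochs=5, validation_split=0.1)
```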

Answer

nessuno · Dec 1, 2016

The loss decreases as the training process goes on, except for some fluctuation introduced by mini-batch gradient descent and/or regularization techniques such as dropout (which injects random noise).

If the loss decreases, the training process is going well.
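To see why that fluctuation doesn't matter, here is a small self-contained sketch (simulated losses, not real training) in which per-batch loss bounces around a decreasing trend, and a moving average makes the trend visible:

```python
import numpy as np

rng = np.random.default_rng(0)
steps = np.arange(2000)
# Simulated per-batch losses: a decreasing trend plus mini-batch/dropout noise.
batch_losses = 2.0 * np.exp(-steps / 500) + rng.normal(0.0, 0.15, steps.size)

def moving_average(x, window=50):
    """Smooth a noisy curve so the underlying trend becomes visible."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

smoothed = moving_average(batch_losses)
print(smoothed[0], smoothed[-1])  # clearly decreasing once noise is averaged out
```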

The accuracy (on the validation set, I suppose) is instead a measure of how good your model's predictions are.
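Accuracy is simply the fraction of correct predictions. A toy example with made-up logits and labels:

```python
import numpy as np

logits = np.array([[2.0, 0.1],   # predicted class 0 (correct)
                   [0.3, 1.5],   # predicted class 1 (correct)
                   [1.2, 0.8]])  # predicted class 0 (wrong, label is 1)
labels = np.array([0, 1, 1])
accuracy = float(np.mean(logits.argmax(axis=1) == labels))
print(accuracy)  # 0.666...: two of three predictions are correct
```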

If the model is learning, the accuracy increases. If the model is overfitting, instead, the accuracy stops increasing and can even start to decrease.

If the (training) loss decreases while the accuracy decreases, your model is overfitting.
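A common way to act on this signal (a sketch assuming the Keras API, reusing `model`, `x_train`, `y_train` from the question's example) is to stop training when validation accuracy stops improving:

```python
import tensorflow as tf

early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_accuracy",     # watch the validation accuracy curve
    patience=3,                 # tolerate 3 epochs without improvement
    restore_best_weights=True,  # roll back to the best epoch seen
)
# model.fit(x_train, y_train, epochs=50, validation_split=0.1,
#           callbacks=[early_stop])
```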

If the loss increases and the accuracy increases too, it's because your regularization techniques are working well and you're fighting the overfitting problem. This is true only if the loss then starts to decrease while the accuracy continues to increase. Otherwise, if the loss keeps growing, your model is diverging and you should look for the cause (usually a learning rate that is too high).
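To summarize these rules, here is a hypothetical helper (my own sketch, not a library function) that names the likely regime from the recorded per-epoch curves, e.g. a Keras `history.history` dict:

```python
def diagnose(history, window=5):
    """Compare recent trends of training loss and validation accuracy.

    `history` is assumed to map metric names to per-epoch lists, like
    the `history.history` dict produced by Keras.
    """
    loss = history["loss"][-window:]
    acc = history["val_accuracy"][-window:]
    loss_rising = loss[-1] > loss[0]
    acc_rising = acc[-1] > acc[0]
    if not loss_rising and acc_rising:
        return "learning: loss down, accuracy up"
    if not loss_rising and not acc_rising:
        return "overfitting: loss down but validation accuracy down"
    if loss_rising and acc_rising:
        return "regularization at work -- fine only if the loss turns down later"
    return "diverging: loss up, accuracy down; try a lower learning rate"
```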