When you run a Keras neural network model you might see something like this in the console:
Epoch 1/3
6/1000 [..............................] - ETA: 7994s - loss: 5111.7661
As time goes on, the loss hopefully improves. I want to log these losses to a file over time so that I can learn from them. I have tried:
import logging
logging.basicConfig(filename='example.log', filemode='w', level=logging.DEBUG)
but this doesn't work. I am not sure what level of logging I need in this situation.
I have also tried using a callback, along these lines:
import keras

def generate_train_batch():
    # dset_X / dset_y are the training arrays; yield them in batches of 3
    while 1:
        for i in xrange(0, dset_X.shape[0], 3):
            yield dset_X[i:i+3, :, :, :], dset_y[i:i+3, :, :]

class LossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.losses = []

    def on_batch_end(self, batch, logs={}):
        self.losses.append(logs.get('loss'))

logloss = LossHistory()
colorize.fit_generator(generate_train_batch(), samples_per_epoch=1000,
                       nb_epoch=3, callbacks=[logloss])
but obviously this isn't writing to a file. Whatever the method, whether through a callback, the logging module, or anything else, I would love to hear your solutions for logging the loss of a Keras neural network to a file. Thanks!
You can use the CSVLogger callback. For example:
from keras.callbacks import CSVLogger
csv_logger = CSVLogger('log.csv', append=True, separator=';')
model.fit(X_train, Y_train, callbacks=[csv_logger])
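If you are training with fit_generator as in the question, the same callback can be passed there too. Here is a minimal sketch, reusing the colorize model and generate_train_batch generator from your snippet with the old Keras 1.x argument names:

from keras.callbacks import CSVLogger

# CSVLogger writes one row per epoch (epoch number, loss, and any other tracked metrics)
csv_logger = CSVLogger('log.csv', append=True, separator=';')

colorize.fit_generator(generate_train_batch(),
                       samples_per_epoch=1000,
                       nb_epoch=3,
                       callbacks=[csv_logger])

append=True means the file survives across multiple training runs, so you can stop and resume without losing the history.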
Look at: Keras Callbacks
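If you want per-batch losses rather than one row per epoch, your LossHistory callback only needs to write to a file instead of appending to a list. A small sketch (the batch_losses.txt filename is just a placeholder I picked):

import keras

class FileLossHistory(keras.callbacks.Callback):
    # Streams the loss of every batch to a text file, one value per line
    def on_train_begin(self, logs={}):
        self.log_file = open('batch_losses.txt', 'w')

    def on_batch_end(self, batch, logs={}):
        self.log_file.write('{}\n'.format(logs.get('loss')))
        self.log_file.flush()  # so you can tail the file while training runs

    def on_train_end(self, logs={}):
        self.log_file.close()

colorize.fit_generator(generate_train_batch(), samples_per_epoch=1000,
                       nb_epoch=3, callbacks=[FileLossHistory()])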