Is it possible to obtain the total number of records in a .tfrecords file? Related to this: how does one generally keep track of the number of epochs that have elapsed while training a model? While we can specify the batch_size and num_of_epochs, it is not straightforward to obtain values such as the current epoch or the number of batches per epoch — just so that I could have more control over how the training is progressing. Currently I'm using a dirty hack to compute this, since I know beforehand how many records there are in my .tfrecords file and the size of my minibatches. Appreciate any help.
To count the number of records, you should be able to use tf.python_io.tf_record_iterator.
import tensorflow as tf

# tf_records_filenames is a list of paths to your .tfrecords files
c = 0
for fn in tf_records_filenames:
    for record in tf.python_io.tf_record_iterator(fn):
        c += 1
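Once you have the record count, the epoch bookkeeping you describe is just arithmetic on a global step counter. A minimal sketch, assuming hypothetical values for the record count and batch size (the variable names here are illustrative, not from any TensorFlow API):

```python
num_records = 1000  # total records counted from the .tfrecords file(s)
batch_size = 32

# Ceiling division, so a final partial batch still counts as a batch.
batches_per_epoch = -(-num_records // batch_size)

# Given a running count of training steps (e.g. a global step variable),
# recover the current epoch and the position within it.
global_step = 100
current_epoch = global_step // batches_per_epoch
batch_in_epoch = global_step % batches_per_epoch
```

With these numbers, batches_per_epoch is 32, so step 100 falls in epoch 3, batch 4.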
To just keep track of how model training is progressing, TensorBoard comes in handy.