Is there a function call or another way to count the total number of parameters in a TensorFlow model?
By parameters I mean: an N-dimensional vector of trainable variables has N parameters, an NxM matrix has N*M parameters, etc. So essentially I'd like to sum the product of the shape dimensions of all the trainable variables in a TensorFlow session.
Loop over the shape of every variable in tf.trainable_variables(), multiply the dimensions of each shape together, and sum the results.
import tensorflow as tf

total_parameters = 0
for variable in tf.trainable_variables():
    # shape is a TensorShape, i.e. a sequence of tf.Dimension objects
    shape = variable.get_shape()
    print(shape)
    print(len(shape))
    variable_parameters = 1
    for dim in shape:
        print(dim)
        variable_parameters *= dim.value
    print(variable_parameters)
    total_parameters += variable_parameters
print(total_parameters)
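If every trainable variable has a fully defined static shape, the same sum can be written more compactly with NumPy. This is just a minimal sketch of the same idea, assuming NumPy is available and the TF1-style tf.trainable_variables() API shown above:

import numpy as np
import tensorflow as tf

# Sum the product of each variable's static shape dimensions.
total_parameters = int(np.sum(
    [np.prod(v.get_shape().as_list()) for v in tf.trainable_variables()]
))
print(total_parameters)

If you build your model with tf.keras, model.count_params() reports a similar total, though it counts non-trainable weights as well.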
Update: prompted by this answer, I wrote an article clarifying dynamic vs. static shapes in TensorFlow: https://pgaleone.eu/tensorflow/2018/07/28/understanding-tensorflow-tensors-shape-static-dynamic/