How to count total number of trainable parameters in a tensorflow model?

Is there a function call or another way to count the total number of parameters in a tensorflow model?

By parameters I mean: an N-dimensional vector of trainable variables has N parameters, an N×M matrix has N*M parameters, and so on. Essentially, I'd like to sum the products of the shape dimensions of all the trainable variables in a TensorFlow session.
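The arithmetic being asked for can be sketched without TensorFlow at all; the shapes below are made-up examples:

```python
# Plain-Python sketch of the rule described above: a variable's parameter
# count is the product of its shape's dimensions, and the model total is
# the sum over all trainable variables.
def params_in_shape(shape):
    n = 1
    for dim in shape:
        n *= dim
    return n

def total_params(shapes):
    return sum(params_in_shape(s) for s in shapes)

# A length-100 bias vector plus a 100x50 weight matrix:
# total_params([[100], [100, 50]]) -> 100 + 5000 = 5100
```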

Loop over the shape of every variable in tf.trainable_variables().

total_parameters = 0
for variable in tf.trainable_variables():
    # shape is an array of tf.Dimension
    shape = variable.get_shape()
    print(shape)
    print(len(shape))
    variable_parameters = 1
    for dim in shape:
        print(dim)
        variable_parameters *= dim.value
    print(variable_parameters)
    total_parameters += variable_parameters
print(total_parameters)

Update: I wrote an article clarifying dynamic vs. static shapes in TensorFlow, prompted by this answer: https://pgaleone.eu/tensorflow/2018/07/28/understanding-tensorflow-tensors-shape-static-dynamic/

Note that if the model is defined by a graph loaded from a .pb file, the approaches above don't work: to my understanding, a GraphDef alone doesn't carry enough information to describe Variables, so there is nothing for tf.trainable_variables() to return.

I have an even shorter version, a one-line solution using numpy:

np.sum([np.prod(v.get_shape().as_list()) for v in tf.trainable_variables()])
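The prod-then-sum pattern in that one-liner can be checked on plain shape lists (the shapes here are made up for illustration):

```python
import numpy as np

# Same reduction as the one-liner above, applied to bare shape lists:
# sum over variables of prod(shape).
shapes = [[100], [100, 50], [3, 3, 64]]
total = int(np.sum([np.prod(s) for s in shapes]))
# 100 + 5000 + 576 = 5676
```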


Not sure if the answer given actually runs (I found you need to convert the dim object to an int for it to work). Here is one that works; you can just copy-paste the functions and call them (I added a few comments too):

def count_number_trainable_params():
    '''
    Counts the number of trainable variables.
    '''
    tot_nb_params = 0
    for trainable_variable in tf.trainable_variables():
        shape = trainable_variable.get_shape()  # e.g. [D, F] or [W, H, C]
        current_nb_params = get_nb_params_shape(shape)
        tot_nb_params = tot_nb_params + current_nb_params
    return tot_nb_params

def get_nb_params_shape(shape):
    '''
    Computes the total number of params for a given shape.
    Works for any number of dimensions: [D, F] gives D*F, [W, H, C] gives W*H*C.
    '''
    nb_params = 1
    for dim in shape:
        nb_params = nb_params * int(dim)
    return nb_params


The two existing answers are good if you're looking into computing the number of parameters yourself. If your question was more along the lines of "is there an easy way to profile my TensorFlow models?", I would highly recommend looking into tfprof. It profiles your model, including calculating the number of parameters.
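For reference, a hedged sketch of what the tfprof route might look like; `tf.profiler.profile` and `ProfileOptionBuilder` are the TF 1.x profiler entry points as I recall them, so treat the exact names as assumptions:

```python
def profile_trainable_params(graph=None):
    """Report trainable-parameter counts via the TF 1.x profiler (sketch).

    The import is deferred so this file loads even without TensorFlow installed.
    """
    import tensorflow as tf  # assumes TensorFlow 1.x
    graph = graph or tf.get_default_graph()
    opts = tf.profiler.ProfileOptionBuilder.trainable_variables_parameter()
    # profile() prints a per-scope breakdown; the returned proto carries
    # the overall count in its total_parameters field.
    stats = tf.profiler.profile(graph, options=opts)
    return stats.total_parameters
```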


I'll throw in my equivalent but shorter implementation:

from functools import reduce  # needed on Python 3; built-in on Python 2

def count_params():
    "print number of trainable variables"
    size = lambda v: reduce(lambda x, y: x * y, v.get_shape().as_list())
    n = sum(size(v) for v in tf.trainable_variables())
    print("Model size: %dK" % (n // 1000,))

If you build your model with Keras, you can call model.count_params() after building it to get the total; model.summary() also prints a breakdown of trainable and non-trainable parameters.
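A hedged sketch of the model.count_params() route; the layer sizes are arbitrary, and the import is deferred so the file loads without TensorFlow:

```python
def keras_param_count():
    """Build a toy tf.keras model and return its parameter count (sketch)."""
    import tensorflow as tf  # assumes tf.keras is available

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(50, input_shape=(100,)),  # 100*50 + 50 = 5050
        tf.keras.layers.Dense(10),                      # 50*10 + 10 = 510
    ])
    return model.count_params()  # expect 5050 + 510 = 5560
```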

You can mark variables as "non-trainable" at definition time, which keeps them out of tf.trainable_variables():

v = tf.Variable(tf.zeros([1]), trainable=False)

From the linked documentation (circa TensorFlow v0.11): trainable: If True, the default, also adds the variable to the graph collection GraphKeys.TRAINABLE_VARIABLES.



Comments
  • Your question description and title do not match (unless I'm confusing the terminology of graph and model): in the question you ask about a graph, and in the title you ask about a model. What if you had two different models? I'd suggest clarifying that in the question.
  • if you have more than one model, how does tf.trainable_variables() know which one to use?
  • tf.trainable_variables() returns all the variables marked as trainable that are present in the current graph. If the current graph holds more than one model, you have to filter the variables manually by name, something like: if variable.name.startswith("model2"): ...
  • This solution gives me the error "Exception occurred: Can't convert 'int' object to str implicitly". You need to cast 'dim' explicitly to 'int', as suggested in the answer below (which I suggest should be the accepted answer).
  • Really helpful!
  • It seems in TF2, this has changed to tf.compat.v1.trainable_variables()! But then this return 0 parameters!
  • in my version, v doesn't have a shape_as_list() function but only get_shape() function
  • I think earlier versions don't have .shape but get_shape(). Updated my answer. Anyway, I wrote v.shape.as_list() and not v.shape_as_list().
  • np.sum([np.prod(v.shape) for v in tf.trainable_variables()]) works as well in TensorFlow 1.2
  • the answer does work (r0.11.0). yours is more plug n play :)
  • @f4. there seems to be a bug with this because y doesn't seem to be used.
  • @CharlieParker I fixed it a few seconds ago ;)
  • @f4. it still doesn't truly solve the issue I was trying to do (or the original author intended since he gave y as an input) because I was looking for a function that depended on the model one gave as input (i.e. y). Right now as given, I have no idea what on earth it counts. My suspicion is that it counts just all models (I have two separate models).
  • @CharlieParker it counts all trainable variables, which by default is all variables I believe. You can work something out using the variables attributes like graph or name.
  • The doc didn't mention about if the count returned is for trainable? I did some quick test and it looks like this is the TOTAL number of params. The question is for only the trainable parameters.