How to calculate MobileNet FLOPs in Keras


import tensorflow as tf
import keras.backend as K
from keras.applications.mobilenet import MobileNet

run_meta = tf.RunMetadata()
with tf.Session(graph=tf.Graph()) as sess:
    K.set_session(sess)

    with tf.device('/cpu:0'):
        base_model = MobileNet(alpha=1, weights=None,
                               input_tensor=tf.placeholder('float32', shape=(1, 224, 224, 3)))

    # Count the floating-point operations in the graph.
    opts = tf.profiler.ProfileOptionBuilder.float_operation()
    flops = tf.profiler.profile(sess.graph, run_meta=run_meta, cmd='op', options=opts)

    # Count the trainable parameters.
    opts = tf.profiler.ProfileOptionBuilder.trainable_variables_parameter()
    params = tf.profiler.profile(sess.graph, run_meta=run_meta, cmd='op', options=opts)

    print("{:,} --- {:,}".format(flops.total_float_ops, params.total_parameters))

When I run the code above, I get the following result:

1,137,481,704 --- 4,253,864

This is different from the FLOPs reported in the papers.

mobilenet: https://arxiv.org/pdf/1704.04861.pdf

ShuffleNet: https://arxiv.org/pdf/1707.01083.pdf

How can I calculate the exact FLOPs reported in the papers?

tl;dr You've actually got the right answer! You are simply comparing FLOPs with multiply-accumulates (which the paper reports) and therefore need to divide by two.
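To see why the two differ by a factor of two: one multiply-accumulate (MAC) is one multiplication plus one addition, i.e. two floating-point operations. As a small sketch (the layer shapes are taken from MobileNet's first layer: a 3×3 standard convolution, 3 input channels, 32 output channels, stride 2 on a 224×224 input, giving a 112×112 output):

```python
# MACs vs FLOPs for a standard convolution layer.
# One MAC = one multiply + one add = 2 FLOPs.

def conv_macs(k, c_in, c_out, h_out, w_out):
    """Multiply-accumulates of a k x k standard convolution."""
    return k * k * c_in * c_out * h_out * w_out

# MobileNet's first layer: 3x3 conv, 3 -> 32 channels,
# stride 2 on a 224x224 input gives a 112x112 output.
macs = conv_macs(k=3, c_in=3, c_out=32, h_out=112, w_out=112)
flops = 2 * macs  # the profiler counts the multiply and the add separately

print(macs)   # 10838016 MACs
print(flops)  # 21676032 FLOPs
```

So any FLOPs figure the profiler reports for conv-dominated networks is roughly twice the Mult-Adds figure a paper quotes.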

If you're using Keras, then the code you listed is slightly over-complicating things...

Let model be any compiled Keras model. We can arrive at the flops of the model with the following code.

import tensorflow as tf
import keras.backend as K


def get_flops(model):
    run_meta = tf.RunMetadata()
    opts = tf.profiler.ProfileOptionBuilder.float_operation()

    # We use the Keras session graph in the call to the profiler.
    flops = tf.profiler.profile(graph=K.get_session().graph,
                                run_meta=run_meta, cmd='op', options=opts)

    return flops.total_float_ops  # Total FLOPs of the model.


# .... Define your model here ....
print(get_flops(model))

However, when I ran my own example (not MobileNet) on my computer, the printed total_float_ops was 2115, and I got the following when I simply printed the flops variable:

[...]
Mul                      1.06k float_ops (100.00%, 49.98%)
Add                      1.06k float_ops (50.02%, 49.93%)
Sub                          2 float_ops (0.09%, 0.09%)

It's pretty clear that the total_float_ops property takes multiplication, addition and subtraction into account.

I then went back to the MobileNets example. Skimming the paper, I used the parameter count to find the model in the paper's comparison table that corresponds to the default Keras implementation: 1.0 MobileNet-224, listed at 569 million Mult-Adds and 4.2 million parameters.

That model matches the parameter count you have (4,253,864), and its Mult-Adds are approximately half of your FLOPs result. Therefore you have the correct answer; you were simply mistaking FLOPs for Mult-Adds (aka multiply-accumulates or MACs).

If you want the number of MACs, simply divide the FLOPs result from the code above by two.
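Applied to the number in the question, this is a quick sanity check (the 569 million figure is the Mult-Adds the MobileNet paper reports for the full-width 224×224 model):

```python
# Convert the profiler's FLOPs count into Mult-Adds (MACs).
profiler_flops = 1_137_481_704  # total_float_ops from the question

macs = profiler_flops / 2
print("{:,.0f}".format(macs))  # 568,740,852 -- roughly the paper's 569 million Mult-Adds
```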


This is working for me in TF-2.1:

import tensorflow as tf

def get_flops(model_h5_path):
    session = tf.compat.v1.Session()
    graph = tf.compat.v1.get_default_graph()

    with graph.as_default():
        with session.as_default():
            model = tf.keras.models.load_model(model_h5_path)

            run_meta = tf.compat.v1.RunMetadata()
            opts = tf.compat.v1.profiler.ProfileOptionBuilder.float_operation()

            # Optional: save the printed results to a file
            # import os, tempfile
            # flops_log_path = os.path.join(tempfile.gettempdir(), 'tf_flops_log.txt')
            # opts['output'] = 'file:outfile={}'.format(flops_log_path)

            # We use the Keras session graph in the call to the profiler.
            flops = tf.compat.v1.profiler.profile(graph=graph,
                                                  run_meta=run_meta, cmd='op', options=opts)

            return flops.total_float_ops


You can use model.summary() on all Keras models to get the number of FLOPS.


Comments
  • why do you need model as an argument for get_flops?
  • this lists only the number of parameters