Error exporting inference graph (ValueError)

So I'm following sentdex's object detection tutorial and I've gotten to the step where you're supposed to export the inference graph. I'm using the "export_inference_graph.py" script from TensorFlow's object_detection folder. The problem is that I'm getting this ValueError:

Traceback (most recent call last):
  File "C:\Users\Zelcore-Dator\AppData\Local\Programs\Python\Python35\lib\site-packages\google\protobuf\internal\python_message.py", line 545, in _GetFieldByName
    return message_descriptor.fields_by_name[field_name]
KeyError: 'layout_optimizer'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "export_inference_graph.py", line 119, in <module>
    tf.app.run()
  File "C:\Users\Zelcore-Dator\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\platform\app.py", line 48, in run
    _sys.exit(main(_sys.argv[:1] + flags_passthrough))
  File "export_inference_graph.py", line 115, in main
    FLAGS.output_directory, input_shape)
  File "C:\Users\Zelcore-Dator\AppData\Local\Programs\Python\Python35\lib\site-packages\object_detection-0.1-py3.5.egg\object_detection\exporter.py", line 427, in export_inference_graph
    input_shape, optimize_graph, output_collection_name)
  File "C:\Users\Zelcore-Dator\AppData\Local\Programs\Python\Python35\lib\site-packages\object_detection-0.1-py3.5.egg\object_detection\exporter.py", line 391, in _export_inference_graph
    initializer_nodes='')
  File "C:\Users\Zelcore-Dator\AppData\Local\Programs\Python\Python35\lib\site-packages\object_detection-0.1-py3.5.egg\object_detection\exporter.py", line 72, in freeze_graph_with_def_protos
    layout_optimizer=rewriter_config_pb2.RewriterConfig.ON)
  File "C:\Users\Zelcore-Dator\AppData\Local\Programs\Python\Python35\lib\site-packages\google\protobuf\internal\python_message.py", line 484, in init
    field = _GetFieldByName(message_descriptor, field_name)
  File "C:\Users\Zelcore-Dator\AppData\Local\Programs\Python\Python35\lib\site-packages\google\protobuf\internal\python_message.py", line 548, in _GetFieldByName
    (message_descriptor.name, field_name))
ValueError: Protocol message RewriterConfig has no "layout_optimizer" field.

I'm guessing that it has something to do with protobuf, but I've reinstalled it several times already with no success. Any help is appreciated.


Happened to me too; it didn't a few weeks ago. Until the bug is fixed, you can fall back to the earlier version that still works: replace line 72 in 'object_detection/exporter.py':

layout_optimizer=rewriter_config_pb2.RewriterConfig.ON)

with the old and working line:

optimize_tensor_layout=True)
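The underlying cause is that the RewriterConfig proto renamed this field at some point (optimize_tensor_layout in older releases, layout_optimizer later), so the exporter script and the installed TensorFlow disagree on its name. A version-agnostic patch could feature-detect which name the installed proto defines. Below is a minimal sketch of that idea; the helper name is mine, and in real code field_names would come from rewriter_config_pb2.RewriterConfig.DESCRIPTOR.fields_by_name:

```python
def pick_rewrite_kwargs(field_names):
    """Choose the layout-optimizer kwarg that the installed
    RewriterConfig proto actually defines (hypothetical helper)."""
    if "layout_optimizer" in field_names:
        # newer proto: the field takes a RewriterConfig.Toggle enum (ON == 1)
        return {"layout_optimizer": 1}
    if "optimize_tensor_layout" in field_names:
        # older proto: a plain boolean field
        return {"optimize_tensor_layout": True}
    return {}  # field absent entirely: fall back to proto defaults

# With the real proto this would be called as:
# pick_rewrite_kwargs(rewriter_config_pb2.RewriterConfig.DESCRIPTOR.fields_by_name)
# rewrite_options = rewriter_config_pb2.RewriterConfig(**kwargs)
```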



I used:

rewrite_options = rewriter_config_pb2.RewriterConfig(optimize_tensor_layout=True)

but kept running into the same issue until I reran

python setup.py install

from my "research" folder. After that, everything worked.
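Rerunning setup.py install matters here because the traceback shows exporter.py being loaded from an object_detection-0.1-py3.5.egg under site-packages, i.e. a stale installed copy rather than the freshly edited source. A quick, generic way to check which copy Python will actually import (a sketch, not specific to this repo):

```python
import importlib.util

def module_path(name):
    """Return the file Python would load for a top-level module, or None."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# e.g. module_path("object_detection"): if this points into a *.egg under
# site-packages, edits to the research/ checkout are ignored until you
# reinstall with `python setup.py install`.
```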

Note: the RewriterConfig was changed again in TensorFlow 1.5, so the small modification above makes it break in the same manner but with the opposite result (I had to revert the change when I updated to 1.5).


Alternatively, drop the optimize_tensor_layout argument entirely: change line 71 in exporter.py from

rewrite_options = rewriter_config_pb2.RewriterConfig(optimize_tensor_layout=rewriter_config_pb2.RewriterConfig.ON)

to:

rewrite_options = rewriter_config_pb2.RewriterConfig()
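Constructing the message with no arguments works because a protobuf message fills every field with its default value, while an unknown keyword is rejected at construction time. The class below is a toy stand-in for that behaviour (not the real protobuf class), just to show why the original line raised ValueError:

```python
class FakeRewriterConfig:
    """Toy stand-in for a protobuf message whose descriptor only
    defines the old optimize_tensor_layout field."""
    _fields = frozenset({"optimize_tensor_layout"})

    def __init__(self, **kwargs):
        for name, value in kwargs.items():
            if name not in self._fields:
                # mirrors protobuf's error message for unknown fields
                raise ValueError(
                    'Protocol message RewriterConfig has no "%s" field.' % name)
            setattr(self, name, value)

FakeRewriterConfig()                             # fine: defaults only
FakeRewriterConfig(optimize_tensor_layout=True)  # fine: known field
# FakeRewriterConfig(layout_optimizer=1)         # would raise ValueError
```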
