Converting from TensorFlow to ONNX

Posted: 2019-11-04 12:01:25

Tags: tensorflow onnx

I want to convert the TF model ICNet 0.5 to ONNX, so I followed this example: ConvertingSSDMobilenetToONNX

I understand that if I only want to run inference I should use a frozen graph (in my case frozen_inference_graph.pb), so I renamed it to saved_model.pb (tf2onnx does not seem to accept any other name) and ran the following command, which fails with this error:

C:\Users\esarojp\Desktop\newmodel\0818_icnet_0.5_1025_resnet_v1.tar> python -m tf2onnx.convert --opset 10 --fold_const --saved-model .\0818_icnet_0.5_1025_resnet_v1\saved_model\ --output MODEL.onnx

 - WARNING - From C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\site-packages\tf2onnx\verbose_logging.py:72: The name tf.logging.set_verbosity is deprecated. Please use tf.compat.v1.logging.set_verbosity instead.

Traceback (most recent call last):
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\site-packages\tf2onnx\convert.py", line 161, in <module>
    main()
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\site-packages\tf2onnx\convert.py", line 123, in main
    args.saved_model, args.inputs, args.outputs, args.signature_def)
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\site-packages\tf2onnx\loader.py", line 103, in from_saved_model
    meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], model_path)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\util\deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\saved_model\loader_impl.py", line 269, in load
    return loader.load(sess, tags, import_scope, **saver_kwargs)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\saved_model\loader_impl.py", line 422, in load
    **saver_kwargs)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\saved_model\loader_impl.py", line 349, in load_graph
    meta_graph_def = self.get_meta_graph_def_from_tags(tags)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\saved_model\loader_impl.py", line 327, in get_meta_graph_def_from_tags
    "\navailable_tags: " + str(available_tags))
RuntimeError: MetaGraphDef associated with tags 'serve' could not be found in SavedModel. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`
available_tags: [set()]

When I run:

C:\Users\esarojp\Desktop\newmodel\0818_icnet_0.5_1025_resnet_v1.tar> saved_model_cli show --dir .\0818_icnet_0.5_1025_resnet_v1\saved_model\ --tag_set serve  --signature_def serving_default
Traceback (most recent call last):
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\Scripts\saved_model_cli-script.py", line 10, in <module>
    sys.exit(main())
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\tools\saved_model_cli.py", line 909, in main
    args.func(args)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\tools\saved_model_cli.py", line 621, in show
    _show_inputs_outputs(args.dir, args.tag_set, args.signature_def)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\tools\saved_model_cli.py", line 133, in _show_inputs_outputs
    tag_set)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\tools\saved_model_utils.py", line 120, in get_meta_graph_def
    ' could not be found in SavedModel')
RuntimeError: MetaGraphDef associated with tag-set serve could not be found in SavedModel

I think something in the other files points to frozen_inference_graph.pb, but that file no longer exists under that name (even though it says all the weights are stored inside the graph).
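
For reference, here is a quick way to check whether that .pb file is really a frozen GraphDef rather than a SavedModel, and to list candidate input nodes for later use (a minimal diagnostic sketch; check_pb.py and the Placeholder heuristic are my own additions, not part of tf2onnx, and it assumes TF 1.x is installed):

# check_pb.py - hypothetical diagnostic script, not part of tf2onnx
from tensorflow.core.framework import graph_pb2
from tensorflow.core.protobuf import saved_model_pb2

PB_PATH = r".\0818_icnet_0.5_1025_resnet_v1\frozen_inference_graph.pb"

with open(PB_PATH, "rb") as f:
    data = f.read()

# Try to parse the bytes as a frozen GraphDef
graph_def = graph_pb2.GraphDef()
try:
    graph_def.ParseFromString(data)
    print("Parses as a GraphDef with", len(graph_def.node), "nodes")
    # Placeholder ops are the usual candidates for --inputs
    print("Placeholders:", [n.name for n in graph_def.node if n.op == "Placeholder"])
except Exception as exc:
    print("Not a GraphDef:", exc)

# Try to parse the same bytes as a SavedModel protobuf
saved_model = saved_model_pb2.SavedModel()
try:
    saved_model.ParseFromString(data)
    print("SavedModel tag-sets:", [set(mg.meta_info_def.tags) for mg in saved_model.meta_graphs])
except Exception as exc:
    print("Not a SavedModel:", exc)

If the second parse reports an empty tag-set, that would match the available_tags: [set()] shown in the error above.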

Any idea what is going wrong?

1 Answer:

Answer 0 (score: 0)

I was doing it wrong: I was trying to convert the model through the SavedModel path when what I actually have is a frozen graph. To convert a frozen graph, pass it with the --graphdef flag and specify the input and output node names:

python -m tf2onnx.convert --graphdef .\0818_icnet_0.5_1025_resnet_v1\frozen_inference_graph.pb --output frozen.onnx --fold_const --opset 10   --inputs inputs:0 --outputs predictions:0
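
If the conversion succeeds, the result can be sanity-checked before using it (a minimal sketch, assuming the onnx and onnxruntime packages are installed; the file name frozen.onnx matches the command above):

# verify_onnx.py - optional sanity check of the converted model
import onnx
import onnxruntime as ort

MODEL_PATH = "frozen.onnx"  # output of the tf2onnx command above

# Structural validation of the ONNX graph
model = onnx.load(MODEL_PATH)
onnx.checker.check_model(model)

# Load the model in onnxruntime and print the expected inputs/outputs
# (newer onnxruntime versions may require providers=["CPUExecutionProvider"])
sess = ort.InferenceSession(MODEL_PATH)
for inp in sess.get_inputs():
    print("input:", inp.name, inp.shape, inp.type)
for out in sess.get_outputs():
    print("output:", out.name, out.shape, out.type)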