How do I convert Inception V4 from .ckpt to .pb in Colab?

Problem description

I am working with a Coral Dev Board and a Jetson TX2 developer board. To deploy models to them, the model must have the .pb extension.

Is there a link to models that already come with the .pb extension? At the moment I am using this link: TF_slim

All the models there have only the .ckpt extension, nothing else; no .meta file or anything. I don't know how to convert them to .pb.

I am working in Colab. Here is my code:

# Now let's download the pretrained model from tensorflow's model zoo.
!mkdir /content/pretrained_model
%cd /content/pretrained_model
!wget http://download.tensorflow.org/models/inception_v4_2016_09_09.tar.gz
!tar xvf inception_v4_2016_09_09.tar.gz


#Exporting the inference graph
!python /content/models/research/slim/export_inference_graph.py \
--alsologtostderr \
--model_name=inception_v4.ckpt \
--output_file=/content/pretrained_model/inception_v4_inf_graph.pb

This is the error I get:

Traceback (most recent call last):
  File "/content/models/research/slim/export_inference_graph.py", line 162, in <module>
    tf.app.run()
  File "/tensorflow-1.15.2/python3.6/tensorflow_core/python/platform/app.py", line 40, in run
    _run(main=main, argv=argv, flags_parser=_parse_flags_tolerate_undef)
  File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 299, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 250, in _run_main
    sys.exit(main(argv))
  File "/content/models/research/slim/export_inference_graph.py", line 128, in main
    FLAGS.dataset_dir)
  File "/content/models/research/slim/datasets/dataset_factory.py", line 59, in get_dataset
    reader)
  File "/content/models/research/slim/datasets/imagenet.py", line 187, in get_split
    labels_to_names = create_readable_names_for_imagenet_labels()
  File "/content/models/research/slim/datasets/imagenet.py", line 93, in create_readable_names_for_imagenet_labels
    filename, _ = urllib.request.urlretrieve(synset_url)
  File "/usr/lib/python3.6/urllib/request.py", line 248, in urlretrieve
    with contextlib.closing(urlopen(url, data)) as fp:
  File "/usr/lib/python3.6/urllib/request.py", line 223, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.6/urllib/request.py", line 532, in open
    response = meth(req, response)
  File "/usr/lib/python3.6/urllib/request.py", line 642, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python3.6/urllib/request.py", line 564, in error
    result = self._call_chain(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 504, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 756, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python3.6/urllib/request.py", line 570, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 650, in http_error_default
    raise HTTPError(req.full_url, hdrs, fp)
urllib.error.HTTPError: HTTP Error 404: Not Found

Thanks

Workaround

There seems to be a bug in this URL in tensorflow/models. I submitted PR tensorflow/models#9207.
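Until that PR is merged, the same one-line change can be applied directly to the Colab checkout. A minimal sketch, assuming the checkout path from the session above (`/content/models/research/slim/datasets/imagenet.py`) and the URL strings from the PR:

```python
# Sketch: replace the dead synset base_url in slim's datasets/imagenet.py
# with the working one. The file path below is an assumption based on the
# Colab session in the question.
from pathlib import Path

OLD_URL = "https://raw.githubusercontent.com/tensorflow/models/master/research/inception/inception/data/"
NEW_URL = "https://raw.githubusercontent.com/tensorflow/models/master/research/slim/datasets"

def patch_base_url(source: str) -> str:
    """Return source text with the broken base_url replaced by the working one."""
    return source.replace(OLD_URL, NEW_URL)

# To apply in place in Colab, uncomment:
# path = Path("/content/models/research/slim/datasets/imagenet.py")
# path.write_text(patch_base_url(path.read_text()))

print(patch_base_url("base_url = '%s'" % OLD_URL))
```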


Making this change will fix the 404 error:

- base_url = 'https://raw.githubusercontent.com/tensorflow/models/master/research/inception/inception/data/'
+ base_url = 'https://raw.githubusercontent.com/tensorflow/models/master/research/slim/datasets'

See the instructions at https://github.com/tensorflow/models/tree/master/research/slim#exporting-the-inference-graph:

Exporting the Inference Graph

Saves out a GraphDef containing the model architecture.

To use it with a model name defined by slim, run:

$ python export_inference_graph.py \
  --alsologtostderr \
  --model_name=inception_v3 \
  --output_file=/tmp/inception_v3_inf_graph.pb

$ python export_inference_graph.py \
  --alsologtostderr \
  --model_name=mobilenet_v1 \
  --image_size=224 \
  --output_file=/tmp/mobilenet_v1_224.pb

Freezing the exported Graph

If you then want to use the resulting model with your own or pre-trained checkpoints as part of a mobile model, you can run freeze_graph to get a graph def with the variables inlined as constants.
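For Inception V4 specifically, note that `--model_name` expects the slim model name (`inception_v4`), not the checkpoint filename used in the question. A sketch of the full export-then-freeze sequence, assuming the Colab paths from the question; the output node name `InceptionV4/Logits/Predictions` is an assumption, so verify it against your graph if freezing fails:

```shell
# Export the inference graph (pass the slim model name, not the .ckpt file).
python /content/models/research/slim/export_inference_graph.py \
  --alsologtostderr \
  --model_name=inception_v4 \
  --output_file=/content/pretrained_model/inception_v4_inf_graph.pb

# Freeze: inline the checkpoint variables into the graph as constants.
# The output node name below is an assumption for slim's Inception V4.
python -m tensorflow.python.tools.freeze_graph \
  --input_graph=/content/pretrained_model/inception_v4_inf_graph.pb \
  --input_checkpoint=/content/pretrained_model/inception_v4.ckpt \
  --input_binary=true \
  --output_node_names=InceptionV4/Logits/Predictions \
  --output_graph=/content/pretrained_model/frozen_inception_v4.pb
```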