Error: TensorFlow preprocessing layers won't convert to TensorFlow Lite

Problem Description

I am using the example at https://www.tensorflow.org/tutorials/structured_data/preprocessing_layers

I created a model with my own data and want to save it in TensorFlow Lite format. I saved it as a SavedModel, but when converting it I ran into a series of errors. The last one is:

WARNING:tensorflow:AutoGraph could not transform <function canonicalize_signatures.<locals>.signature_wrapper at 0x7f4f61cd0560> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: closure mismatch, requested ('signature_function', 'signature_key'), but source function had ()
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING: AutoGraph could not transform <function canonicalize_signatures.<locals>.signature_wrapper at 0x7f4f61cd0560> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: closure mismatch, requested ('signature_function', 'signature_key'), but source function had ()
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform <function _trace_resource_initializers.<locals>._wrap_obj_initializer.<locals>.<lambda> at 0x7f4f61d28290> and will run it as-is.
Cause: Could not parse the source code of <function _trace_resource_initializers.<locals>._wrap_obj_initializer.<locals>.<lambda> at 0x7f4f61d28290>: no matching AST found
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING: AutoGraph could not transform <function _trace_resource_initializers.<locals>._wrap_obj_initializer.<locals>.<lambda> at 0x7f4f61d28290> and will run it as-is.
Cause: Could not parse the source code of <function _trace_resource_initializers.<locals>._wrap_obj_initializer.<locals>.<lambda> at 0x7f4f61d28290>: no matching AST found
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform <function _trace_resource_initializers.<locals>._wrap_obj_initializer.<locals>.<lambda> at 0x7f4f61d28e60> and will run it as-is.
Cause: Could not parse the source code of <function _trace_resource_initializers.<locals>._wrap_obj_initializer.<locals>.<lambda> at 0x7f4f61d28e60>: no matching AST found
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING: AutoGraph could not transform <function _trace_resource_initializers.<locals>._wrap_obj_initializer.<locals>.<lambda> at 0x7f4f61d28e60> and will run it as-is.
Cause: Could not parse the source code of <function _trace_resource_initializers.<locals>._wrap_obj_initializer.<locals>.<lambda> at 0x7f4f61d28e60>: no matching AST found
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
INFO:tensorflow:Assets written to: /tmp/test_saved_model/assets
---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert.py in toco_convert_protos(model_flags_str, toco_flags_str, input_data_str, debug_info_str, enable_mlir_converter)
    212       model body,the input/output will be quantized as well.
--> 213     inference_type: Data type for the activations. The default value is int8.
    214     enable_numeric_verify: Experimental. Subject to change. Bool indicating

4 frames
Exception: <unknown>:0: error: loc("integer_lookup_1_index_table"): 'tf.MutableHashTableV2' op is neither a custom op nor a flex op
<unknown>:0: error: loc("string_lookup_1_index_table"): 'tf.MutableHashTableV2' op is neither a custom op nor a flex op
<unknown>:0: error: loc(callsite(callsite("model/string_lookup_1/string_lookup_1_index_table_lookup_table_find/LookupTableFindV2@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.LookupTableFindV2' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_3/bincount/add@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.AddV2' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_3/bincount/mul@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.Mul' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_3/bincount/DenseBincount@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.DenseBincount' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/integer_lookup_1/integer_lookup_1_index_table_lookup_table_find/LookupTableFindV2@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.LookupTableFindV2' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_2/bincount/add@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.AddV2' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_2/bincount/mul@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.Mul' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_2/bincount/DenseBincount@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.DenseBincount' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: Failed while converting: 'main': Ops that can be supported by the flex runtime (enabled via setting the -emit-select-tf-ops flag):
    tf.AddV2 {device = ""}
    tf.DenseBincount {T = f32, Tidx = i64, binary_output = true, device = ""}
    tf.Mul {device = ""}
Ops that need custom implementation (enabled via setting the -emit-custom-ops flag):
    tf.LookupTableFindV2 {device = "/job:localhost/replica:0/task:0/device:CPU:0"}
    tf.MutableHashTableV2 {container = "", device = "", key_dtype = !tf.string, shared_name = "table_704", use_node_name_sharing = false, value_dtype = i64}
    tf.MutableHashTableV2 {container = "", key_dtype = i64, shared_name = "table_615", value_dtype = i64}


During handling of the above exception, another exception occurred:

ConverterError                            Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert.py in toco_convert_protos(model_flags_str, toco_flags_str, input_data_str, debug_info_str, enable_mlir_converter)
    214     enable_numeric_verify: Experimental. Subject to change. Bool indicating
    215       whether to add NumericVerify ops into the debug mode quantized model.
--> 216 
    217   Returns:
    218     Quantized model in serialized form (e.g. a TFLITE model) with floating-point

ConverterError: <unknown>:0: error: loc("integer_lookup_1_index_table"): 'tf.MutableHashTableV2' op is neither a custom op nor a flex op
<unknown>:0: error: loc("string_lookup_1_index_table"): 'tf.MutableHashTableV2' op is neither a custom op nor a flex op
<unknown>:0: error: loc(callsite(callsite("model/string_lookup_1/string_lookup_1_index_table_lookup_table_find/LookupTableFindV2@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.LookupTableFindV2' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_3/bincount/add@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.AddV2' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_3/bincount/mul@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.Mul' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_3/bincount/DenseBincount@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.DenseBincount' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/integer_lookup_1/integer_lookup_1_index_table_lookup_table_find/LookupTableFindV2@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.LookupTableFindV2' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_2/bincount/add@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.AddV2' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_2/bincount/mul@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.Mul' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: loc(callsite(callsite("model/category_encoding_2/bincount/DenseBincount@__inference__wrapped_model_9475" at "StatefulPartitionedCall@__inference_signature_wrapper_10110") at "StatefulPartitionedCall")): 'tf.DenseBincount' op is neither a custom op nor a flex op
<unknown>:0: note: loc("StatefulPartitionedCall"): called from
<unknown>:0: error: Failed while converting: 'main': Ops that can be supported by the flex runtime (enabled via setting the -emit-select-tf-ops flag):
    tf.AddV2 {device = ""}
    tf.DenseBincount {T = f32, Tidx = i64, binary_output = true, device = ""}
    tf.Mul {device = ""}
Ops that need custom implementation (enabled via setting the -emit-custom-ops flag):
    tf.LookupTableFindV2 {device = "/job:localhost/replica:0/task:0/device:CPU:0"}
    tf.MutableHashTableV2 {container = "", device = "", key_dtype = !tf.string, shared_name = "table_704", use_node_name_sharing = false, value_dtype = i64}
    tf.MutableHashTableV2 {container = "", key_dtype = i64, shared_name = "table_615", value_dtype = i64}

Code


import pathlib
import tensorflow as tf

# Save the model into a temp directory.
export_dir = "/tmp/test_saved_model"
tf.saved_model.save(model, export_dir)

# Convert the model to TF Lite.
converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)
tflite_model = converter.convert()

# Write the converted model to disk.
tflite_model_file = pathlib.Path('/tmp/save_model_tflite.tflite')
tflite_model_file.write_bytes(tflite_model)

What is the cause of these errors? My goal is to embed the model in a React Native app. Thanks.

Solution

Looking at your stack trace, it appears your model contains some HashTable ops. You need to set converter.allow_custom_ops = True to be able to convert this model.

export_dir = "/content/test_saved_model"
tf.saved_model.save(model, export_dir)

# Convert the model to TF Lite.
converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)

# Let the HashTable ops pass through as custom ops.
converter.allow_custom_ops = True

tflite_model = converter.convert()

# Write the converted model to disk.
tflite_model_files = pathlib.Path('/content/save_model_tflite.tflite')
tflite_model_files.write_bytes(tflite_model)
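
If the conversion still fails on the remaining TF kernels (tf.AddV2, tf.Mul, tf.DenseBincount), the error message itself points at the flex runtime ("enabled via setting the -emit-select-tf-ops flag"). Below is a minimal sketch of that variant, assuming the same model and paths as above; the output filename is just an example:

import pathlib
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("/content/test_saved_model")

# Pass the hash-table ops (tf.MutableHashTableV2, tf.LookupTableFindV2)
# through as custom ops, as in the snippet above.
converter.allow_custom_ops = True

# Additionally allow unsupported TF kernels to run via the Select TF
# (flex) runtime instead of failing the conversion.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite builtin ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # flex ops executed by the TF runtime
]

tflite_model = converter.convert()
# Hypothetical output path, for illustration only.
pathlib.Path('/content/save_model_tflite_flex.tflite').write_bytes(tflite_model)

Keep in mind that a model converted with SELECT_TF_OPS needs the Select TF ops runtime linked into the consuming app (for example, the tensorflow-lite-select-tf-ops AAR on Android), which increases binary size.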
