How do I convert just an h5 file to a tflite file?

Date: 2020-08-26 12:55:27

Tags: python keras tensorflow-lite

I am trying to run license plate detection on Android, so I started from this tutorial: https://medium.com/@quangnhatnguyenle/detect-and-recognize-vehicles-license-plate-with-machine-learning-and-python-part-1-detection-795fda47e922, which is really great.

The tutorial provides wpod-net.h5, so I tried to convert it to TensorFlow Lite with the following code:

import tensorflow as tf

model = tf.keras.models.load_model('wpod-net.h5')
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.post_training_quantize = True
tflite_model = converter.convert()
open("wpod-net.tflite", "wb").write(tflite_model)

But when I run it, I get this error:

  File "converter.py", line 3, in <module>
    model = tf.keras.models.load_model('License_character_recognition.h5')
  File "/home/.local/lib/python3.8/site-packages/tensorflow/python/keras/saving/save.py", line 184, in load_model
    return hdf5_format.load_model_from_hdf5(filepath, custom_objects,
  File "/home/.local/lib/python3.8/site-packages/tensorflow/python/keras/saving/hdf5_format.py", line 175, in load_model_from_hdf5
    raise ValueError('No model found in config file.')
ValueError: No model found in config file.

I also tried the CLI, tflite_convert --keras_model_file=License_character_recognition.h5 --output_file=test.tflite, but it gives me the same error.

Does this mean that I cannot convert the model to tflite unless I train it myself? Or is there another way to convert the .h5?

1 Answer:

Answer 0 (score: 1)

A TensorFlow Lite model bundles the weights and the model architecture in a single file. You need a Keras model that has both its architecture and its weights loaded before you can convert it to tflite; the "No model found in config file" error means your .h5 file contains only weights, with no model config.
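
If you want to confirm that, a full Keras save stores the architecture in the HDF5 attribute model_config, while a weights-only file does not. A quick sketch of that check, assuming h5py is installed:

import h5py

# A full tf.keras save writes the architecture into the 'model_config'
# attribute; save_weights() does not, which is what load_model() complains about.
with h5py.File("wpod-net.h5", "r") as f:
    if f.attrs.get("model_config") is None:
        print("Weights-only file: rebuild the architecture from the .json first.")
    else:
        print("Full model file: tf.keras.models.load_model() should work.")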

Grab a copy of the author's repo and run get-networks.sh. You only need data/lp-detector/wpod-net_update1.h5 for the license plate detector, so you can stop the download early once that file is there.

Digging into the code, you will find a ready-made load_model function in keras utils.

Once you have the model object, you can convert it to tflite.

Python 3, tested with TF 2.4:

import tensorflow as tf

from os.path import splitext

print(tf.__version__)

mod_path = "data/lp-detector/wpod-net_update1.h5"

def load_model(path, custom_objects={}, verbose=0):
    # The repo ships the architecture as a .json next to the weights-only .h5,
    # so rebuild the model from the JSON and then load the weights into it.
    path = splitext(path)[0]
    with open('%s.json' % path, 'r') as json_file:
        model_json = json_file.read()
    model = tf.keras.models.model_from_json(model_json, custom_objects=custom_objects)
    model.load_weights('%s.h5' % path)
    if verbose:
        print('Loaded from %s' % path)
    return model

keras_mod = load_model(mod_path)

# Convert the in-memory Keras model to TensorFlow Lite.
converter = tf.lite.TFLiteConverter.from_keras_model(keras_mod)
tflite_model = converter.convert()

# Save the TF Lite model.
with tf.io.gfile.GFile('model.tflite', 'wb') as f:
    f.write(tflite_model)
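
Once model.tflite is written, you can sanity-check it with the TF Lite Interpreter before shipping it to Android. A minimal sketch; WPOD-net takes variable height/width, so the 208x208 size below is just an assumed concrete value for the dynamic dimensions:

import numpy as np
import tensorflow as tf

# Load the converted model and inspect its input signature.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dynamic dimensions show up as -1 in shape_signature; pick a concrete size
# (208 is an arbitrary choice for this sanity check).
shape = [d if d > 0 else 208 for d in input_details[0]['shape_signature']]
interpreter.resize_tensor_input(input_details[0]['index'], shape)
interpreter.allocate_tensors()

# Run one forward pass on random data just to confirm the model executes.
dummy = np.random.random_sample(shape).astype(input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()
print("Output shape:", interpreter.get_tensor(output_details[0]['index']).shape)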

Good luck!
