Cannot create a tensor proto whose content is larger than 2GB (Keras)

Date: 2019-07-11 09:29:32

Tags: python-3.x tensorflow keras protocol-buffers

I want to create a .pb and a .pbtxt file for my machine learning model so I can use it in an Android app. I trained a CNN with Keras, but when I try to save it to a .pb file I get an error.

I searched for the error, but every answer I found is based on plain TensorFlow rather than Keras. Here is my code:

import numpy as np
import tensorflow as tf


def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
    """
    Freezes the state of a session into a pruned computation graph.

    Creates a new computation graph where variable nodes are replaced by
    constants taking their current value in the session. The new graph will be
    pruned so subgraphs that are not necessary to compute the requested
    outputs are removed.
    @param session The TensorFlow session to be frozen.
    @param keep_var_names A list of variable names that should not be frozen,
                          or None to freeze all the variables in the graph.
    @param output_names Names of the relevant graph outputs.
    @param clear_devices Remove the device directives from the graph for better portability.
    @return The frozen graph definition.
    """
    graph = session.graph
    with graph.as_default():
        freeze_var_names = list(set(v.op.name for v in tf.global_variables()).difference(keep_var_names or []))
        output_names = output_names or []
        output_names += [v.op.name for v in tf.global_variables()]
        input_graph_def = graph.as_graph_def()
        if clear_devices:
            for node in input_graph_def.node:
                node.device = ''
        frozen_graph = tf.graph_util.convert_variables_to_constants(
            session, input_graph_def, output_names, freeze_var_names)
        return frozen_graph


def create_model():
    model = tf.keras.models.Sequential()
    model.add(tf.keras.layers.Dense(64, input_dim=2, activation='relu'))
    model.add(tf.keras.layers.Dense(64, activation='relu'))
    model.add(tf.keras.layers.Dense(64, activation='relu'))
    model.add(tf.keras.layers.Dense(64, activation='relu'))
    model.add(tf.keras.layers.Dense(1, activation='sigmoid'))

    model.compile(loss='mean_squared_error', optimizer='adam',
                  metrics=['binary_accuracy'])
    return model

classifier = create_model()
classifier.fit(trainimages, label, batch_size=10, epochs=15, validation_split=0.35)
classifier.save('xor.h5')

frozen_graph = freeze_session(tf.keras.backend.get_session(), output_names=[out.op.name for out in classifier.outputs])
tf.train.write_graph(frozen_graph, './', 'xor.pbtxt', as_text=True)
tf.train.write_graph(frozen_graph, './', 'xor.pb', as_text=False)

This gives the error: ValueError: Cannot create a tensor proto whose content is larger than 2GB.
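As far as I understand, the limit comes from protocol buffer serialization: convert_variables_to_constants turns each variable into a Const node in the GraphDef, and TensorFlow refuses to build a tensor proto larger than 2GB. A minimal sketch that reproduces the same ValueError in graph mode (the array size here is purely hypothetical, not my real data):

import numpy as np
import tensorflow as tf

# Hypothetical illustration: 600 million float32 values is roughly 2.4 GB,
# which exceeds the 2GB cap on a single tensor proto.
big_array = np.zeros((600_000_000,), dtype=np.float32)

with tf.Graph().as_default():
    # In graph mode this constant must be serialized into the GraphDef,
    # so it raises:
    # ValueError: Cannot create a tensor proto whose content is larger than 2GB.
    tf.constant(big_array)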

0 Answers
