How do I save a TensorFlow estimator with export_savedmodel, then load it back locally and use it?

Time: 2017-10-03 07:19:59

Tags: python tensorflow tensorflow-serving

I have trained a TensorFlow linear regression estimator as follows:

import numpy as np
import tensorflow as tf

sample_size = train_x.shape[0]
feature_size = train_x.shape[1]

feature_columns = [tf.feature_column.numeric_column("x", shape=[feature_size])]

lr_estimator = tf.estimator.LinearRegressor(feature_columns=feature_columns)

# Convert the pandas DataFrames to numpy matrices
train_x_mat = train_x.as_matrix()
test_x_mat = test_x.as_matrix()
train_y_mat = train_y.as_matrix()


# Define the training inputs
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": train_x_mat},
    y=np.array(train_y_mat),
    num_epochs=None,
    shuffle=True)

# Train model.
lr_estimator.train(input_fn=train_input_fn, steps=2000)

where train_x and train_y are pandas DataFrames. lr_estimator does work, and I can successfully call .predict on it, roughly as sketched below.
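A minimal sketch of that in-memory prediction call, assuming test_x_mat is the numpy matrix created above and that the LinearRegressor yields its result under the "predictions" key:

predict_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": test_x_mat},
    num_epochs=1,
    shuffle=False)

# predict() returns a generator of per-example dicts
predictions = list(lr_estimator.predict(input_fn=predict_input_fn))
print(predictions[0]["predictions"])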

How do I save it to a file and then load it back to make predictions? I just want to build a small Python program; the prediction program will run on the same desktop. I don't need a full serving setup yet.

1 Answer:

Answer 0 (score: 2):

def serving_input_receiver_fn():
    """Build a raw-tensor input receiver for serving."""
    # Accept a batch of float feature vectors under the key "x"
    inputs = {"x": tf.placeholder(shape=[None, feature_size], dtype=tf.float32)}
    return tf.estimator.export.ServingInputReceiver(inputs, inputs)

# export model and weights; the returned export_dir is a timestamped
# subdirectory created under export_dir_base
export_dir = lr_estimator.export_savedmodel(export_dir_base="/export_dir",
    serving_input_receiver_fn=serving_input_receiver_fn)

# restore from disk (the contrib predictor loads the SavedModel in its own session)
from tensorflow.contrib import predictor

predict_fn = predictor.from_saved_model(export_dir)
print(predict_fn({"x": test_x_mat}))
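For the small standalone prediction program you mention, a later run can reload the exported folder directly. A minimal sketch, with the caveat that the path and feature width below are hypothetical placeholders and that the input key "x" is assumed to match the one declared in serving_input_receiver_fn:

import numpy as np
from tensorflow.contrib import predictor

# Hypothetical path: export_savedmodel writes a timestamped subfolder under export_dir_base
saved_model_dir = "/export_dir/1507000000"
feature_size = 10  # hypothetical: must match the feature width used at training time

predict_fn = predictor.from_saved_model(saved_model_dir)

# Feed a batch shaped [batch_size, feature_size], keyed by the receiver name "x"
new_x = np.random.rand(2, feature_size).astype(np.float32)
print(predict_fn({"x": new_x}))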