<p>You have two options:</p>
<h2>Export the model to accept JSON dictionaries</h2>
<p>In my <a href="https://github.com/Fematich/mlengine-boilerplate/blob/master/trainer/task.py" rel="nofollow noreferrer">mlengine-boilerplate repository</a> I use this approach to export the estimator model to Cloud ML Engine, so it can easily be used for online predictions (<a href="https://github.com/Fematich/mlengine-boilerplate/blob/master/predictions/predict.py" rel="nofollow noreferrer">sample code for the predictions</a>). The essential part:</p>
<pre><code>import tensorflow as tf
from tensorflow.contrib.learn.python.learn.utils import input_fn_utils

def serving_input_fn():
    feature_placeholders = {
        'id': tf.placeholder(tf.string, [None], name="id_placeholder"),
        'feat': tf.placeholder(tf.float32, [None, FEAT_LEN], name="feat_placeholder"),
        # a label is not required since serving is only used for inference
    }
    return input_fn_utils.InputFnOps(
        feature_placeholders,  # features fed to the model
        None,                  # no labels at serving time
        feature_placeholders)  # default inputs: the raw request placeholders
</code></pre>
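<p>With a serving input function like the one above, each online-prediction instance is a plain JSON dictionary keyed by the placeholder names. A minimal sketch of building such a request body (the <code>FEAT_LEN</code> value and the instance values are assumptions for illustration; they must match what the model was exported with):</p>

```python
import json

FEAT_LEN = 3  # assumed here; must equal the FEAT_LEN used at export time

# One dict per instance, keys matching the placeholder names 'id' and 'feat'.
instances = [
    {"id": "example-1", "feat": [0.1, 0.2, 0.3]},
    {"id": "example-2", "feat": [1.0, 0.5, 0.0]},
]
request_body = json.dumps({"instances": instances})
print(request_body)
```

<p>This is the body you would POST to the model's predict endpoint; each dictionary is mapped onto the corresponding placeholders by name.</p>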
<h2>Export the model to accept Tensorflow Examples</h2>
<p><a href="https://github.com/MtDersvan/tf_playground/blob/master/wide_and_deep_tutorial/wide_and_deep_basic_serving.md" rel="nofollow noreferrer">This tutorial</a> shows how to use <code>export_savedmodel</code> to serve a wide-and-deep model implemented with estimators, and how to feed Tensorflow Examples to the exported model. The essential part:</p>
<pre><code>from tensorflow.contrib.learn.python.learn.utils import input_fn_utils

# feature_spec maps feature names to parsers (e.g. tf.FixedLenFeature),
# so the exported model can parse serialized tf.train.Example protos
serving_input_fn = input_fn_utils.build_parsing_serving_input_fn(feature_spec)
</code></pre>
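<p>When the model is exported with a parsing serving input function, each instance sent to online prediction must be a serialized <code>tf.train.Example</code> proto, base64-encoded and wrapped in a <code>{"b64": ...}</code> object. A minimal sketch of the request format (the byte string below is a placeholder standing in for a real <code>Example.SerializeToString()</code> result, which would require TensorFlow to build):</p>

```python
import base64
import json

# Stand-in for the bytes returned by tf.train.Example.SerializeToString()
serialized_example = b"placeholder-serialized-proto"

# Binary payloads in a JSON prediction request must be base64-encoded
# and wrapped in a {"b64": ...} object.
instance = {"b64": base64.b64encode(serialized_example).decode("utf-8")}
request_body = json.dumps({"instances": [instance]})
print(request_body)
```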