How to export Estimator model with export_savedmodel function

Yuwen Yan · Mar 16, 2017 · Viewed 19.4k times

Are there any tutorials available about export_savedmodel?

I have gone through this article on tensorflow.org and the unit-test code on github.com, but I still have no idea how to construct the serving_input_fn parameter of the export_savedmodel function.

Answer

ursak · Jan 4, 2018

Do it like this:

# one entry per feature; each feature needs its own distinct key
your_feature_spec = {
    "some_fixed_len_feature": tf.FixedLenFeature([], dtype=tf.string, default_value=""),
    "some_var_len_feature": tf.VarLenFeature(dtype=tf.string),
}

def _serving_input_receiver_fn():
    # a 1-D batch of serialized tf.train.Example protos
    serialized_tf_example = tf.placeholder(dtype=tf.string, shape=[None],
                                           name='input_example_tensor')
    # the key (e.g. 'examples') must match the input key you use when
    # building the prediction request
    receiver_tensors = {'examples': serialized_tf_example}
    features = tf.parse_example(serialized_tf_example, your_feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

# returns the path of the timestamped export directory
export_path = estimator.export_savedmodel(export_dir, _serving_input_receiver_fn)
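
When the serving input is just parsed tf.train.Example protos, as here, TensorFlow 1.x also ships a helper that builds an equivalent receiver fn in one line (a minimal sketch; the receiver tensor key again defaults to 'examples'):

serving_input_receiver_fn = (
    tf.estimator.export.build_parsing_serving_input_receiver_fn(
        your_feature_spec))
export_path = estimator.export_savedmodel(export_dir, serving_input_receiver_fn)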

You can then send batched requests to the served model using the "predict" signature.
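
For a quick local check, here is a sketch assuming TensorFlow 1.x, where export_path is the timestamped directory returned by export_savedmodel and the feature names come from the (hypothetical) spec above; serialized examples are fed under the 'examples' key:

import tensorflow as tf

predict_fn = tf.contrib.predictor.from_saved_model(export_path)

# serialize one tf.train.Example per row; the list forms the batch
example = tf.train.Example(features=tf.train.Features(feature={
    "some_fixed_len_feature": tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[b"some value"])),
}))
predictions = predict_fn({"examples": [example.SerializeToString()]})
print(predictions)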

Source: https://www.tensorflow.org/guide/saved_model#prepare_serving_inputs