What's the difference between tf.placeholder and tf.Variable?

J.Doe · Apr 18, 2016 · Viewed 93.7k times

I'm a newbie to TensorFlow, and I'm confused about the difference between tf.placeholder and tf.Variable. In my view, tf.placeholder is used for input data, and tf.Variable is used to store the state of the data. That is all I know.

Could someone explain their differences in more detail? In particular, when should I use tf.Variable and when should I use tf.placeholder?

Answer

Sung Kim · Apr 18, 2016

In short, you use tf.Variable for trainable variables such as the weights (W) and biases (B) of your model.

import math
import tensorflow as tf

# Trainable parameters of one hidden layer (IMAGE_PIXELS and hidden1_units
# are integers defined elsewhere in the tutorial).
weights = tf.Variable(
    tf.truncated_normal([IMAGE_PIXELS, hidden1_units],
                        stddev=1.0 / math.sqrt(float(IMAGE_PIXELS))),
    name='weights')
biases = tf.Variable(tf.zeros([hidden1_units]), name='biases')

tf.placeholder is used to feed actual training examples.

# One batch of flattened images and their integer labels.
images_placeholder = tf.placeholder(tf.float32, shape=(batch_size, IMAGE_PIXELS))
labels_placeholder = tf.placeholder(tf.int32, shape=(batch_size,))

This is how you feed the training examples during training:

for step in xrange(FLAGS.max_steps):
    # images_feed and labels_feed hold the next batch of training data
    # (in the tutorial they come from the MNIST data set).
    feed_dict = {
        images_placeholder: images_feed,
        labels_placeholder: labels_feed,
    }
    _, loss_value = sess.run([train_op, loss], feed_dict=feed_dict)

Your tf.Variables will be trained (modified) as a result of this training.

See more at https://www.tensorflow.org/versions/r0.7/tutorials/mnist/tf/index.html (the examples above are taken from that page).
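
To make the contrast concrete, here is a rough, self-contained sketch (not from the tutorial above, just a toy example) that fits a single weight: the placeholder x receives its data through feed_dict on every sess.run call, while the Variable w is initialized once and then updated by the optimizer. It assumes the TF 1.x graph-mode API; under TensorFlow 2 you would go through tf.compat.v1 and disable eager execution.

import tensorflow as tf

# Placeholder: gets its value from feed_dict at run time; holds no state.
x = tf.placeholder(tf.float32, shape=(None,))

# Variable: holds state in the graph; must be initialized and is updated by training.
w = tf.Variable(0.0, name='w')

# Toy objective: push w toward 2.0 given the fed data.
loss = tf.reduce_mean(tf.square(w * x - 2.0 * x))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())            # initializes Variables only
    for _ in range(100):
        sess.run(train_op, feed_dict={x: [1.0, 2.0, 3.0]})  # the placeholder must be fed
    print(sess.run(w))  # w has been modified by training and approaches 2.0

If you forget either step, the session complains in a way that reminds you which is which: an unfed placeholder raises "You must feed a value for placeholder tensor", and an uninitialized Variable raises "Attempting to use uninitialized value".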