How to turn off dropout for testing in TensorFlow?

G. Mesch · Jul 7, 2017 · Viewed 24.4k times

I am fairly new to TensorFlow and ML in general, so I apologize in advance for a (likely) trivial question.

I use dropout to improve the generalization of my network, and it seems to work just fine. Now I would like to test the network on some data to see how well it performs, like this:

def Ask(self, image):
    return self.session.run(self.model, feed_dict={self.inputPh: image})

Obviously, it yields different results each time because dropout is still active. One solution I can think of is to create two separate models, one for training and one for actual later use of the network; however, that seems impractical to me.

What's the common approach to solving this problem?

Answer

nessuno · Jul 7, 2017

The easiest way is to make the keep_prob parameter a tf.placeholder_with_default:

# keep_prob defaults to 1.0 (no dropout) unless a value is fed at run time
prob = tf.placeholder_with_default(1.0, shape=())
layer = tf.nn.dropout(layer, prob)

This way, during training you can set the parameter like this:

sess.run(train_step, feed_dict={prob: 0.5})

and when you evaluate, the default value of 1.0 is used, so dropout is effectively turned off and the output is deterministic.
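
For reference, a minimal end-to-end sketch of this pattern might look like the following (assuming TensorFlow 1.x; the toy network and names such as input_ph, labels_ph, and keep_prob are just for illustration):

import numpy as np
import tensorflow as tf

input_ph = tf.placeholder(tf.float32, shape=(None, 784))
labels_ph = tf.placeholder(tf.float32, shape=(None, 10))

# Defaults to 1.0 (keep everything) unless explicitly fed.
keep_prob = tf.placeholder_with_default(1.0, shape=())

hidden = tf.layers.dense(input_ph, 128, activation=tf.nn.relu)
hidden = tf.nn.dropout(hidden, keep_prob)
logits = tf.layers.dense(hidden, 10)

loss = tf.losses.softmax_cross_entropy(labels_ph, logits)
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    x = np.random.rand(32, 784).astype(np.float32)
    y = np.eye(10)[np.random.randint(0, 10, size=32)].astype(np.float32)

    # Training: feed keep_prob explicitly so dropout is active.
    sess.run(train_step, feed_dict={input_ph: x, labels_ph: y, keep_prob: 0.5})

    # Evaluation: keep_prob falls back to its default of 1.0,
    # so dropout is off and repeated runs give identical results.
    predictions = sess.run(logits, feed_dict={input_ph: x})

Your Ask method can then stay exactly as it is: since it never feeds keep_prob, evaluation automatically runs with dropout disabled.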