
I am fairly new to TensorFlow and ML in general, so I apologize in advance for a (likely) trivial question.

I use the dropout technique to improve the generalization of my network, and it seems to work just fine. Then I would like to test the network on some data, like this:

    def Ask(self, image):
        # self.output is the network's output tensor
        return self.session.run(self.output, feed_dict={self.inputPh: image})

Obviously, it yields different results each time, as the dropout is still in place. One solution I can think of is to create two separate models - one for training and another for actual use of the network later - but such a solution seems impractical to me.

What's the common approach to solving this problem?

1 Answer


You can simply change the keep_prob parameter at run time by feeding it through a tf.placeholder_with_default:

For example:

prob = tf.placeholder_with_default(1.0, shape=())

layer = tf.nn.dropout(layer, keep_prob=prob)

# During training, override the default by feeding the placeholder:
sess.run(train_step, feed_dict={prob: 0.5})

During evaluation you simply don't feed the placeholder, so the default value of 1.0 is used and dropout is effectively disabled.
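To make the mechanics concrete without needing a TensorFlow session, here is a minimal framework-free sketch of the same idea in NumPy: a dropout helper whose keep_prob defaults to 1.0, so calling it without arguments (evaluation mode) is the identity, while feeding a smaller value (training mode) drops units and rescales the survivors. The dropout function here is illustrative, not TensorFlow's own implementation:

```python
import numpy as np

def dropout(x, keep_prob=1.0, rng=None):
    """Inverted dropout: keep each unit with probability keep_prob and
    rescale survivors by 1/keep_prob so expected activations are unchanged.
    With the default keep_prob=1.0 (evaluation mode) this is the identity."""
    if keep_prob >= 1.0:
        return x  # evaluation: nothing dropped
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) < keep_prob
    return x * mask / keep_prob

x = np.ones((4, 4))
train_out = dropout(x, keep_prob=0.5)  # entries are either 0.0 or 2.0
eval_out = dropout(x)                  # identity: all ones
```

The placeholder_with_default pattern in the answer plays exactly the role of the keep_prob default here: the graph behaves in evaluation mode unless the training loop explicitly feeds a different value.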

Hope this answer helps.

If you wish to learn more about Machine Learning visit this Machine Learning Tutorial.
