+1 vote
3 views
in AI and Deep Learning by (3.9k points)

What is global_step? Why do we pass 0 when setting up global_step?

  def training(loss, learning_rate):
      tf.summary.scalar('loss', loss)
      optimizer = tf.train.GradientDescentOptimizer(learning_rate)

      # Why 0 as the first parameter of the global_step tf.Variable?
      global_step = tf.Variable(0, name='global_step', trainable=False)

      train_op = optimizer.minimize(loss, global_step=global_step)

      return train_op

Does "incremented by one after the variables have been updated" mean that global_step becomes 1?

2 Answers

0 votes
by (39.8k points)

global_step is the total number of batches the graph has seen.

When you pass it to optimizer.minimize() through the global_step argument, the variable is incremented by one each time the training op runs.

To obtain the global_step variable, use tf.train.get_global_step or tf.train.get_or_create_global_step.

Cheers.

+4 votes
by (10.9k points)

global_step is defined as the number of batches that have been seen by the graph. Each time a batch is provided, the weights are updated in a direction that minimizes the loss. global_step tracks the number of batches seen so far, and when it is passed in the minimize() argument list, it is incremented by one after each update.

The global_step value can be obtained using tf.train.global_step().

In your example:

global_step = tf.Variable(0, name='global_step',trainable=False)

Here, 0 is the initial value of global_step: training starts counting from step 0, and the counter grows by one per training step.
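The counting behaviour itself doesn't need TensorFlow to see. Here is a minimal pure-Python sketch of the idea; MockOptimizer is an illustrative stand-in, not TensorFlow API:

```python
class MockOptimizer:
    """Illustrative stand-in for a TF1 optimizer: each minimize()
    call applies a weight update, then bumps the step counter."""

    def minimize(self, loss, global_step):
        # ... the weight update would happen here ...
        global_step[0] += 1  # incremented *after* the update


# The counter starts at 0, mirroring tf.Variable(0, name='global_step').
global_step = [0]
optimizer = MockOptimizer()

for batch in range(5):  # five training batches
    optimizer.minimize(loss=None, global_step=global_step)

print(global_step[0])  # after 5 batches, the step counter is 5
```

So after the first minimize() call the counter holds 1, which answers the question above: global_step does become 1 after the first update.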


