+2 votes
in AI and Deep Learning

I read that regularization terms are implemented by manually adding an extra term to the loss value in TensorFlow neural-network code.

So my questions are:

  • Is there any other way to regularize besides doing it manually?
  • How do I use get_variable's regularizer argument? I observed that if we pass a regularizer to it (such as tf.contrib.layers.l2_regularizer), a tensor representing the regularization term is computed and added to a graph collection named tf.GraphKeys.REGULARIZATION_LOSSES. Will TensorFlow use that collection automatically, or do I have to use it manually?

2 Answers

0 votes

You need to manually add all the losses collected in the graph to your cost function, like this:

reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_constant = 0.02  # Choose an appropriate one.
loss = my_normal_loss + reg_constant * sum(reg_losses)
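If TensorFlow isn't to hand, you can check the arithmetic this snippet performs in plain NumPy. A minimal sketch with made-up weights and a made-up base loss (every value and name here is illustrative, not part of any TF API); the per-variable penalties imitate what `tf.contrib.layers.l2_regularizer` would put in the collection, i.e. `scale * sum(w**2) / 2`:

```python
import numpy as np

# Hypothetical per-variable L2 penalties, as an L2 regularizer with
# scale=0.1 would compute them: scale * sum(w**2) / 2 per weight tensor.
scale = 0.1
weights = [np.array([1.0, 2.0]), np.array([3.0])]
reg_losses = [scale * 0.5 * np.sum(w ** 2) for w in weights]  # [0.25, 0.45]

my_normal_loss = 4.0   # made-up data-fitting loss
reg_constant = 0.02    # choose an appropriate one
loss = my_normal_loss + reg_constant * sum(reg_losses)
```

This mirrors the pattern above: whatever is in the collection gets summed and added to the base loss, scaled by one global constant.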


Hope this helps!

+4 votes

The recommended way is to use the regularizer argument. You can set it either on your variable_scope (to regularize every variable created in that scope) or on individual get_variable calls. You can follow these steps:

1. Define a regularizer

regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)

2. Create variables

var = tf.get_variable(
    name="var",
    regularizer=regularizer,
)

3. Define your loss terms, then add the regularization term:

regvar = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
regterm = tf.contrib.layers.apply_regularization(regularizer, regvar)
loss = loss + regterm
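The three steps above amount to: define a regularizer function, apply it to every variable, and add the total penalty to the loss. A framework-free sketch of that flow, assuming NumPy stand-ins for the variables (the `l2_regularizer` below imitates TF's `scale * sum(w**2) / 2` via tf.nn.l2_loss; all names and values are illustrative):

```python
import numpy as np

def l2_regularizer(scale):
    """Mimic tf.contrib.layers.l2_regularizer: return a function that
    computes scale * sum(w**2) / 2 for one weight array."""
    def regularize(w):
        return scale * 0.5 * np.sum(np.square(w))
    return regularize

regularizer = l2_regularizer(scale=0.1)

# Stand-ins for the variables created via get_variable(..., regularizer=...).
variables = [np.array([[1.0, -1.0], [2.0, 0.0]]), np.array([0.5, 0.5])]

# Stand-in for apply_regularization: sum the penalty over all variables.
regterm = sum(regularizer(v) for v in variables)

loss = 1.0            # made-up value for the previously defined loss terms
loss = loss + regterm
```

The design point this illustrates: the regularizer is a plain function from a weight tensor to a scalar, so the framework only needs to remember which function to call on which variable and to sum the results.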

 Hope this answer helps.
