in AI and Deep Learning by (50.2k points)

Can I use the batch normalization layer right after the input layer and not normalize my data? May I expect to get a similar effect/performance?

In Keras functional it would be something like this:

inputs = Input(...)

x = BatchNormalization(...)(inputs)

1 Answer

by (108k points)

Normalizing the input of your network is a well-established technique for improving its convergence properties. Yes, you can use batch normalization right after the input layer. The advantage over a fixed pre-processing step is that, in addition to stabilizing activation distributions, batch normalization keeps adapting: its per-batch mean and standard deviation statistics track the data as the network learns.
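As a minimal sketch of the placement being asked about (assuming TensorFlow's Keras; the layer sizes and the 20-feature input shape are arbitrary illustration choices):

```python
# Sketch: BatchNormalization placed directly after the input layer,
# so raw, un-normalized features are standardized inside the model.
import tensorflow as tf
from tensorflow.keras import layers, Model

inputs = tf.keras.Input(shape=(20,))        # raw features, not pre-normalized
x = layers.BatchNormalization()(inputs)     # standardizes each mini-batch
x = layers.Dense(64, activation="relu")(x)
outputs = layers.Dense(1)(x)
model = Model(inputs, outputs)
```

At inference time the layer uses the running mean and variance accumulated during training, so the model carries its own input normalization with it.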

Effectively, placing batch normalization right after the input layer acts as a fancy data pre-processing step. It can help, sometimes a lot (e.g. in linear regression). But it is simpler and more efficient to compute the mean and variance of the whole training set once than to estimate them from each mini-batch.
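The trade-off can be seen in plain NumPy (a sketch with synthetic data; the array sizes and batch size of 32 are arbitrary assumptions): whole-dataset statistics are computed once and are stable, while per-batch statistics are re-estimated from a small sample each step and are therefore noisier.

```python
# Sketch: dataset-wide standardization vs. per-batch statistics.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=3.0, size=(1000, 4))  # synthetic features

# Pre-processing: compute mean/std ONCE over the whole training set.
mu, sigma = data.mean(axis=0), data.std(axis=0)
whole = (data - mu) / sigma

# Batch norm at the input: each mini-batch uses its OWN statistics.
batch = data[:32]
bn = (batch - batch.mean(axis=0)) / batch.std(axis=0)

# Both land near zero mean / unit variance, but a 32-sample batch
# estimates mu and sigma less accurately than the full 1000 samples,
# so the per-batch statistics fluctuate from batch to batch.
```

Batch normalization smooths this out in practice by tracking running averages of the statistics, which is what it falls back on at inference time.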
