0 votes
1 view
in Machine Learning by (16k points)

I am trying to produce a CNN using Keras, and wrote the following code:

import keras
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Activation

batch_size = 64
epochs = 20
num_classes = 5

cnn_model = Sequential()
cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D((2, 2), padding='same'))
cnn_model.add(Conv2D(64, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Conv2D(128, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Flatten())
cnn_model.add(Dense(128, activation='linear'))
cnn_model.add(Activation('relu'))
cnn_model.add(Dense(num_classes, activation='softmax'))

cnn_model.compile(loss=keras.losses.categorical_crossentropy,
                  optimizer=keras.optimizers.Adam(), metrics=['accuracy'])

I want to use Keras's LeakyReLU activation layer instead of Activation('relu'). I tried passing LeakyReLU(alpha=0.1) in its place, but since LeakyReLU is a layer in Keras rather than an activation function, I get an error about using an activation layer where an activation function is expected.

How can I use LeakyReLU in this example?

1 Answer

0 votes
by (33.2k points)

All advanced activation functions in Keras, including LeakyReLU, are available as layers rather than as activations, so you should add them to the model directly as layers.

For example:

from keras.layers import LeakyReLU

# instead of cnn_model.add(Activation('relu'))
cnn_model.add(LeakyReLU(alpha=0.1))
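Applied to the model in the question, every Activation('relu') line simply becomes a LeakyReLU layer. A minimal sketch of the pattern, using tf.keras (which bundles the same layers) and a smaller input shape here purely to keep the example light:

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Conv2D, MaxPooling2D, Flatten,
                                     Dense, LeakyReLU)

num_classes = 5

cnn_model = Sequential()
cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(64, 64, 1), padding='same'))
cnn_model.add(LeakyReLU(alpha=0.1))   # replaces Activation('relu')
cnn_model.add(MaxPooling2D((2, 2), padding='same'))
cnn_model.add(Conv2D(64, (3, 3), activation='linear', padding='same'))
cnn_model.add(LeakyReLU(alpha=0.1))   # note: recent Keras renames alpha to negative_slope
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Flatten())
cnn_model.add(Dense(128, activation='linear'))
cnn_model.add(LeakyReLU(alpha=0.1))
cnn_model.add(Dense(num_classes, activation='softmax'))
cnn_model.compile(loss='categorical_crossentropy',
                  optimizer='adam', metrics=['accuracy'])
```

Keeping activation='linear' on the Conv2D and Dense layers matters: it leaves the layer outputs untouched so that the following LeakyReLU layer is the one applying the nonlinearity.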


Hope this answer helps you!

by (100 points)
I am able to import LeakyReLU, but when I use it as a layer as mentioned above:
model.add(LeakyReLU(alpha=0.1))
I get an error: LeakyReLU not defined.
I'm using TensorFlow 1.9 and Keras 2.25.
by (33.2k points)
You can use the activation argument as a function:
leaky = lambda x: tf.keras.layers.LeakyReLU(alpha=0.1)(x)
model.add(Conv2D(..., activation=leaky, ...))
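If wrapping the layer in a lambda feels awkward, TensorFlow also ships a plain function, tf.nn.leaky_relu, which can be passed directly as the activation argument. A small sketch of that alternative (the layer sizes and input shape here are illustrative, not from the original model):

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Flatten, Dense

# tf.nn.leaky_relu is an ordinary function, so it is valid as `activation=`
leaky = lambda x: tf.nn.leaky_relu(x, alpha=0.1)

model = Sequential()
model.add(Conv2D(32, (3, 3), activation=leaky,
                 input_shape=(28, 28, 1), padding='same'))
model.add(Flatten())
model.add(Dense(5, activation='softmax'))
```

One caveat with lambda activations: they can complicate saving and reloading the model, since the lambda must be available (or re-registered) at load time, so the layer-based approach from the answer above is generally the safer choice.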

