in Data Science by (17.6k points)

I'm new to ML and I'm trying to fit a model to do binary classification on images. I have a dataset of 550 images for each of the two classes, and I use 100 images from each class for validation. I use augmentation for my data and TensorBoard for visualizing accuracy and loss. My loss function is 'binary_crossentropy' and my optimizer is 'rmsprop'. I wrote my code below. The problem is that my accuracy stays between 49% and 52% for the first 3 epochs, rises to 95% by the 5th epoch, but falls back to 50% at the 8th, and this rise and fall keeps repeating in later epochs as well. I also provided some screenshots from TensorBoard up to the 8th epoch. I used this exact same code for the Kaggle cat-and-dog classification and it worked fine, with more than 86% accuracy. I think my problem comes from my data, because the images within each class are very different from one another, yet sometimes two images in the same class are so similar that I can't tell them apart (they are not the same, but they're very close). I'd be very grateful if anyone can tell me what I should do.

I tried adding and removing some layers, but it didn't work.

I tried changing the batch size; although I don't think it matters much, the same thing still happens.

I tried changing the input dimensions too. My images are actually 720*500, but I tried different input sizes here. It didn't work:

  from keras.models import Sequential
  from keras.layers import Conv2D, MaxPooling2D, Activation, Flatten, Dense, Dropout
  from keras.preprocessing.image import ImageDataGenerator
  from keras.callbacks import TensorBoard

  # TensorBoard callback for the accuracy/loss curves mentioned above
  tensorboard_cb = TensorBoard(log_dir='logs')

  model = Sequential()
  model.add(Conv2D(32, (3, 3), input_shape=(300, 300, 3)))
  model.add(Activation('relu'))
  model.add(MaxPooling2D(pool_size=(2, 2)))

  model.add(Conv2D(32, (3, 3)))
  model.add(Activation('relu'))
  model.add(MaxPooling2D(pool_size=(2, 2)))

  model.add(Conv2D(64, (3, 3)))
  model.add(Activation('relu'))
  model.add(MaxPooling2D(pool_size=(2, 2)))

  model.add(Flatten())
  model.add(Dense(64))
  model.add(Activation('relu'))
  model.add(Dropout(0.5))
  model.add(Dense(1))
  model.add(Activation('sigmoid'))

  model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])

  batch_size = 10

  train_datagen = ImageDataGenerator(
          rescale=1./255,
          shear_range=0.2,
          zoom_range=0.2,
          horizontal_flip=True)

  test_datagen = ImageDataGenerator(rescale=1./255)

  train_generator = train_datagen.flow_from_directory(
          'castData/train-set',
          target_size=(300, 300),  # must match the model's input_shape
          batch_size=batch_size,
          class_mode='binary')

  # the validation directory was left out of the original post;
  # 'castData/validation-set' is only a placeholder path
  validation_generator = test_datagen.flow_from_directory(
          'castData/validation-set',
          target_size=(300, 300),
          batch_size=batch_size,
          class_mode='binary')

  model.fit_generator(
          train_generator,
          steps_per_epoch=1075,
          epochs=50,
          validation_data=validation_generator,
          validation_steps=200,
          callbacks=[tensorboard_cb])

1 Answer

by (41.4k points)

Instead of the Flatten layer, GlobalAveragePooling2D or GlobalMaxPooling2D can be used. Global pooling collapses each feature map to a single value, so the Dense layer gets far fewer inputs than with Flatten, which can help a small dataset like this generalize instead of memorizing:

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Activation, Dense, Dropout, GlobalAveragePooling2D

model = Sequential()
model.add(Conv2D(32, (3, 3), input_shape=(300, 300, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(32, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(64, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(GlobalAveragePooling2D())  # replaces Flatten()
model.add(Dense(64))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(1))
model.add(Activation('sigmoid'))
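For completeness, here is a minimal sketch of how this modified model could be compiled and trained, reusing the train_generator, validation_generator, and tensorboard_cb from the question's code. The step counts below are only an assumption based on the numbers in the post (1100 training and 200 validation images at a batch size of 10), not values from the original code:

model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])

model.fit_generator(
        train_generator,
        steps_per_epoch=110,    # assumed: 1100 training images / batch size of 10
        epochs=50,
        validation_data=validation_generator,
        validation_steps=20,    # assumed: 200 validation images / batch size of 10
        callbacks=[tensorboard_cb])

With these step counts each epoch passes over the data exactly once, which makes the epoch-to-epoch accuracy curves in TensorBoard easier to compare.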
