print('Created using Python', platform.python_version())
Created using Python 3.6.4
The challenge of recognizing artists given their paintings has been, for a long time, far beyond the capability of algorithms. Recent advances in deep learning, specifically the development of convolutional neural networks, have made that task possible. One of the advantages of these methods is that, in contrast to several methods employed by art specialists, they are not invasive and do not interfere with the painting.
I used Convolutional Neural Networks (ConvNets) to identify the artist of a given painting. The dataset contains a minimum of 400 paintings per artist from a set of 37 famous artists.
I trained a small ConvNet built from scratch, and also used transfer learning, fine-tuning the top layers of a deep pre-trained network (VGG16).
By image-recognition standards, the number of training examples in our dataset is small. Making predictions with high accuracy while avoiding overfitting therefore becomes a difficult task: building classifiers with the capability of current state-of-the-art models, such as those trained on ImageNet, would require millions of training examples.
The Keras class
keras.preprocessing.image.ImageDataGenerator
generates batches of image data with real-time data augmentation, and defines the configuration for both image data preparation and image data augmentation. Data augmentation is particularly useful in cases like the present one, where the number of images in the training set is not large and overfitting can become an issue.
To create an augmented image generator we follow two steps:
First, we create an instance, i.e. an augmented image generator (using the command below), where several arguments can be chosen. These arguments determine the alterations performed on the images during training:
datagen = ImageDataGenerator(arguments)
Then, to use datagen
to create new images, we call datagen.flow( )
or datagen.flow_from_directory( )
and pass the resulting generator to the model's fit_generator( )
method.
I will quickly explain some possible arguments of ImageDataGenerator
:
rotation_range
defines the amplitude of the random rotations applied to the images during training. Rotations aren't always useful: in the MNIST dataset, for example, all images have normalized orientation, so random rotations during training are not needed. In our present case it is not clear how useful rotations are, so I will choose a small value (instead of just setting it to zero).
width_shift_range
, height_shift_range
and shear_range
: the ranges of random shifts and random shears should be the same in our case, since the images were resized to have the same dimensions.
fill_mode
is set to nearest
, which means that pixels created by the transformations are filled in with the nearest existing ones.
horizontal_flip
: horizontal (and vertical) flips can be useful here, since many examples in our dataset have no clear definition of orientation (again, the MNIST dataset is an example where flipping is not useful).
featurewise_center
and featurewise_std_normalization
set the input mean to zero over the dataset and divide the inputs by the dataset's standard deviation, respectively.
One way to circumvent the small-dataset issue is 'Transfer Learning', where we use a pre-trained model, modify its final layers and apply it to our dataset. When the dataset is too small, these pre-trained models act as feature generators only (see discussion below). As will be illustrated later on, when the dataset in question has some reasonable size, one can drop some layers from the original model, stack a model on top of the network and perform some parameter fine-tuning.
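Before moving on, the augmentation arguments discussed above can be illustrated with a quick sketch. This is a minimal example using a random array in place of a real painting; the argument values are arbitrary:

```python
import numpy as np
from keras.preprocessing.image import ImageDataGenerator

rng = np.random.RandomState(0)
x = rng.rand(1, 64, 64, 3)  # one fake 64x64 RGB "painting"

datagen = ImageDataGenerator(rotation_range=15,
                             width_shift_range=0.2,
                             height_shift_range=0.2,
                             horizontal_flip=True)

# flow() yields batches endlessly; take a few augmented variants of x
batches = []
for i, batch in enumerate(datagen.flow(x, batch_size=1)):
    batches.append(batch)
    if i == 3:
        break

print([b.shape for b in batches])  # each variant keeps the input shape
```

Each yielded batch is a randomly rotated, shifted and possibly flipped copy of the input, so the model never sees exactly the same image twice.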
Before following this approach, I will, in the next section, build a small ConvNet "from scratch".
import os, math, cv2, platform, warnings
import tensorflow as tf
from keras.preprocessing.image import ImageDataGenerator, img_to_array, load_img
from keras.models import Sequential
from keras.preprocessing import image
from keras.layers import Dropout, Flatten, Dense
from keras import applications
from keras.utils.np_utils import to_categorical
from keras.applications.imagenet_utils import preprocess_input, decode_predictions
folder_train = './train_toy_3/'
folder_test = './test_toy_3/'
datagen = ImageDataGenerator(
    featurewise_center=True,
    featurewise_std_normalization=True,
    rotation_range=0.15,  # in degrees
    width_shift_range=0.2,
    height_shift_range=0.2,
    rescale=1./255,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest')
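Since featurewise_center and featurewise_std_normalization are enabled above, the generator also needs per-dataset statistics, which Keras computes with datagen.fit( ). A minimal sketch, using a random array as a stand-in for real training images:

```python
import numpy as np
from keras.preprocessing.image import ImageDataGenerator

datagen = ImageDataGenerator(featurewise_center=True,
                             featurewise_std_normalization=True)

# featurewise statistics must be computed from (a sample of) the
# training data before the generator can standardize anything
sample = np.random.rand(32, 120, 120, 3)  # stand-in for real paintings
datagen.fit(sample)

print(datagen.mean is not None, datagen.std is not None)
```

Without this fit step the generator would emit a warning and skip the featurewise standardization.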
from keras.preprocessing.image import ImageDataGenerator, img_to_array, load_img
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D
from keras.layers import Activation, Dropout, Flatten, Dense
from keras import backend as K
from keras.callbacks import EarlyStopping, Callback
K.image_data_format() # 'channels_last' for the TensorFlow backend: tensors are shaped (rows, cols, channels)
from keras import applications
from keras.utils.np_utils import to_categorical
import math, cv2
'channels_last'
I used the function preprocess( )
(see the notebook about data analysis in this repo) to resize the images. In the next cell I set the size again and vary it to see how it impacts accuracy.
img_width, img_height = 120,120
if K.image_data_format() == 'channels_first':
    input_shape = (3, img_width, img_height)
    print('Theano Backend')
else:
    input_shape = (img_width, img_height, 3)
    print('TensorFlow Backend')
input_shape
TensorFlow Backend
(120, 120, 3)
nb_train_samples = 0
for artist in os.listdir(os.path.abspath(folder_train)):
    nb_train_samples += len(os.listdir(os.path.join(os.path.abspath(folder_train), artist)))
nb_train_samples
nb_test_samples = 0
for artist in os.listdir(os.path.abspath(folder_test)):
    nb_test_samples += len(os.listdir(os.path.join(os.path.abspath(folder_test), artist)))
nb_test_samples
1141
359
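The per-class counting above can also be written as a small helper with os.walk. A self-contained sketch using a throwaway directory tree (the artist names and file names here are just placeholders):

```python
import os, tempfile

def count_images(root):
    """Count files in the per-artist subfolders of root."""
    return sum(len(files) for _, _, files in os.walk(root))

# tiny made-up directory tree standing in for train_toy_3/
root = tempfile.mkdtemp()
for artist, n in [('Pablo_Picasso', 3), ('Rembrandt', 2)]:
    d = os.path.join(root, artist)
    os.makedirs(d)
    for i in range(n):
        open(os.path.join(d, 'img_%d.jpg' % i), 'w').close()

print(count_images(root))  # → 5
```

The walk-based version also handles deeper nesting, although here the tree is only one level deep.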
validation_data
or validation_split
with the fit
method of Keras models, evaluation will be run at the end of every epoch (extracted from the docs).train_data_dir = os.path.abspath(folder_train) # folder containing training set already subdivided
validation_data_dir = os.path.abspath(folder_test) # folder containing test set already subdivided
nb_validation_samples = nb_test_samples
epochs = 100
batch_size = 16
num_classes = len(os.listdir(os.path.abspath(folder_train)))
print('The painters are', os.listdir(os.path.abspath(folder_train)))
The painters are ['Pablo_Picasso', 'Pierre-Auguste_Renoir', 'Rembrandt']
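With these settings in place, the directory iterator that feeds fit_generator( ) would be built roughly like this. The sketch below creates a throwaway directory tree with tiny random PNGs instead of real paintings, so that it is self-contained; target_size and class_mode follow the configuration above:

```python
import os, tempfile
import numpy as np
from PIL import Image
from keras.preprocessing.image import ImageDataGenerator

# throwaway directory tree standing in for train_toy_3/
root = tempfile.mkdtemp()
for artist in ['Pablo_Picasso', 'Rembrandt']:
    os.makedirs(os.path.join(root, artist))
    for i in range(2):
        img = Image.fromarray(np.uint8(np.random.rand(120, 120, 3) * 255))
        img.save(os.path.join(root, artist, '%d.png' % i))

datagen = ImageDataGenerator(rescale=1./255)
generator = datagen.flow_from_directory(root,
                                        target_size=(120, 120),
                                        batch_size=2,
                                        class_mode='categorical')

x_batch, y_batch = next(generator)
print(x_batch.shape, y_batch.shape)  # image batch and one-hot labels
```

flow_from_directory infers one class per subfolder, which is exactly the layout of the train and test folders used here.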
Training stops when 10 consecutive epochs show no improvement in validation loss.
# adapted from rdcolema
class EarlyStoppingByLossVal(Callback):
    """Custom class to set a val loss target for early stopping"""
    def __init__(self, monitor='val_loss', value=0.45, verbose=0):
        super(EarlyStoppingByLossVal, self).__init__()
        self.monitor = monitor
        self.value = value
        self.verbose = verbose

    def on_epoch_end(self, epoch, logs={}):
        current = logs.get(self.monitor)
        if current is None:
            warnings.warn("Early stopping requires %s available!" % self.monitor, RuntimeWarning)
            return  # nothing to compare against
        if current < self.value:
            if self.verbose > 0:
                print("Epoch %05d: early stopping THR" % epoch)
            self.model.stop_training = True

early_stopping = EarlyStopping(monitor='val_loss', patience=10, mode='auto')
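To see how such callbacks plug into training, here is a minimal sketch on a tiny throwaway model with random data (all shapes and values are made up; in the real pipeline the callbacks list would also include the custom EarlyStoppingByLossVal):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import EarlyStopping

# random stand-in data: 64 samples, 10 features, binary labels
x = np.random.rand(64, 10)
y = np.random.randint(0, 2, size=(64, 1))

model = Sequential([Dense(4, activation='relu', input_shape=(10,)),
                    Dense(1, activation='sigmoid')])
model.compile(optimizer='adam', loss='binary_crossentropy')

early_stopping = EarlyStopping(monitor='val_loss', patience=2)
history = model.fit(x, y, epochs=50, validation_split=0.25,
                    callbacks=[early_stopping], verbose=0)

print('stopped after', len(history.history['loss']), 'epochs')
```

The patience argument means training continues until val_loss fails to improve for that many consecutive epochs, so on noisy random data the run typically ends well before the epoch limit.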
top_model_weights_path = 'bottleneck_fc_model.h5'
We now create the InceptionV3 model without the final fully-connected layers (setting include_top=False
) and load the ImageNet weights (setting weights='imagenet'
)
from keras.applications.inception_v3 import InceptionV3
model = applications.InceptionV3(include_top=False, weights='imagenet')
model.summary()
(model summary: a long stack of Conv2D / BatchNormalization / Activation layers grouped into the inception blocks mixed0, mixed1, ..., each concatenating several parallel convolution and pooling branches; the full layer listing is omitted here for brevity)
(None, None, None, 1 179200 activation_149[0][0] __________________________________________________________________________________________________ batch_normalization_150 (BatchN (None, None, None, 1 480 conv2d_150[0][0] __________________________________________________________________________________________________ activation_150 (Activation) (None, None, None, 1 0 batch_normalization_150[0][0] __________________________________________________________________________________________________ conv2d_146 (Conv2D) (None, None, None, 1 122880 mixed5[0][0] __________________________________________________________________________________________________ conv2d_151 (Conv2D) (None, None, None, 1 179200 activation_150[0][0] __________________________________________________________________________________________________ batch_normalization_146 (BatchN (None, None, None, 1 480 conv2d_146[0][0] __________________________________________________________________________________________________ batch_normalization_151 (BatchN (None, None, None, 1 480 conv2d_151[0][0] __________________________________________________________________________________________________ activation_146 (Activation) (None, None, None, 1 0 batch_normalization_146[0][0] __________________________________________________________________________________________________ activation_151 (Activation) (None, None, None, 1 0 batch_normalization_151[0][0] __________________________________________________________________________________________________ conv2d_147 (Conv2D) (None, None, None, 1 179200 activation_146[0][0] __________________________________________________________________________________________________ conv2d_152 (Conv2D) (None, None, None, 1 179200 activation_151[0][0] __________________________________________________________________________________________________ batch_normalization_147 (BatchN (None, None, None, 1 480 conv2d_147[0][0] 
__________________________________________________________________________________________________ batch_normalization_152 (BatchN (None, None, None, 1 480 conv2d_152[0][0] __________________________________________________________________________________________________ activation_147 (Activation) (None, None, None, 1 0 batch_normalization_147[0][0] __________________________________________________________________________________________________ activation_152 (Activation) (None, None, None, 1 0 batch_normalization_152[0][0] __________________________________________________________________________________________________ average_pooling2d_15 (AveragePo (None, None, None, 7 0 mixed5[0][0] __________________________________________________________________________________________________ conv2d_145 (Conv2D) (None, None, None, 1 147456 mixed5[0][0] __________________________________________________________________________________________________ conv2d_148 (Conv2D) (None, None, None, 1 215040 activation_147[0][0] __________________________________________________________________________________________________ conv2d_153 (Conv2D) (None, None, None, 1 215040 activation_152[0][0] __________________________________________________________________________________________________ conv2d_154 (Conv2D) (None, None, None, 1 147456 average_pooling2d_15[0][0] __________________________________________________________________________________________________ batch_normalization_145 (BatchN (None, None, None, 1 576 conv2d_145[0][0] __________________________________________________________________________________________________ batch_normalization_148 (BatchN (None, None, None, 1 576 conv2d_148[0][0] __________________________________________________________________________________________________ batch_normalization_153 (BatchN (None, None, None, 1 576 conv2d_153[0][0] __________________________________________________________________________________________________ 
batch_normalization_154 (BatchN (None, None, None, 1 576 conv2d_154[0][0] __________________________________________________________________________________________________ activation_145 (Activation) (None, None, None, 1 0 batch_normalization_145[0][0] __________________________________________________________________________________________________ activation_148 (Activation) (None, None, None, 1 0 batch_normalization_148[0][0] __________________________________________________________________________________________________ activation_153 (Activation) (None, None, None, 1 0 batch_normalization_153[0][0] __________________________________________________________________________________________________ activation_154 (Activation) (None, None, None, 1 0 batch_normalization_154[0][0] __________________________________________________________________________________________________ mixed6 (Concatenate) (None, None, None, 7 0 activation_145[0][0] activation_148[0][0] activation_153[0][0] activation_154[0][0] __________________________________________________________________________________________________ conv2d_159 (Conv2D) (None, None, None, 1 147456 mixed6[0][0] __________________________________________________________________________________________________ batch_normalization_159 (BatchN (None, None, None, 1 576 conv2d_159[0][0] __________________________________________________________________________________________________ activation_159 (Activation) (None, None, None, 1 0 batch_normalization_159[0][0] __________________________________________________________________________________________________ conv2d_160 (Conv2D) (None, None, None, 1 258048 activation_159[0][0] __________________________________________________________________________________________________ batch_normalization_160 (BatchN (None, None, None, 1 576 conv2d_160[0][0] __________________________________________________________________________________________________ activation_160 
(Activation) (None, None, None, 1 0 batch_normalization_160[0][0] __________________________________________________________________________________________________ conv2d_156 (Conv2D) (None, None, None, 1 147456 mixed6[0][0] __________________________________________________________________________________________________ conv2d_161 (Conv2D) (None, None, None, 1 258048 activation_160[0][0] __________________________________________________________________________________________________ batch_normalization_156 (BatchN (None, None, None, 1 576 conv2d_156[0][0] __________________________________________________________________________________________________ batch_normalization_161 (BatchN (None, None, None, 1 576 conv2d_161[0][0] __________________________________________________________________________________________________ activation_156 (Activation) (None, None, None, 1 0 batch_normalization_156[0][0] __________________________________________________________________________________________________ activation_161 (Activation) (None, None, None, 1 0 batch_normalization_161[0][0] __________________________________________________________________________________________________ conv2d_157 (Conv2D) (None, None, None, 1 258048 activation_156[0][0] __________________________________________________________________________________________________ conv2d_162 (Conv2D) (None, None, None, 1 258048 activation_161[0][0] __________________________________________________________________________________________________ batch_normalization_157 (BatchN (None, None, None, 1 576 conv2d_157[0][0] __________________________________________________________________________________________________ batch_normalization_162 (BatchN (None, None, None, 1 576 conv2d_162[0][0] __________________________________________________________________________________________________ activation_157 (Activation) (None, None, None, 1 0 batch_normalization_157[0][0] 
__________________________________________________________________________________________________ activation_162 (Activation) (None, None, None, 1 0 batch_normalization_162[0][0] __________________________________________________________________________________________________ average_pooling2d_16 (AveragePo (None, None, None, 7 0 mixed6[0][0] __________________________________________________________________________________________________ conv2d_155 (Conv2D) (None, None, None, 1 147456 mixed6[0][0] __________________________________________________________________________________________________ conv2d_158 (Conv2D) (None, None, None, 1 258048 activation_157[0][0] __________________________________________________________________________________________________ conv2d_163 (Conv2D) (None, None, None, 1 258048 activation_162[0][0] __________________________________________________________________________________________________ conv2d_164 (Conv2D) (None, None, None, 1 147456 average_pooling2d_16[0][0] __________________________________________________________________________________________________ batch_normalization_155 (BatchN (None, None, None, 1 576 conv2d_155[0][0] __________________________________________________________________________________________________ batch_normalization_158 (BatchN (None, None, None, 1 576 conv2d_158[0][0] __________________________________________________________________________________________________ batch_normalization_163 (BatchN (None, None, None, 1 576 conv2d_163[0][0] __________________________________________________________________________________________________ batch_normalization_164 (BatchN (None, None, None, 1 576 conv2d_164[0][0] __________________________________________________________________________________________________ activation_155 (Activation) (None, None, None, 1 0 batch_normalization_155[0][0] __________________________________________________________________________________________________ 
activation_158 (Activation) (None, None, None, 1 0 batch_normalization_158[0][0] __________________________________________________________________________________________________ activation_163 (Activation) (None, None, None, 1 0 batch_normalization_163[0][0] __________________________________________________________________________________________________ activation_164 (Activation) (None, None, None, 1 0 batch_normalization_164[0][0] __________________________________________________________________________________________________ mixed7 (Concatenate) (None, None, None, 7 0 activation_155[0][0] activation_158[0][0] activation_163[0][0] activation_164[0][0] __________________________________________________________________________________________________ conv2d_167 (Conv2D) (None, None, None, 1 147456 mixed7[0][0] __________________________________________________________________________________________________ batch_normalization_167 (BatchN (None, None, None, 1 576 conv2d_167[0][0] __________________________________________________________________________________________________ activation_167 (Activation) (None, None, None, 1 0 batch_normalization_167[0][0] __________________________________________________________________________________________________ conv2d_168 (Conv2D) (None, None, None, 1 258048 activation_167[0][0] __________________________________________________________________________________________________ batch_normalization_168 (BatchN (None, None, None, 1 576 conv2d_168[0][0] __________________________________________________________________________________________________ activation_168 (Activation) (None, None, None, 1 0 batch_normalization_168[0][0] __________________________________________________________________________________________________ conv2d_165 (Conv2D) (None, None, None, 1 147456 mixed7[0][0] __________________________________________________________________________________________________ conv2d_169 (Conv2D) (None, None, None, 
1 258048 activation_168[0][0] __________________________________________________________________________________________________ batch_normalization_165 (BatchN (None, None, None, 1 576 conv2d_165[0][0] __________________________________________________________________________________________________ batch_normalization_169 (BatchN (None, None, None, 1 576 conv2d_169[0][0] __________________________________________________________________________________________________ activation_165 (Activation) (None, None, None, 1 0 batch_normalization_165[0][0] __________________________________________________________________________________________________ activation_169 (Activation) (None, None, None, 1 0 batch_normalization_169[0][0] __________________________________________________________________________________________________ conv2d_166 (Conv2D) (None, None, None, 3 552960 activation_165[0][0] __________________________________________________________________________________________________ conv2d_170 (Conv2D) (None, None, None, 1 331776 activation_169[0][0] __________________________________________________________________________________________________ batch_normalization_166 (BatchN (None, None, None, 3 960 conv2d_166[0][0] __________________________________________________________________________________________________ batch_normalization_170 (BatchN (None, None, None, 1 576 conv2d_170[0][0] __________________________________________________________________________________________________ activation_166 (Activation) (None, None, None, 3 0 batch_normalization_166[0][0] __________________________________________________________________________________________________ activation_170 (Activation) (None, None, None, 1 0 batch_normalization_170[0][0] __________________________________________________________________________________________________ max_pooling2d_8 (MaxPooling2D) (None, None, None, 7 0 mixed7[0][0] 
__________________________________________________________________________________________________ mixed8 (Concatenate) (None, None, None, 1 0 activation_166[0][0] activation_170[0][0] max_pooling2d_8[0][0] __________________________________________________________________________________________________ conv2d_175 (Conv2D) (None, None, None, 4 573440 mixed8[0][0] __________________________________________________________________________________________________ batch_normalization_175 (BatchN (None, None, None, 4 1344 conv2d_175[0][0] __________________________________________________________________________________________________ activation_175 (Activation) (None, None, None, 4 0 batch_normalization_175[0][0] __________________________________________________________________________________________________ conv2d_172 (Conv2D) (None, None, None, 3 491520 mixed8[0][0] __________________________________________________________________________________________________ conv2d_176 (Conv2D) (None, None, None, 3 1548288 activation_175[0][0] __________________________________________________________________________________________________ batch_normalization_172 (BatchN (None, None, None, 3 1152 conv2d_172[0][0] __________________________________________________________________________________________________ batch_normalization_176 (BatchN (None, None, None, 3 1152 conv2d_176[0][0] __________________________________________________________________________________________________ activation_172 (Activation) (None, None, None, 3 0 batch_normalization_172[0][0] __________________________________________________________________________________________________ activation_176 (Activation) (None, None, None, 3 0 batch_normalization_176[0][0] __________________________________________________________________________________________________ conv2d_173 (Conv2D) (None, None, None, 3 442368 activation_172[0][0] 
__________________________________________________________________________________________________ conv2d_174 (Conv2D) (None, None, None, 3 442368 activation_172[0][0] __________________________________________________________________________________________________ conv2d_177 (Conv2D) (None, None, None, 3 442368 activation_176[0][0] __________________________________________________________________________________________________ conv2d_178 (Conv2D) (None, None, None, 3 442368 activation_176[0][0] __________________________________________________________________________________________________ average_pooling2d_17 (AveragePo (None, None, None, 1 0 mixed8[0][0] __________________________________________________________________________________________________ conv2d_171 (Conv2D) (None, None, None, 3 409600 mixed8[0][0] __________________________________________________________________________________________________ batch_normalization_173 (BatchN (None, None, None, 3 1152 conv2d_173[0][0] __________________________________________________________________________________________________ batch_normalization_174 (BatchN (None, None, None, 3 1152 conv2d_174[0][0] __________________________________________________________________________________________________ batch_normalization_177 (BatchN (None, None, None, 3 1152 conv2d_177[0][0] __________________________________________________________________________________________________ batch_normalization_178 (BatchN (None, None, None, 3 1152 conv2d_178[0][0] __________________________________________________________________________________________________ conv2d_179 (Conv2D) (None, None, None, 1 245760 average_pooling2d_17[0][0] __________________________________________________________________________________________________ batch_normalization_171 (BatchN (None, None, None, 3 960 conv2d_171[0][0] __________________________________________________________________________________________________ activation_173 
(Activation) (None, None, None, 3 0 batch_normalization_173[0][0] __________________________________________________________________________________________________ activation_174 (Activation) (None, None, None, 3 0 batch_normalization_174[0][0] __________________________________________________________________________________________________ activation_177 (Activation) (None, None, None, 3 0 batch_normalization_177[0][0] __________________________________________________________________________________________________ activation_178 (Activation) (None, None, None, 3 0 batch_normalization_178[0][0] __________________________________________________________________________________________________ batch_normalization_179 (BatchN (None, None, None, 1 576 conv2d_179[0][0] __________________________________________________________________________________________________ activation_171 (Activation) (None, None, None, 3 0 batch_normalization_171[0][0] __________________________________________________________________________________________________ mixed9_0 (Concatenate) (None, None, None, 7 0 activation_173[0][0] activation_174[0][0] __________________________________________________________________________________________________ concatenate_3 (Concatenate) (None, None, None, 7 0 activation_177[0][0] activation_178[0][0] __________________________________________________________________________________________________ activation_179 (Activation) (None, None, None, 1 0 batch_normalization_179[0][0] __________________________________________________________________________________________________ mixed9 (Concatenate) (None, None, None, 2 0 activation_171[0][0] mixed9_0[0][0] concatenate_3[0][0] activation_179[0][0] __________________________________________________________________________________________________ conv2d_184 (Conv2D) (None, None, None, 4 917504 mixed9[0][0] __________________________________________________________________________________________________ 
batch_normalization_184 (BatchN (None, None, None, 4 1344 conv2d_184[0][0] __________________________________________________________________________________________________ activation_184 (Activation) (None, None, None, 4 0 batch_normalization_184[0][0] __________________________________________________________________________________________________ conv2d_181 (Conv2D) (None, None, None, 3 786432 mixed9[0][0] __________________________________________________________________________________________________ conv2d_185 (Conv2D) (None, None, None, 3 1548288 activation_184[0][0] __________________________________________________________________________________________________ batch_normalization_181 (BatchN (None, None, None, 3 1152 conv2d_181[0][0] __________________________________________________________________________________________________ batch_normalization_185 (BatchN (None, None, None, 3 1152 conv2d_185[0][0] __________________________________________________________________________________________________ activation_181 (Activation) (None, None, None, 3 0 batch_normalization_181[0][0] __________________________________________________________________________________________________ activation_185 (Activation) (None, None, None, 3 0 batch_normalization_185[0][0] __________________________________________________________________________________________________ conv2d_182 (Conv2D) (None, None, None, 3 442368 activation_181[0][0] __________________________________________________________________________________________________ conv2d_183 (Conv2D) (None, None, None, 3 442368 activation_181[0][0] __________________________________________________________________________________________________ conv2d_186 (Conv2D) (None, None, None, 3 442368 activation_185[0][0] __________________________________________________________________________________________________ conv2d_187 (Conv2D) (None, None, None, 3 442368 activation_185[0][0] 
__________________________________________________________________________________________________ average_pooling2d_18 (AveragePo (None, None, None, 2 0 mixed9[0][0] __________________________________________________________________________________________________ conv2d_180 (Conv2D) (None, None, None, 3 655360 mixed9[0][0] __________________________________________________________________________________________________ batch_normalization_182 (BatchN (None, None, None, 3 1152 conv2d_182[0][0] __________________________________________________________________________________________________ batch_normalization_183 (BatchN (None, None, None, 3 1152 conv2d_183[0][0] __________________________________________________________________________________________________ batch_normalization_186 (BatchN (None, None, None, 3 1152 conv2d_186[0][0] __________________________________________________________________________________________________ batch_normalization_187 (BatchN (None, None, None, 3 1152 conv2d_187[0][0] __________________________________________________________________________________________________ conv2d_188 (Conv2D) (None, None, None, 1 393216 average_pooling2d_18[0][0] __________________________________________________________________________________________________ batch_normalization_180 (BatchN (None, None, None, 3 960 conv2d_180[0][0] __________________________________________________________________________________________________ activation_182 (Activation) (None, None, None, 3 0 batch_normalization_182[0][0] __________________________________________________________________________________________________ activation_183 (Activation) (None, None, None, 3 0 batch_normalization_183[0][0] __________________________________________________________________________________________________ activation_186 (Activation) (None, None, None, 3 0 batch_normalization_186[0][0] 
__________________________________________________________________________________________________ activation_187 (Activation) (None, None, None, 3 0 batch_normalization_187[0][0] __________________________________________________________________________________________________ batch_normalization_188 (BatchN (None, None, None, 1 576 conv2d_188[0][0] __________________________________________________________________________________________________ activation_180 (Activation) (None, None, None, 3 0 batch_normalization_180[0][0] __________________________________________________________________________________________________ mixed9_1 (Concatenate) (None, None, None, 7 0 activation_182[0][0] activation_183[0][0] __________________________________________________________________________________________________ concatenate_4 (Concatenate) (None, None, None, 7 0 activation_186[0][0] activation_187[0][0] __________________________________________________________________________________________________ activation_188 (Activation) (None, None, None, 1 0 batch_normalization_188[0][0] __________________________________________________________________________________________________ mixed10 (Concatenate) (None, None, None, 2 0 activation_180[0][0] mixed9_1[0][0] concatenate_4[0][0] activation_188[0][0] ================================================================================================== Total params: 21,802,784 Trainable params: 21,768,352 Non-trainable params: 34,432 __________________________________________________________________________________________________
# summary() prints the architecture and returns None, so wrapping it in type() is unnecessary
applications.InceptionV3(include_top=False, weights='imagenet').summary()
__________________________________________________________________________________________________
Layer (type)                    Output Shape          Param #    Connected to
==================================================================================================
input_3 (InputLayer)            (None, None, None, 3) 0
... (remaining output elided; this call prints the same InceptionV3 architecture summarized above) ...
(None, None, None, 4 12288 mixed0[0][0] __________________________________________________________________________________________________ conv2d_205 (Conv2D) (None, None, None, 9 55296 activation_204[0][0] __________________________________________________________________________________________________ batch_normalization_202 (BatchN (None, None, None, 4 144 conv2d_202[0][0] __________________________________________________________________________________________________ batch_normalization_205 (BatchN (None, None, None, 9 288 conv2d_205[0][0] __________________________________________________________________________________________________ activation_202 (Activation) (None, None, None, 4 0 batch_normalization_202[0][0] __________________________________________________________________________________________________ activation_205 (Activation) (None, None, None, 9 0 batch_normalization_205[0][0] __________________________________________________________________________________________________ average_pooling2d_20 (AveragePo (None, None, None, 2 0 mixed0[0][0] __________________________________________________________________________________________________ conv2d_201 (Conv2D) (None, None, None, 6 16384 mixed0[0][0] __________________________________________________________________________________________________ conv2d_203 (Conv2D) (None, None, None, 6 76800 activation_202[0][0] __________________________________________________________________________________________________ conv2d_206 (Conv2D) (None, None, None, 9 82944 activation_205[0][0] __________________________________________________________________________________________________ conv2d_207 (Conv2D) (None, None, None, 6 16384 average_pooling2d_20[0][0] __________________________________________________________________________________________________ batch_normalization_201 (BatchN (None, None, None, 6 192 conv2d_201[0][0] 
__________________________________________________________________________________________________ batch_normalization_203 (BatchN (None, None, None, 6 192 conv2d_203[0][0] __________________________________________________________________________________________________ batch_normalization_206 (BatchN (None, None, None, 9 288 conv2d_206[0][0] __________________________________________________________________________________________________ batch_normalization_207 (BatchN (None, None, None, 6 192 conv2d_207[0][0] __________________________________________________________________________________________________ activation_201 (Activation) (None, None, None, 6 0 batch_normalization_201[0][0] __________________________________________________________________________________________________ activation_203 (Activation) (None, None, None, 6 0 batch_normalization_203[0][0] __________________________________________________________________________________________________ activation_206 (Activation) (None, None, None, 9 0 batch_normalization_206[0][0] __________________________________________________________________________________________________ activation_207 (Activation) (None, None, None, 6 0 batch_normalization_207[0][0] __________________________________________________________________________________________________ mixed1 (Concatenate) (None, None, None, 2 0 activation_201[0][0] activation_203[0][0] activation_206[0][0] activation_207[0][0] __________________________________________________________________________________________________ conv2d_211 (Conv2D) (None, None, None, 6 18432 mixed1[0][0] __________________________________________________________________________________________________ batch_normalization_211 (BatchN (None, None, None, 6 192 conv2d_211[0][0] __________________________________________________________________________________________________ activation_211 (Activation) (None, None, None, 6 0 batch_normalization_211[0][0] 
__________________________________________________________________________________________________ conv2d_209 (Conv2D) (None, None, None, 4 13824 mixed1[0][0] __________________________________________________________________________________________________ conv2d_212 (Conv2D) (None, None, None, 9 55296 activation_211[0][0] __________________________________________________________________________________________________ batch_normalization_209 (BatchN (None, None, None, 4 144 conv2d_209[0][0] __________________________________________________________________________________________________ batch_normalization_212 (BatchN (None, None, None, 9 288 conv2d_212[0][0] __________________________________________________________________________________________________ activation_209 (Activation) (None, None, None, 4 0 batch_normalization_209[0][0] __________________________________________________________________________________________________ activation_212 (Activation) (None, None, None, 9 0 batch_normalization_212[0][0] __________________________________________________________________________________________________ average_pooling2d_21 (AveragePo (None, None, None, 2 0 mixed1[0][0] __________________________________________________________________________________________________ conv2d_208 (Conv2D) (None, None, None, 6 18432 mixed1[0][0] __________________________________________________________________________________________________ conv2d_210 (Conv2D) (None, None, None, 6 76800 activation_209[0][0] __________________________________________________________________________________________________ conv2d_213 (Conv2D) (None, None, None, 9 82944 activation_212[0][0] __________________________________________________________________________________________________ conv2d_214 (Conv2D) (None, None, None, 6 18432 average_pooling2d_21[0][0] __________________________________________________________________________________________________ batch_normalization_208 (BatchN 
(None, None, None, 6 192 conv2d_208[0][0] __________________________________________________________________________________________________ batch_normalization_210 (BatchN (None, None, None, 6 192 conv2d_210[0][0] __________________________________________________________________________________________________ batch_normalization_213 (BatchN (None, None, None, 9 288 conv2d_213[0][0] __________________________________________________________________________________________________ batch_normalization_214 (BatchN (None, None, None, 6 192 conv2d_214[0][0] __________________________________________________________________________________________________ activation_208 (Activation) (None, None, None, 6 0 batch_normalization_208[0][0] __________________________________________________________________________________________________ activation_210 (Activation) (None, None, None, 6 0 batch_normalization_210[0][0] __________________________________________________________________________________________________ activation_213 (Activation) (None, None, None, 9 0 batch_normalization_213[0][0] __________________________________________________________________________________________________ activation_214 (Activation) (None, None, None, 6 0 batch_normalization_214[0][0] __________________________________________________________________________________________________ mixed2 (Concatenate) (None, None, None, 2 0 activation_208[0][0] activation_210[0][0] activation_213[0][0] activation_214[0][0] __________________________________________________________________________________________________ conv2d_216 (Conv2D) (None, None, None, 6 18432 mixed2[0][0] __________________________________________________________________________________________________ batch_normalization_216 (BatchN (None, None, None, 6 192 conv2d_216[0][0] __________________________________________________________________________________________________ activation_216 (Activation) (None, None, None, 6 0 
batch_normalization_216[0][0] __________________________________________________________________________________________________ conv2d_217 (Conv2D) (None, None, None, 9 55296 activation_216[0][0] __________________________________________________________________________________________________ batch_normalization_217 (BatchN (None, None, None, 9 288 conv2d_217[0][0] __________________________________________________________________________________________________ activation_217 (Activation) (None, None, None, 9 0 batch_normalization_217[0][0] __________________________________________________________________________________________________ conv2d_215 (Conv2D) (None, None, None, 3 995328 mixed2[0][0] __________________________________________________________________________________________________ conv2d_218 (Conv2D) (None, None, None, 9 82944 activation_217[0][0] __________________________________________________________________________________________________ batch_normalization_215 (BatchN (None, None, None, 3 1152 conv2d_215[0][0] __________________________________________________________________________________________________ batch_normalization_218 (BatchN (None, None, None, 9 288 conv2d_218[0][0] __________________________________________________________________________________________________ activation_215 (Activation) (None, None, None, 3 0 batch_normalization_215[0][0] __________________________________________________________________________________________________ activation_218 (Activation) (None, None, None, 9 0 batch_normalization_218[0][0] __________________________________________________________________________________________________ max_pooling2d_11 (MaxPooling2D) (None, None, None, 2 0 mixed2[0][0] __________________________________________________________________________________________________ mixed3 (Concatenate) (None, None, None, 7 0 activation_215[0][0] activation_218[0][0] max_pooling2d_11[0][0] 
__________________________________________________________________________________________________ conv2d_223 (Conv2D) (None, None, None, 1 98304 mixed3[0][0] __________________________________________________________________________________________________ batch_normalization_223 (BatchN (None, None, None, 1 384 conv2d_223[0][0] __________________________________________________________________________________________________ activation_223 (Activation) (None, None, None, 1 0 batch_normalization_223[0][0] __________________________________________________________________________________________________ conv2d_224 (Conv2D) (None, None, None, 1 114688 activation_223[0][0] __________________________________________________________________________________________________ batch_normalization_224 (BatchN (None, None, None, 1 384 conv2d_224[0][0] __________________________________________________________________________________________________ activation_224 (Activation) (None, None, None, 1 0 batch_normalization_224[0][0] __________________________________________________________________________________________________ conv2d_220 (Conv2D) (None, None, None, 1 98304 mixed3[0][0] __________________________________________________________________________________________________ conv2d_225 (Conv2D) (None, None, None, 1 114688 activation_224[0][0] __________________________________________________________________________________________________ batch_normalization_220 (BatchN (None, None, None, 1 384 conv2d_220[0][0] __________________________________________________________________________________________________ batch_normalization_225 (BatchN (None, None, None, 1 384 conv2d_225[0][0] __________________________________________________________________________________________________ activation_220 (Activation) (None, None, None, 1 0 batch_normalization_220[0][0] __________________________________________________________________________________________________ activation_225 
(Activation) (None, None, None, 1 0 batch_normalization_225[0][0] __________________________________________________________________________________________________ conv2d_221 (Conv2D) (None, None, None, 1 114688 activation_220[0][0] __________________________________________________________________________________________________ conv2d_226 (Conv2D) (None, None, None, 1 114688 activation_225[0][0] __________________________________________________________________________________________________ batch_normalization_221 (BatchN (None, None, None, 1 384 conv2d_221[0][0] __________________________________________________________________________________________________ batch_normalization_226 (BatchN (None, None, None, 1 384 conv2d_226[0][0] __________________________________________________________________________________________________ activation_221 (Activation) (None, None, None, 1 0 batch_normalization_221[0][0] __________________________________________________________________________________________________ activation_226 (Activation) (None, None, None, 1 0 batch_normalization_226[0][0] __________________________________________________________________________________________________ average_pooling2d_22 (AveragePo (None, None, None, 7 0 mixed3[0][0] __________________________________________________________________________________________________ conv2d_219 (Conv2D) (None, None, None, 1 147456 mixed3[0][0] __________________________________________________________________________________________________ conv2d_222 (Conv2D) (None, None, None, 1 172032 activation_221[0][0] __________________________________________________________________________________________________ conv2d_227 (Conv2D) (None, None, None, 1 172032 activation_226[0][0] __________________________________________________________________________________________________ conv2d_228 (Conv2D) (None, None, None, 1 147456 average_pooling2d_22[0][0] 
__________________________________________________________________________________________________ batch_normalization_219 (BatchN (None, None, None, 1 576 conv2d_219[0][0] __________________________________________________________________________________________________ batch_normalization_222 (BatchN (None, None, None, 1 576 conv2d_222[0][0] __________________________________________________________________________________________________ batch_normalization_227 (BatchN (None, None, None, 1 576 conv2d_227[0][0] __________________________________________________________________________________________________ batch_normalization_228 (BatchN (None, None, None, 1 576 conv2d_228[0][0] __________________________________________________________________________________________________ activation_219 (Activation) (None, None, None, 1 0 batch_normalization_219[0][0] __________________________________________________________________________________________________ activation_222 (Activation) (None, None, None, 1 0 batch_normalization_222[0][0] __________________________________________________________________________________________________ activation_227 (Activation) (None, None, None, 1 0 batch_normalization_227[0][0] __________________________________________________________________________________________________ activation_228 (Activation) (None, None, None, 1 0 batch_normalization_228[0][0] __________________________________________________________________________________________________ mixed4 (Concatenate) (None, None, None, 7 0 activation_219[0][0] activation_222[0][0] activation_227[0][0] activation_228[0][0] __________________________________________________________________________________________________ conv2d_233 (Conv2D) (None, None, None, 1 122880 mixed4[0][0] __________________________________________________________________________________________________ batch_normalization_233 (BatchN (None, None, None, 1 480 conv2d_233[0][0] 
__________________________________________________________________________________________________ activation_233 (Activation) (None, None, None, 1 0 batch_normalization_233[0][0] __________________________________________________________________________________________________ conv2d_234 (Conv2D) (None, None, None, 1 179200 activation_233[0][0] __________________________________________________________________________________________________ batch_normalization_234 (BatchN (None, None, None, 1 480 conv2d_234[0][0] __________________________________________________________________________________________________ activation_234 (Activation) (None, None, None, 1 0 batch_normalization_234[0][0] __________________________________________________________________________________________________ conv2d_230 (Conv2D) (None, None, None, 1 122880 mixed4[0][0] __________________________________________________________________________________________________ conv2d_235 (Conv2D) (None, None, None, 1 179200 activation_234[0][0] __________________________________________________________________________________________________ batch_normalization_230 (BatchN (None, None, None, 1 480 conv2d_230[0][0] __________________________________________________________________________________________________ batch_normalization_235 (BatchN (None, None, None, 1 480 conv2d_235[0][0] __________________________________________________________________________________________________ activation_230 (Activation) (None, None, None, 1 0 batch_normalization_230[0][0] __________________________________________________________________________________________________ activation_235 (Activation) (None, None, None, 1 0 batch_normalization_235[0][0] __________________________________________________________________________________________________ conv2d_231 (Conv2D) (None, None, None, 1 179200 activation_230[0][0] 
__________________________________________________________________________________________________ conv2d_236 (Conv2D) (None, None, None, 1 179200 activation_235[0][0] __________________________________________________________________________________________________ batch_normalization_231 (BatchN (None, None, None, 1 480 conv2d_231[0][0] __________________________________________________________________________________________________ batch_normalization_236 (BatchN (None, None, None, 1 480 conv2d_236[0][0] __________________________________________________________________________________________________ activation_231 (Activation) (None, None, None, 1 0 batch_normalization_231[0][0] __________________________________________________________________________________________________ activation_236 (Activation) (None, None, None, 1 0 batch_normalization_236[0][0] __________________________________________________________________________________________________ average_pooling2d_23 (AveragePo (None, None, None, 7 0 mixed4[0][0] __________________________________________________________________________________________________ conv2d_229 (Conv2D) (None, None, None, 1 147456 mixed4[0][0] __________________________________________________________________________________________________ conv2d_232 (Conv2D) (None, None, None, 1 215040 activation_231[0][0] __________________________________________________________________________________________________ conv2d_237 (Conv2D) (None, None, None, 1 215040 activation_236[0][0] __________________________________________________________________________________________________ conv2d_238 (Conv2D) (None, None, None, 1 147456 average_pooling2d_23[0][0] __________________________________________________________________________________________________ batch_normalization_229 (BatchN (None, None, None, 1 576 conv2d_229[0][0] __________________________________________________________________________________________________ 
batch_normalization_232 (BatchN (None, None, None, 1 576 conv2d_232[0][0] __________________________________________________________________________________________________ batch_normalization_237 (BatchN (None, None, None, 1 576 conv2d_237[0][0] __________________________________________________________________________________________________ batch_normalization_238 (BatchN (None, None, None, 1 576 conv2d_238[0][0] __________________________________________________________________________________________________ activation_229 (Activation) (None, None, None, 1 0 batch_normalization_229[0][0] __________________________________________________________________________________________________ activation_232 (Activation) (None, None, None, 1 0 batch_normalization_232[0][0] __________________________________________________________________________________________________ activation_237 (Activation) (None, None, None, 1 0 batch_normalization_237[0][0] __________________________________________________________________________________________________ activation_238 (Activation) (None, None, None, 1 0 batch_normalization_238[0][0] __________________________________________________________________________________________________ mixed5 (Concatenate) (None, None, None, 7 0 activation_229[0][0] activation_232[0][0] activation_237[0][0] activation_238[0][0] __________________________________________________________________________________________________ conv2d_243 (Conv2D) (None, None, None, 1 122880 mixed5[0][0] __________________________________________________________________________________________________ batch_normalization_243 (BatchN (None, None, None, 1 480 conv2d_243[0][0] __________________________________________________________________________________________________ activation_243 (Activation) (None, None, None, 1 0 batch_normalization_243[0][0] __________________________________________________________________________________________________ conv2d_244 (Conv2D) 
(None, None, None, 1 179200 activation_243[0][0] __________________________________________________________________________________________________ batch_normalization_244 (BatchN (None, None, None, 1 480 conv2d_244[0][0] __________________________________________________________________________________________________ activation_244 (Activation) (None, None, None, 1 0 batch_normalization_244[0][0] __________________________________________________________________________________________________ conv2d_240 (Conv2D) (None, None, None, 1 122880 mixed5[0][0] __________________________________________________________________________________________________ conv2d_245 (Conv2D) (None, None, None, 1 179200 activation_244[0][0] __________________________________________________________________________________________________ batch_normalization_240 (BatchN (None, None, None, 1 480 conv2d_240[0][0] __________________________________________________________________________________________________ batch_normalization_245 (BatchN (None, None, None, 1 480 conv2d_245[0][0] __________________________________________________________________________________________________ activation_240 (Activation) (None, None, None, 1 0 batch_normalization_240[0][0] __________________________________________________________________________________________________ activation_245 (Activation) (None, None, None, 1 0 batch_normalization_245[0][0] __________________________________________________________________________________________________ conv2d_241 (Conv2D) (None, None, None, 1 179200 activation_240[0][0] __________________________________________________________________________________________________ conv2d_246 (Conv2D) (None, None, None, 1 179200 activation_245[0][0] __________________________________________________________________________________________________ batch_normalization_241 (BatchN (None, None, None, 1 480 conv2d_241[0][0] 
__________________________________________________________________________________________________ batch_normalization_246 (BatchN (None, None, None, 1 480 conv2d_246[0][0] __________________________________________________________________________________________________ activation_241 (Activation) (None, None, None, 1 0 batch_normalization_241[0][0] __________________________________________________________________________________________________ activation_246 (Activation) (None, None, None, 1 0 batch_normalization_246[0][0] __________________________________________________________________________________________________ average_pooling2d_24 (AveragePo (None, None, None, 7 0 mixed5[0][0] __________________________________________________________________________________________________ conv2d_239 (Conv2D) (None, None, None, 1 147456 mixed5[0][0] __________________________________________________________________________________________________ conv2d_242 (Conv2D) (None, None, None, 1 215040 activation_241[0][0] __________________________________________________________________________________________________ conv2d_247 (Conv2D) (None, None, None, 1 215040 activation_246[0][0] __________________________________________________________________________________________________ conv2d_248 (Conv2D) (None, None, None, 1 147456 average_pooling2d_24[0][0] __________________________________________________________________________________________________ batch_normalization_239 (BatchN (None, None, None, 1 576 conv2d_239[0][0] __________________________________________________________________________________________________ batch_normalization_242 (BatchN (None, None, None, 1 576 conv2d_242[0][0] __________________________________________________________________________________________________ batch_normalization_247 (BatchN (None, None, None, 1 576 conv2d_247[0][0] __________________________________________________________________________________________________ 
(layer-by-layer summary truncated: the remaining blocks of the pre-trained network, mixed6 through mixed10, repeat the Conv2D -> BatchNormalization -> Activation -> Concatenate pattern shown above)
==================================================================================================
Total params: 21,802,784 Trainable params: 21,768,352 Non-trainable params: 34,432
__________________________________________________________________________________________________
We first create the generator. A generator is an iterator that yields batches of images on request, e.g. via flow( )
or, as below, flow_from_directory( )
.
datagen = ImageDataGenerator(rescale=1. / 255)
generator = datagen.flow_from_directory(
train_data_dir,
target_size=(img_width, img_height),
batch_size=batch_size,
class_mode=None,
shuffle=False)
nb_train_samples = len(generator.filenames)
num_classes = len(generator.class_indices)
predict_size_train = int(math.ceil(nb_train_samples / batch_size))
print('Number of training samples:',nb_train_samples)
print('Number of classes:',num_classes)
Found 1141 images belonging to 3 classes. Number of training samples: 1141 Number of classes: 3
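The step count above is just the number of batches needed to cover every sample at least once. A quick sanity check with the 1,141 training images reported above, assuming an illustrative batch_size of 16 (the actual value is set elsewhere in the notebook):

```python
import math

nb_train_samples = 1141   # reported by flow_from_directory above
batch_size = 16           # assumed for illustration; defined elsewhere in the notebook

# Number of batches needed to cover every sample at least once
predict_size_train = int(math.ceil(nb_train_samples / batch_size))
print(predict_size_train)  # 72: 71 full batches of 16 plus one partial batch of 5
```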
The extracted features, which are the last activation maps before the fully-connected layers in the pre-trained model, are called "bottleneck features". The function predict_generator( )
generates predictions for the input samples from a data generator.
bottleneck_features_train = model.predict_generator(generator, predict_size_train) # these are numpy arrays
bottleneck_features_train[0].shape
bottleneck_features_train.shape
(2, 2, 2048)
(1141, 2, 2, 2048)
In the next cell, we save the bottleneck features to disk so they can be reloaded when training the top model, instead of being recomputed:
np.save('bottleneck_features_train.npy', bottleneck_features_train)
Using predict( )
we see that ResNet50
is indeed able to identify some objects in the painting. The function decode_predictions
decodes the results into a list of tuples of the form (class, description, probability). We see below that the model identifies the house in the image as a castle or mosque, and correctly assigns a non-zero probability to finding a seashore in the painting. In this case, ResNet50
acts as a feature generator.
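Conceptually, decode_predictions just ranks the class probabilities and attaches human-readable labels. A minimal numpy sketch of that behaviour, with made-up labels (the real function maps indices to the 1,000 ImageNet classes):

```python
import numpy as np

def decode_top(preds, labels, top=3):
    """Return (class_id, description, probability) tuples for the top
    predictions, mimicking the output format of keras' decode_predictions.
    `labels` maps class index -> (class_id, description); made up here."""
    order = np.argsort(preds)[::-1][:top]  # indices of highest probabilities
    return [(labels[i][0], labels[i][1], float(preds[i])) for i in order]

# Toy probability vector and labels (illustrative only)
probs = np.array([0.05, 0.60, 0.25, 0.10])
labels = {0: ('n01', 'seashore'), 1: ('n02', 'castle'),
          2: ('n03', 'mosque'), 3: ('n04', 'palace')}
print(decode_top(probs, labels))
# [('n02', 'castle', 0.6), ('n03', 'mosque', 0.25), ('n04', 'palace', 0.1)]
```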
Repeating the steps for the validation data:
generator = datagen.flow_from_directory(
validation_data_dir,
target_size=(img_width, img_height),
batch_size=batch_size,
class_mode=None,
shuffle=False)
nb_validation_samples = len(generator.filenames)
predict_size_validation = int(math.ceil(nb_validation_samples / batch_size))
print('Number of testing samples:',nb_validation_samples)
Found 359 images belonging to 3 classes. Number of testing samples: 359
bottleneck_features_validation = model.predict_generator(
generator, predict_size_validation)
np.save('bottleneck_features_validation.npy', bottleneck_features_validation)
We now load the features just obtained, get the class labels for the training set and convert the latter into categorical vectors:
datagen_top = ImageDataGenerator(rescale=1./255)
generator_top = datagen_top.flow_from_directory(
train_data_dir,
target_size=(img_width, img_height),
batch_size=batch_size,
class_mode='categorical',
shuffle=False)
nb_train_samples = len(generator_top.filenames)
num_classes = len(generator_top.class_indices)
Found 1141 images belonging to 3 classes.
Loading the features:
train_data = np.load('bottleneck_features_train.npy')
Converting training data into vectors of categories:
train_labels = generator_top.classes
print('Classes before dummification:',train_labels)
train_labels = to_categorical(train_labels, num_classes=num_classes)
print('Classes after dummification:\n\n',train_labels)
Classes before dummification: [0 0 0 ... 2 2 2] Classes after dummification: [[1. 0. 0.] [1. 0. 0.] [1. 0. 0.] ... [0. 0. 1.] [0. 0. 1.] [0. 0. 1.]]
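Under the hood, to_categorical simply one-hot encodes the integer labels. An equivalent numpy sketch:

```python
import numpy as np

def one_hot(labels, num_classes):
    """One-hot encode integer class labels, as keras' to_categorical does:
    row i of the identity matrix is the encoding of class i."""
    return np.eye(num_classes)[labels]

print(one_hot(np.array([0, 1, 2, 2]), 3))
# [[1. 0. 0.]
#  [0. 1. 0.]
#  [0. 0. 1.]
#  [0. 0. 1.]]
```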
Again repeating the process with the validation data:
generator_top = datagen_top.flow_from_directory(
validation_data_dir,
target_size=(img_width, img_height),
batch_size=batch_size,
class_mode=None,
shuffle=False)
nb_validation_samples = len(generator_top.filenames)
Found 359 images belonging to 3 classes.
validation_data = np.load('bottleneck_features_validation.npy')
validation_labels = generator_top.classes
validation_labels = to_categorical(validation_labels, num_classes=num_classes)
model = Sequential()
model.add(Flatten(input_shape=train_data.shape[1:]))
# model.add(Dense(1024, activation='relu'))
# model.add(Dropout(0.5))
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(256, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(32, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(16, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(8, activation='relu')) # Not valid for minimum = 500
model.add(Dropout(0.5))
# model.add(Dense(4, activation='relu')) # Not valid for minimum = 500
# model.add(Dropout(0.5))
model.add(Dense(num_classes, activation='sigmoid'))
model.compile(optimizer='Adam',
loss='binary_crossentropy', metrics=['accuracy'])
history = model.fit(train_data, train_labels,
epochs=epochs,
batch_size=batch_size,
validation_data=(validation_data, validation_labels))
model.save_weights(top_model_weights_path)
(eval_loss, eval_accuracy) = model.evaluate(
validation_data, validation_labels,
batch_size=batch_size, verbose=1)
print("[INFO] accuracy: {:.2f}%".format(eval_accuracy * 100))
print("[INFO] Loss: {}".format(eval_loss))
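Note that the top layer above pairs a sigmoid activation with binary_crossentropy. For a single-label multi-class problem like this one, the more conventional pairing is softmax with categorical_crossentropy, since softmax normalizes the scores into a single probability distribution over the classes, while sigmoid treats each class independently. A small numpy sketch of the difference:

```python
import numpy as np

def sigmoid(z):
    # Element-wise: each class gets an independent probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Normalized across classes; shift by max(z) for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])  # example scores for 3 classes

# Sigmoid outputs need not sum to 1 (here the sum is well above 1)
print(sigmoid(logits).sum())
# Softmax outputs form a probability distribution (sum to 1, up to rounding)
print(softmax(logits).sum())
```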
Train on 1141 samples, validate on 359 samples
Epoch 1/100 1141/1141 [==============================] - 11s 9ms/step - loss: 2.0881 - acc: 0.5451 - val_loss: 0.6892 - val_acc: 0.5162
(per-epoch log abridged: training loss falls steadily while validation accuracy climbs from ~0.52 to ~0.78 over the 100 epochs, fluctuating in the later epochs)
Epoch 100/100 1141/1141 [==============================] - 7s 6ms/step - loss: 0.4086 - acc: 0.7564 - val_loss: 0.6618 - val_acc: 0.7632
359/359 [==============================] - 0s 623us/step [INFO] accuracy: 76.32% [INFO] Loss: 0.661828470014264
train_data.shape[1:]
(2, 2, 2048)
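As a sanity check on the shape reported above, the Flatten layer used at the start of the top model turns each (2, 2, 2048) bottleneck feature map into a single 8192-dimensional vector. A minimal NumPy sketch, using a hypothetical batch of zeros in place of the real bottleneck features:

```python
import numpy as np

# Hypothetical batch of 4 bottleneck feature maps with the per-sample
# shape reported above: (2, 2, 2048).
batch = np.zeros((4, 2, 2, 2048), dtype=np.float32)

# Flatten(input_shape=(2, 2, 2048)) reshapes each sample into one vector:
flat = batch.reshape(batch.shape[0], -1)
print(flat.shape)  # (4, 8192), since 2 * 2 * 2048 = 8192
```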
# model.evaluate(
# validation_data, validation_labels, batch_size=batch_size, verbose=1)
# model.predict_classes(validation_data)
# model.metrics_names
#top_k_categorical_accuracy(y_true, y_pred, k=5)
plt.figure(1)
# summarize history for accuracy
plt.subplot(211)
plt.plot(history.history['acc'])
plt.plot(history.history['val_acc'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
#pylab.ylim([0.4,0.68])
plt.legend(['train', 'validation'], loc='upper left')
import pylab
plt.subplot(212)
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'validation'], loc='upper left')
pylab.xlim([0,60])
# pylab.ylim([0,1000])
plt.show()
import matplotlib.pyplot as plt
import pylab
%matplotlib inline
%config InlineBackend.figure_format = 'retina'
fig = plt.figure()
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Classification Model Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
pylab.xlim([0,60])
plt.legend(['Train', 'Validation'], loc='upper right')
fig.savefig('loss.png')
plt.show();
import matplotlib.pyplot as plt
%matplotlib inline
%config InlineBackend.figure_format = 'retina'
fig = plt.figure()
plt.plot(history.history['acc'])
plt.plot(history.history['val_acc'])
plt.title('Classification Model Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
pylab.xlim([0,100])
plt.legend(['Train', 'Validation'], loc='lower right')
fig.savefig('acc.png')
plt.show();
!pwd
/Users/marcotavora/capstone_phase_2
os.listdir(os.path.abspath('train_toy_3/Pierre-Auguste_Renoir'))
image_path = os.path.abspath('test_toy_3/Pierre-Auguste_Renoir/91485.jpg')
orig = cv2.imread(image_path)
image = load_img(image_path, target_size=(120,120))
image
image = img_to_array(image)
image
array([[[220., 215., 211.],
        [207., 202., 198.],
        ...,
        [229., 212., 228.]],
       ...,
       [[240., 214., 117.],
        ...,
        [240., 199., 107.]]], dtype=float32)
image = image / 255.
image = np.expand_dims(image, axis=0)
image
array([[[[0.8627451 , 0.84313726, 0.827451  ],
         [0.8117647 , 0.7921569 , 0.7764706 ],
         ...,
         [0.8980392 , 0.83137256, 0.89411765]],
        ...,
        [[0.9411765 , 0.8392157 , 0.45882353],
         ...,
         [0.9411765 , 0.78039217, 0.41960785]]]], dtype=float32)
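The preprocessing above (resize to 120×120, scale pixel values to [0, 1], add a batch axis) can be sketched with NumPy alone; here a synthetic random image stands in for the actual painting file:

```python
import numpy as np

# Synthetic 120x120 RGB image standing in for the loaded painting
# (8-bit values 0-255, as img_to_array returns for an 8-bit image).
image = np.random.randint(0, 256, size=(120, 120, 3)).astype(np.float32)

image = image / 255.                   # scale pixel values to [0, 1]
image = np.expand_dims(image, axis=0)  # add the batch dimension

print(image.shape)  # (1, 120, 120, 3), ready for model.predict
```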
# build the pre-trained network (InceptionV3; the VGG16 alternative is commented out)
# model = applications.VGG16(include_top=False, weights='imagenet')
model = applications.InceptionV3(include_top=False, weights='imagenet')
# get the bottleneck features from the pre-trained network
bottleneck_prediction = model.predict(image)
# build top model
model = Sequential()
model.add(Flatten(input_shape=train_data.shape[1:]))
# model.add(Dense(1024, activation='relu'))
# model.add(Dropout(0.5))
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(256, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(32, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(16, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(8, activation='relu')) # Not valid for minimum = 500
model.add(Dropout(0.5))
# model.add(Dense(4, activation='relu')) # Not valid for minimum = 500
# model.add(Dropout(0.5))
model.add(Dense(num_classes, activation='sigmoid'))
model.load_weights(top_model_weights_path)
# use the bottleneck prediction on the top model to get the final classification
class_predicted = model.predict_classes(bottleneck_prediction)
inID = class_predicted[0]
class_dictionary = generator_top.class_indices
inv_map = {v: k for k, v in class_dictionary.items()}
label = inv_map[inID]
# get the prediction label
print("Image ID: {}, Label: {}".format(inID, label))
# display the predictions with the image
cv2.putText(orig, "Predicted: {}".format(label), (10, 30), cv2.FONT_HERSHEY_PLAIN, 1.5, (43, 99, 255), 2)
cv2.imshow("Classification", orig)
cv2.waitKey(0)
cv2.destroyAllWindows()