Imports

In [1]:
%matplotlib inline

from scipy.misc  import imresize

from keras.preprocessing import image
from keras.models        import Model
from keras.layers        import Input
from keras.layers        import Add
from keras.layers        import Convolution2D
from keras.layers        import Deconvolution2D
from keras.layers        import BatchNormalization
from keras.layers        import Activation
from keras.layers        import Convolution2DTranspose
from keras.layers        import UpSampling2D
from keras.optimizers    import Adam
from keras.callbacks     import ModelCheckpoint
from keras.callbacks     import LearningRateScheduler

import keras.backend     as K
import numpy             as np
import matplotlib.pyplot as plt
Using TensorFlow backend.
In [2]:
def limit_mem():
    cfg                          = K.tf.ConfigProto()
    cfg.gpu_options.allow_growth = True
    K.set_session(K.tf.Session(config = cfg))
limit_mem()

Definitions and helper functions

Path and constant definitions

In [3]:
image_folder   = '../../data/fractal_zooms/'
batch_size     = 4
preprocess     = lambda x: (x - 127.5) / 127.5
small_shape    = (128, 192)
big_shape      = (256, 384)
epochs         = 500
start_lr       = 1e-3
end_lr         = 1e-6
learning_rates = np.linspace(start_lr, end_lr, epochs)
cp_period      = 100

A small helper function to compare two sets of images side by side.

In [4]:
def visualize(arrays, nb_row):
    num_arrays = len(arrays)
    plt.figure(figsize = (17, 17))
    for i in range(nb_row):
        for j in range(num_arrays):
            plt.subplot(nb_row, num_arrays, num_arrays * i + j + 1)
            plt.imshow(arrays[j][i].squeeze(), cmap = 'gray')

We use an ImageDataGenerator to load batch of images from the disk. We also create a custom wrapper to generate image couples of the form (small_resolution_image, high_resolution_image).

In [5]:
generator       = image.ImageDataGenerator(preprocessing_function = preprocess)
flow_parameters = {
    'class_mode' : None,
    'target_size': big_shape,
    'color_mode' : 'grayscale',
    'batch_size' : batch_size
}
train_flow = generator.flow_from_directory(directory = image_folder + 'train/', **flow_parameters)
valid_flow = generator.flow_from_directory(directory = image_folder + 'test/', **flow_parameters)
Found 45000 images belonging to 1 classes.
Found 5000 images belonging to 1 classes.
In [6]:
def data_generator(flow, interp = 'nearest'):
    while True:
        output_images = next(flow)
        resize_image  = [imresize(output_images[i].squeeze(), small_shape, interp = interp) for i in range(output_images.shape[0])]
        input_images  = np.expand_dims(np.stack(resize_image), -1)
        
        yield (input_images, output_images)

Image loading and visualization test.

In [7]:
train_data_gen            = data_generator(train_flow)
valid_data_gen            = data_generator(valid_flow)
input_array, output_array = next(train_data_gen)
visualize([input_array, output_array], min(5, batch_size))

Model definition

Callbacks creation

Model checkpoint callback

Creation of the model checkpoint callback.

In [8]:
checkpoint_callback = ModelCheckpoint('../models/super_resolution.epoch_{epoch:02d}.h5', period = cp_period)

Learning rate annealing callback

This callback will make the learning rate decrease linearly along the epochs. It could also be interesting to try a cyclical learning rate as it seems to generaly improves convergence speed.

In [9]:
def schedule(epoch):
    return learning_rates[epoch]

learning_rate_callback = LearningRateScheduler(schedule)

Architecture

The architecture of the network comes from the article Perceptual Losses for Real-Time Style Transfer and Super-Resolution. In this first example, a simple mean squared error on the pixel values is used in place of the perceptual loss. Another network in this repository will contain further exploration of the perceptual loss applied to this problem.

In [10]:
def convolution_block(inp, filter_size = 3, nb_filters = 64, activation = True):
    x = Convolution2D(nb_filters, (filter_size, filter_size), strides = (1, 1), padding = 'same')(inp)
    x = BatchNormalization()(x)
    x = Activation('relu')(x) if activation else x
    
    return x
In [11]:
def residual_block(inp, nb_filders = 64):
    x = convolution_block(inp, nb_filders)
    x = convolution_block(x  , nb_filders, activation = False)
    x = Add()([inp, x])
    
    return x

Now for the part the of the network that increases the size of the picture, there is multiple choices. The one presented in the original is the deconvolution (or more correctly fractionnaly strided convolution). This kind of layer seem to produce checkerboard patterns and a solution to this problem is to use upsampling and regular convolutions. The two choices are implemented here.

In [12]:
def deconvolution_block(inp, nb_filters = 64):
    x = Convolution2DTranspose(nb_filters, (3, 3), strides = (2, 2), padding = 'same')(inp)
    x = BatchNormalization()(x)
    
    return x
In [13]:
def upsampling_block(inp, nb_filter = 64):
    x = UpSampling2D()(inp)
    x = Convolution2D(nb_filter, (3, 3), strides = (1, 1), padding = 'same')(x)
    x = BatchNormalization()(x)
    
    return x
In [14]:
def create_model_deconvolution(input_shape):
    inp = Input(shape = input_shape + (1,))
    x   = convolution_block(inp, filter_size = 9, nb_filters = 64)
    for _ in range(4):
        x = residual_block(x)
    x     = deconvolution_block(x)
    x     = Convolution2D(1, (9, 9), strides = (1, 1), padding = 'same')(x)
    x     = Activation('tanh')(x)
    model = Model(inp, x)

    return model
In [15]:
def create_model_upsampling(input_shape):
    inp = Input(shape = input_shape + (1,))
    x   = convolution_block(inp, filter_size = 9, nb_filters = 64)
    for _ in range(4):
        x = residual_block(x)
    x     = upsampling_block(x)
    x     = Convolution2D(1, (9, 9), strides = (1, 1), padding = 'same')(x)
    x     = Activation('tanh')(x)
    model = Model(inp, x)

    return model
In [16]:
model = create_model_deconvolution(small_shape)
# model = create_model_upsampling(small_shape)
model.compile(optimizer = Adam(1e-3), loss = 'mse')
model.summary()
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, 128, 192, 1)   0                                            
____________________________________________________________________________________________________
conv2d_1 (Conv2D)                (None, 128, 192, 64)  5248        input_1[0][0]                    
____________________________________________________________________________________________________
batch_normalization_1 (BatchNorm (None, 128, 192, 64)  256         conv2d_1[0][0]                   
____________________________________________________________________________________________________
activation_1 (Activation)        (None, 128, 192, 64)  0           batch_normalization_1[0][0]      
____________________________________________________________________________________________________
conv2d_2 (Conv2D)                (None, 128, 192, 64)  16777280    activation_1[0][0]               
____________________________________________________________________________________________________
batch_normalization_2 (BatchNorm (None, 128, 192, 64)  256         conv2d_2[0][0]                   
____________________________________________________________________________________________________
activation_2 (Activation)        (None, 128, 192, 64)  0           batch_normalization_2[0][0]      
____________________________________________________________________________________________________
conv2d_3 (Conv2D)                (None, 128, 192, 64)  16777280    activation_2[0][0]               
____________________________________________________________________________________________________
batch_normalization_3 (BatchNorm (None, 128, 192, 64)  256         conv2d_3[0][0]                   
____________________________________________________________________________________________________
add_1 (Add)                      (None, 128, 192, 64)  0           activation_1[0][0]               
                                                                   batch_normalization_3[0][0]      
____________________________________________________________________________________________________
conv2d_4 (Conv2D)                (None, 128, 192, 64)  16777280    add_1[0][0]                      
____________________________________________________________________________________________________
batch_normalization_4 (BatchNorm (None, 128, 192, 64)  256         conv2d_4[0][0]                   
____________________________________________________________________________________________________
activation_3 (Activation)        (None, 128, 192, 64)  0           batch_normalization_4[0][0]      
____________________________________________________________________________________________________
conv2d_5 (Conv2D)                (None, 128, 192, 64)  16777280    activation_3[0][0]               
____________________________________________________________________________________________________
batch_normalization_5 (BatchNorm (None, 128, 192, 64)  256         conv2d_5[0][0]                   
____________________________________________________________________________________________________
add_2 (Add)                      (None, 128, 192, 64)  0           add_1[0][0]                      
                                                                   batch_normalization_5[0][0]      
____________________________________________________________________________________________________
conv2d_6 (Conv2D)                (None, 128, 192, 64)  16777280    add_2[0][0]                      
____________________________________________________________________________________________________
batch_normalization_6 (BatchNorm (None, 128, 192, 64)  256         conv2d_6[0][0]                   
____________________________________________________________________________________________________
activation_4 (Activation)        (None, 128, 192, 64)  0           batch_normalization_6[0][0]      
____________________________________________________________________________________________________
conv2d_7 (Conv2D)                (None, 128, 192, 64)  16777280    activation_4[0][0]               
____________________________________________________________________________________________________
batch_normalization_7 (BatchNorm (None, 128, 192, 64)  256         conv2d_7[0][0]                   
____________________________________________________________________________________________________
add_3 (Add)                      (None, 128, 192, 64)  0           add_2[0][0]                      
                                                                   batch_normalization_7[0][0]      
____________________________________________________________________________________________________
conv2d_8 (Conv2D)                (None, 128, 192, 64)  16777280    add_3[0][0]                      
____________________________________________________________________________________________________
batch_normalization_8 (BatchNorm (None, 128, 192, 64)  256         conv2d_8[0][0]                   
____________________________________________________________________________________________________
activation_5 (Activation)        (None, 128, 192, 64)  0           batch_normalization_8[0][0]      
____________________________________________________________________________________________________
conv2d_9 (Conv2D)                (None, 128, 192, 64)  16777280    activation_5[0][0]               
____________________________________________________________________________________________________
batch_normalization_9 (BatchNorm (None, 128, 192, 64)  256         conv2d_9[0][0]                   
____________________________________________________________________________________________________
add_4 (Add)                      (None, 128, 192, 64)  0           add_3[0][0]                      
                                                                   batch_normalization_9[0][0]      
____________________________________________________________________________________________________
conv2d_transpose_1 (Conv2DTransp (None, 256, 384, 64)  36928       add_4[0][0]                      
____________________________________________________________________________________________________
batch_normalization_10 (BatchNor (None, 256, 384, 64)  256         conv2d_transpose_1[0][0]         
____________________________________________________________________________________________________
conv2d_10 (Conv2D)               (None, 256, 384, 1)   5185        batch_normalization_10[0][0]     
____________________________________________________________________________________________________
activation_6 (Activation)        (None, 256, 384, 1)   0           conv2d_10[0][0]                  
====================================================================================================
Total params: 134,268,161
Trainable params: 134,266,881
Non-trainable params: 1,280
____________________________________________________________________________________________________
In [17]:
fit_parameters = {
    'epochs'           : epochs,
    'generator'        : train_data_gen,
    'steps_per_epoch'  : 20,
    'validation_data'  : valid_data_gen,
    'validation_steps' : 10,
    'callbacks'        : [checkpoint_callback, learning_rate_callback],
    'verbose'          : 2
}
In [18]:
model.fit_generator(**fit_parameters)
/home/rodgzilla/Documents/machine_learning/keras/keras/backend/tensorflow_backend.py:2252: UserWarning: Expected no kwargs, you passed 1
kwargs passed to function are ignored with Tensorflow backend
  warnings.warn('\n'.join(msg))
Epoch 1/500
56s - loss: 0.5167 - val_loss: 2.8227
Epoch 2/500
45s - loss: 0.2921 - val_loss: 1.7210
Epoch 3/500
45s - loss: 0.2573 - val_loss: 1.4632
Epoch 4/500
44s - loss: 0.2301 - val_loss: 0.9971
Epoch 5/500
44s - loss: 0.1878 - val_loss: 1.0928
Epoch 6/500
44s - loss: 0.1377 - val_loss: 0.2007
Epoch 7/500
44s - loss: 0.0675 - val_loss: 0.1482
Epoch 8/500
44s - loss: 0.0539 - val_loss: 0.2260
Epoch 9/500
44s - loss: 0.0699 - val_loss: 0.1103
Epoch 10/500
44s - loss: 0.0517 - val_loss: 0.0999
Epoch 11/500
44s - loss: 0.0431 - val_loss: 0.0613
Epoch 12/500
44s - loss: 0.0475 - val_loss: 0.0753
Epoch 13/500
44s - loss: 0.0386 - val_loss: 0.0591
Epoch 14/500
44s - loss: 0.0426 - val_loss: 0.0370
Epoch 15/500
44s - loss: 0.0249 - val_loss: 0.0244
Epoch 16/500
44s - loss: 0.0248 - val_loss: 0.0284
Epoch 17/500
44s - loss: 0.0446 - val_loss: 0.0415
Epoch 18/500
44s - loss: 0.0485 - val_loss: 0.0576
Epoch 19/500
44s - loss: 0.0373 - val_loss: 0.0195
Epoch 20/500
45s - loss: 0.0251 - val_loss: 0.0157
Epoch 21/500
44s - loss: 0.0367 - val_loss: 0.0325
Epoch 22/500
44s - loss: 0.0272 - val_loss: 0.0177
Epoch 23/500
44s - loss: 0.0258 - val_loss: 0.0180
Epoch 24/500
44s - loss: 0.0245 - val_loss: 0.0190
Epoch 25/500
44s - loss: 0.0374 - val_loss: 0.0259
Epoch 26/500
44s - loss: 0.0370 - val_loss: 0.0255
Epoch 27/500
44s - loss: 0.0329 - val_loss: 0.0319
Epoch 28/500
44s - loss: 0.0308 - val_loss: 0.0194
Epoch 29/500
44s - loss: 0.0325 - val_loss: 0.0402
Epoch 30/500
45s - loss: 0.0342 - val_loss: 0.0454
Epoch 31/500
44s - loss: 0.0332 - val_loss: 0.0217
Epoch 32/500
44s - loss: 0.0249 - val_loss: 0.0206
Epoch 33/500
44s - loss: 0.0266 - val_loss: 0.0160
Epoch 34/500
44s - loss: 0.0225 - val_loss: 0.0179
Epoch 35/500
44s - loss: 0.0329 - val_loss: 0.0233
Epoch 36/500
44s - loss: 0.0214 - val_loss: 0.0186
Epoch 37/500
44s - loss: 0.0223 - val_loss: 0.0157
Epoch 38/500
45s - loss: 0.0201 - val_loss: 0.0236
Epoch 39/500
44s - loss: 0.0275 - val_loss: 0.0134
Epoch 40/500
44s - loss: 0.0259 - val_loss: 0.0121
Epoch 41/500
44s - loss: 0.0233 - val_loss: 0.0186
Epoch 42/500
44s - loss: 0.0231 - val_loss: 0.0205
Epoch 43/500
44s - loss: 0.0387 - val_loss: 0.0784
Epoch 44/500
44s - loss: 0.0422 - val_loss: 0.0380
Epoch 45/500
44s - loss: 0.0373 - val_loss: 0.0437
Epoch 46/500
44s - loss: 0.0257 - val_loss: 0.0210
Epoch 47/500
44s - loss: 0.0333 - val_loss: 0.0191
Epoch 48/500
44s - loss: 0.0184 - val_loss: 0.0295
Epoch 49/500
44s - loss: 0.0280 - val_loss: 0.0257
Epoch 50/500
44s - loss: 0.0254 - val_loss: 0.0360
Epoch 51/500
45s - loss: 0.0285 - val_loss: 0.0432
Epoch 52/500
45s - loss: 0.0364 - val_loss: 0.0269
Epoch 53/500
44s - loss: 0.0205 - val_loss: 0.0200
Epoch 54/500
44s - loss: 0.0231 - val_loss: 0.0299
Epoch 55/500
44s - loss: 0.0231 - val_loss: 0.0249
Epoch 56/500
45s - loss: 0.0366 - val_loss: 0.0258
Epoch 57/500
44s - loss: 0.0206 - val_loss: 0.0326
Epoch 58/500
45s - loss: 0.0253 - val_loss: 0.0140
Epoch 59/500
44s - loss: 0.0200 - val_loss: 0.0153
Epoch 60/500
44s - loss: 0.0191 - val_loss: 0.0191
Epoch 61/500
45s - loss: 0.0260 - val_loss: 0.0163
Epoch 62/500
44s - loss: 0.0213 - val_loss: 0.0178
Epoch 63/500
44s - loss: 0.0249 - val_loss: 0.0171
Epoch 64/500
44s - loss: 0.0243 - val_loss: 0.0171
Epoch 65/500
45s - loss: 0.0207 - val_loss: 0.0163
Epoch 66/500
44s - loss: 0.0227 - val_loss: 0.0175
Epoch 67/500
45s - loss: 0.0192 - val_loss: 0.0267
Epoch 68/500
44s - loss: 0.0224 - val_loss: 0.0160
Epoch 69/500
44s - loss: 0.0299 - val_loss: 0.0195
Epoch 70/500
44s - loss: 0.0232 - val_loss: 0.0192
Epoch 71/500
44s - loss: 0.0281 - val_loss: 0.0215
Epoch 72/500
44s - loss: 0.0374 - val_loss: 0.0218
Epoch 73/500
44s - loss: 0.0246 - val_loss: 0.0251
Epoch 74/500
45s - loss: 0.0229 - val_loss: 0.0170
Epoch 75/500
44s - loss: 0.0370 - val_loss: 0.0228
Epoch 76/500
45s - loss: 0.0484 - val_loss: 0.0416
Epoch 77/500
44s - loss: 0.0299 - val_loss: 0.0215
Epoch 78/500
44s - loss: 0.0275 - val_loss: 0.0290
Epoch 79/500
44s - loss: 0.0262 - val_loss: 0.0703
Epoch 80/500
45s - loss: 0.0228 - val_loss: 0.0214
Epoch 81/500
45s - loss: 0.0264 - val_loss: 0.0238
Epoch 82/500
46s - loss: 0.0293 - val_loss: 0.0391
Epoch 83/500
45s - loss: 0.0224 - val_loss: 0.0206
Epoch 84/500
44s - loss: 0.0266 - val_loss: 0.0230
Epoch 85/500
44s - loss: 0.0232 - val_loss: 0.0278
Epoch 86/500
45s - loss: 0.0211 - val_loss: 0.0133
Epoch 87/500
45s - loss: 0.0240 - val_loss: 0.0108
Epoch 88/500
44s - loss: 0.0170 - val_loss: 0.0185
Epoch 89/500
44s - loss: 0.0142 - val_loss: 0.0121
Epoch 90/500
44s - loss: 0.0190 - val_loss: 0.0137
Epoch 91/500
44s - loss: 0.0178 - val_loss: 0.0482
Epoch 92/500
44s - loss: 0.0295 - val_loss: 0.0218
Epoch 93/500
44s - loss: 0.0236 - val_loss: 0.0188
Epoch 94/500
44s - loss: 0.0319 - val_loss: 0.0162
Epoch 95/500
44s - loss: 0.0267 - val_loss: 0.0225
Epoch 96/500
44s - loss: 0.0228 - val_loss: 0.0169
Epoch 97/500
44s - loss: 0.0309 - val_loss: 0.0236
Epoch 98/500
44s - loss: 0.0239 - val_loss: 0.0140
Epoch 99/500
44s - loss: 0.0204 - val_loss: 0.0139
Epoch 100/500
46s - loss: 0.0241 - val_loss: 0.0106
Epoch 101/500
44s - loss: 0.0248 - val_loss: 0.0148
Epoch 102/500
44s - loss: 0.0216 - val_loss: 0.0128
Epoch 103/500
45s - loss: 0.0230 - val_loss: 0.0189
Epoch 104/500
42s - loss: 0.0208 - val_loss: 0.0179
Epoch 105/500
44s - loss: 0.0272 - val_loss: 0.0221
Epoch 106/500
44s - loss: 0.0254 - val_loss: 0.0230
Epoch 107/500
45s - loss: 0.0195 - val_loss: 0.0128
Epoch 108/500
44s - loss: 0.0210 - val_loss: 0.0170
Epoch 109/500
45s - loss: 0.0144 - val_loss: 0.0229
Epoch 110/500
45s - loss: 0.0201 - val_loss: 0.0143
Epoch 111/500
44s - loss: 0.0215 - val_loss: 0.0150
Epoch 112/500
45s - loss: 0.0188 - val_loss: 0.0125
Epoch 113/500
44s - loss: 0.0175 - val_loss: 0.0177
Epoch 114/500
45s - loss: 0.0159 - val_loss: 0.0126
Epoch 115/500
44s - loss: 0.0167 - val_loss: 0.0114
Epoch 116/500
44s - loss: 0.0192 - val_loss: 0.0136
Epoch 117/500
44s - loss: 0.0174 - val_loss: 0.0142
Epoch 118/500
44s - loss: 0.0235 - val_loss: 0.0190
Epoch 119/500
44s - loss: 0.0263 - val_loss: 0.0274
Epoch 120/500
45s - loss: 0.0247 - val_loss: 0.0223
Epoch 121/500
44s - loss: 0.0300 - val_loss: 0.0313
Epoch 122/500
44s - loss: 0.0266 - val_loss: 0.0311
Epoch 123/500
44s - loss: 0.0222 - val_loss: 0.0191
Epoch 124/500
45s - loss: 0.0144 - val_loss: 0.0184
Epoch 125/500
44s - loss: 0.0156 - val_loss: 0.0204
Epoch 126/500
44s - loss: 0.0181 - val_loss: 0.0116
Epoch 127/500
44s - loss: 0.0191 - val_loss: 0.0202
Epoch 128/500
44s - loss: 0.0186 - val_loss: 0.0153
Epoch 129/500
44s - loss: 0.0187 - val_loss: 0.0167
Epoch 130/500
44s - loss: 0.0136 - val_loss: 0.0149
Epoch 131/500
44s - loss: 0.0181 - val_loss: 0.0124
Epoch 132/500
44s - loss: 0.0166 - val_loss: 0.0164
Epoch 133/500
44s - loss: 0.0184 - val_loss: 0.0161
Epoch 134/500
44s - loss: 0.0226 - val_loss: 0.0341
Epoch 135/500
44s - loss: 0.0180 - val_loss: 0.0179
Epoch 136/500
44s - loss: 0.0262 - val_loss: 0.0141
Epoch 137/500
44s - loss: 0.0198 - val_loss: 0.0131
Epoch 138/500
45s - loss: 0.0183 - val_loss: 0.0194
Epoch 139/500
44s - loss: 0.0156 - val_loss: 0.0171
Epoch 140/500
45s - loss: 0.0162 - val_loss: 0.0177
Epoch 141/500
44s - loss: 0.0196 - val_loss: 0.0237
Epoch 142/500
44s - loss: 0.0229 - val_loss: 0.0171
Epoch 143/500
44s - loss: 0.0201 - val_loss: 0.0306
Epoch 144/500
44s - loss: 0.0280 - val_loss: 0.0220
Epoch 145/500
44s - loss: 0.0207 - val_loss: 0.0275
Epoch 146/500
45s - loss: 0.0178 - val_loss: 0.0249
Epoch 147/500
44s - loss: 0.0182 - val_loss: 0.0160
Epoch 148/500
44s - loss: 0.0192 - val_loss: 0.0175
Epoch 149/500
44s - loss: 0.0203 - val_loss: 0.0216
Epoch 150/500
44s - loss: 0.0262 - val_loss: 0.0146
Epoch 151/500
45s - loss: 0.0254 - val_loss: 0.0177
Epoch 152/500
45s - loss: 0.0194 - val_loss: 0.0126
Epoch 153/500
44s - loss: 0.0253 - val_loss: 0.0167
Epoch 154/500
45s - loss: 0.0250 - val_loss: 0.0176
Epoch 155/500
45s - loss: 0.0147 - val_loss: 0.0138
Epoch 156/500
45s - loss: 0.0184 - val_loss: 0.0193
Epoch 157/500
46s - loss: 0.0182 - val_loss: 0.0195
Epoch 158/500
46s - loss: 0.0196 - val_loss: 0.0188
Epoch 159/500
46s - loss: 0.0205 - val_loss: 0.0112
Epoch 160/500
46s - loss: 0.0172 - val_loss: 0.0172
Epoch 161/500
46s - loss: 0.0182 - val_loss: 0.0156
Epoch 162/500
46s - loss: 0.0171 - val_loss: 0.0155
Epoch 163/500
45s - loss: 0.0181 - val_loss: 0.0188
Epoch 164/500
45s - loss: 0.0159 - val_loss: 0.0275
Epoch 165/500
45s - loss: 0.0173 - val_loss: 0.0125
Epoch 166/500
46s - loss: 0.0161 - val_loss: 0.0129
Epoch 167/500
46s - loss: 0.0202 - val_loss: 0.0184
Epoch 168/500
46s - loss: 0.0203 - val_loss: 0.0107
Epoch 169/500
46s - loss: 0.0202 - val_loss: 0.0189
Epoch 170/500
45s - loss: 0.0187 - val_loss: 0.0175
Epoch 171/500
45s - loss: 0.0131 - val_loss: 0.0091
Epoch 172/500
46s - loss: 0.0203 - val_loss: 0.0103
Epoch 173/500
45s - loss: 0.0236 - val_loss: 0.0148
Epoch 174/500
45s - loss: 0.0179 - val_loss: 0.0279
Epoch 175/500
45s - loss: 0.0156 - val_loss: 0.0156
Epoch 176/500
46s - loss: 0.0140 - val_loss: 0.0114
Epoch 177/500
46s - loss: 0.0185 - val_loss: 0.0274
Epoch 178/500
46s - loss: 0.0187 - val_loss: 0.0188
Epoch 179/500
46s - loss: 0.0124 - val_loss: 0.0120
Epoch 180/500
46s - loss: 0.0221 - val_loss: 0.0235
Epoch 181/500
46s - loss: 0.0187 - val_loss: 0.0186
Epoch 182/500
46s - loss: 0.0158 - val_loss: 0.0130
Epoch 183/500
45s - loss: 0.0167 - val_loss: 0.0140
Epoch 184/500
46s - loss: 0.0174 - val_loss: 0.0116
Epoch 185/500
46s - loss: 0.0183 - val_loss: 0.0133
Epoch 186/500
46s - loss: 0.0169 - val_loss: 0.0170
Epoch 187/500
45s - loss: 0.0167 - val_loss: 0.0148
Epoch 188/500
46s - loss: 0.0202 - val_loss: 0.0286
Epoch 189/500
46s - loss: 0.0206 - val_loss: 0.0215
Epoch 190/500
45s - loss: 0.0236 - val_loss: 0.0146
Epoch 191/500
46s - loss: 0.0153 - val_loss: 0.0149
Epoch 192/500
46s - loss: 0.0166 - val_loss: 0.0128
Epoch 193/500
46s - loss: 0.0275 - val_loss: 0.0176
Epoch 194/500
46s - loss: 0.0226 - val_loss: 0.0675
Epoch 195/500
46s - loss: 0.0294 - val_loss: 0.0281
Epoch 196/500
46s - loss: 0.0186 - val_loss: 0.0199
Epoch 197/500
47s - loss: 0.0180 - val_loss: 0.0106
Epoch 198/500
46s - loss: 0.0196 - val_loss: 0.0195
Epoch 199/500
44s - loss: 0.0177 - val_loss: 0.0180
Epoch 200/500
45s - loss: 0.0202 - val_loss: 0.0247
Epoch 201/500
44s - loss: 0.0191 - val_loss: 0.0268
Epoch 202/500
45s - loss: 0.0248 - val_loss: 0.0142
Epoch 203/500
45s - loss: 0.0293 - val_loss: 0.0593
Epoch 204/500
44s - loss: 0.0200 - val_loss: 0.0229
Epoch 205/500
45s - loss: 0.0209 - val_loss: 0.0247
Epoch 206/500
44s - loss: 0.0221 - val_loss: 0.0215
Epoch 207/500
44s - loss: 0.0225 - val_loss: 0.0168
Epoch 208/500
44s - loss: 0.0280 - val_loss: 0.0294
Epoch 209/500
44s - loss: 0.0279 - val_loss: 0.0771
Epoch 210/500
44s - loss: 0.0455 - val_loss: 0.0401
Epoch 211/500
45s - loss: 0.0233 - val_loss: 0.0282
Epoch 212/500
44s - loss: 0.0197 - val_loss: 0.0124
Epoch 213/500
45s - loss: 0.0213 - val_loss: 0.0135
Epoch 214/500
44s - loss: 0.0223 - val_loss: 0.0168
Epoch 215/500
44s - loss: 0.0178 - val_loss: 0.0120
Epoch 216/500
44s - loss: 0.0169 - val_loss: 0.0155
Epoch 217/500
44s - loss: 0.0186 - val_loss: 0.0175
Epoch 218/500
45s - loss: 0.0217 - val_loss: 0.0183
Epoch 219/500
45s - loss: 0.0195 - val_loss: 0.0132
Epoch 220/500
45s - loss: 0.0187 - val_loss: 0.0105
Epoch 221/500
44s - loss: 0.0239 - val_loss: 0.0191
Epoch 222/500
44s - loss: 0.0216 - val_loss: 0.0125
Epoch 223/500
44s - loss: 0.0201 - val_loss: 0.0145
Epoch 224/500
44s - loss: 0.0168 - val_loss: 0.0151
Epoch 225/500
44s - loss: 0.0183 - val_loss: 0.0202
Epoch 226/500
45s - loss: 0.0164 - val_loss: 0.0145
Epoch 227/500
44s - loss: 0.0157 - val_loss: 0.0474
Epoch 228/500
44s - loss: 0.0205 - val_loss: 0.0139
Epoch 229/500
44s - loss: 0.0205 - val_loss: 0.0153
Epoch 230/500
44s - loss: 0.0162 - val_loss: 0.0163
Epoch 231/500
44s - loss: 0.0188 - val_loss: 0.0290
Epoch 232/500
45s - loss: 0.0161 - val_loss: 0.0117
Epoch 233/500
46s - loss: 0.0187 - val_loss: 0.0162
Epoch 234/500
45s - loss: 0.0172 - val_loss: 0.0190
Epoch 235/500
45s - loss: 0.0166 - val_loss: 0.0148
Epoch 236/500
45s - loss: 0.0204 - val_loss: 0.0117
Epoch 237/500
46s - loss: 0.0163 - val_loss: 0.0154
Epoch 238/500
45s - loss: 0.0144 - val_loss: 0.0160
Epoch 239/500
45s - loss: 0.0179 - val_loss: 0.0163
Epoch 240/500
46s - loss: 0.0175 - val_loss: 0.0212
Epoch 241/500
46s - loss: 0.0216 - val_loss: 0.0104
Epoch 242/500
46s - loss: 0.0173 - val_loss: 0.0128
Epoch 243/500
45s - loss: 0.0178 - val_loss: 0.0113
Epoch 244/500
46s - loss: 0.0122 - val_loss: 0.0150
Epoch 245/500
45s - loss: 0.0186 - val_loss: 0.0287
Epoch 246/500
45s - loss: 0.0160 - val_loss: 0.0109
Epoch 247/500
45s - loss: 0.0157 - val_loss: 0.0112
Epoch 248/500
45s - loss: 0.0180 - val_loss: 0.0170
Epoch 249/500
46s - loss: 0.0228 - val_loss: 0.0102
Epoch 250/500
45s - loss: 0.0176 - val_loss: 0.0122
Epoch 251/500
45s - loss: 0.0212 - val_loss: 0.0112
Epoch 252/500
45s - loss: 0.0206 - val_loss: 0.0156
Epoch 253/500
45s - loss: 0.0145 - val_loss: 0.0122
Epoch 254/500
46s - loss: 0.0172 - val_loss: 0.0115
Epoch 255/500
46s - loss: 0.0190 - val_loss: 0.0133
Epoch 256/500
46s - loss: 0.0177 - val_loss: 0.0200
Epoch 257/500
46s - loss: 0.0174 - val_loss: 0.0128
Epoch 258/500
46s - loss: 0.0164 - val_loss: 0.0130
Epoch 259/500
46s - loss: 0.0167 - val_loss: 0.0107
Epoch 260/500
45s - loss: 0.0169 - val_loss: 0.0181
Epoch 261/500
45s - loss: 0.0179 - val_loss: 0.0088
Epoch 262/500
45s - loss: 0.0139 - val_loss: 0.0117
Epoch 263/500
45s - loss: 0.0155 - val_loss: 0.0176
Epoch 264/500
45s - loss: 0.0200 - val_loss: 0.0345
Epoch 265/500
45s - loss: 0.0197 - val_loss: 0.0159
Epoch 266/500
45s - loss: 0.0218 - val_loss: 0.0167
Epoch 267/500
45s - loss: 0.0155 - val_loss: 0.0174
Epoch 268/500
45s - loss: 0.0160 - val_loss: 0.0096
Epoch 269/500
46s - loss: 0.0172 - val_loss: 0.0146
Epoch 270/500
46s - loss: 0.0195 - val_loss: 0.0162
Epoch 271/500
45s - loss: 0.0128 - val_loss: 0.0220
Epoch 272/500
45s - loss: 0.0172 - val_loss: 0.0199
Epoch 273/500
46s - loss: 0.0168 - val_loss: 0.0194
Epoch 274/500
46s - loss: 0.0136 - val_loss: 0.0139
Epoch 275/500
46s - loss: 0.0168 - val_loss: 0.0191
Epoch 276/500
46s - loss: 0.0157 - val_loss: 0.0224
Epoch 277/500
46s - loss: 0.0166 - val_loss: 0.0211
Epoch 278/500
46s - loss: 0.0225 - val_loss: 0.0117
Epoch 279/500
46s - loss: 0.0176 - val_loss: 0.0127
Epoch 280/500
46s - loss: 0.0145 - val_loss: 0.0170
Epoch 281/500
47s - loss: 0.0177 - val_loss: 0.0116
Epoch 282/500
46s - loss: 0.0202 - val_loss: 0.0118
Epoch 283/500
46s - loss: 0.0173 - val_loss: 0.0136
Epoch 284/500
46s - loss: 0.0160 - val_loss: 0.0147
Epoch 285/500
46s - loss: 0.0171 - val_loss: 0.0144
Epoch 286/500
46s - loss: 0.0147 - val_loss: 0.0136
Epoch 287/500
46s - loss: 0.0174 - val_loss: 0.0151
Epoch 288/500
46s - loss: 0.0180 - val_loss: 0.0121
Epoch 289/500
46s - loss: 0.0173 - val_loss: 0.0193
Epoch 290/500
46s - loss: 0.0152 - val_loss: 0.0203
Epoch 291/500
46s - loss: 0.0253 - val_loss: 0.0167
Epoch 292/500
46s - loss: 0.0165 - val_loss: 0.0148
Epoch 293/500
46s - loss: 0.0159 - val_loss: 0.0168
Epoch 294/500
46s - loss: 0.0215 - val_loss: 0.0134
Epoch 295/500
46s - loss: 0.0155 - val_loss: 0.0100
Epoch 296/500
46s - loss: 0.0183 - val_loss: 0.0131
Epoch 297/500
46s - loss: 0.0180 - val_loss: 0.0097
Epoch 298/500
46s - loss: 0.0162 - val_loss: 0.0117
Epoch 299/500
46s - loss: 0.0172 - val_loss: 0.0105
Epoch 300/500
47s - loss: 0.0149 - val_loss: 0.0134
Epoch 301/500
46s - loss: 0.0192 - val_loss: 0.0140
Epoch 302/500
46s - loss: 0.0154 - val_loss: 0.0135
Epoch 303/500
46s - loss: 0.0184 - val_loss: 0.0133
Epoch 304/500
46s - loss: 0.0172 - val_loss: 0.0122
Epoch 305/500
46s - loss: 0.0178 - val_loss: 0.0120
Epoch 306/500
46s - loss: 0.0130 - val_loss: 0.0201
Epoch 307/500
46s - loss: 0.0153 - val_loss: 0.0101
Epoch 308/500
46s - loss: 0.0179 - val_loss: 0.0132
Epoch 309/500
46s - loss: 0.0141 - val_loss: 0.0132
Epoch 310/500
46s - loss: 0.0155 - val_loss: 0.0173
Epoch 311/500
46s - loss: 0.0124 - val_loss: 0.0133
Epoch 312/500
46s - loss: 0.0166 - val_loss: 0.0277
Epoch 313/500
46s - loss: 0.0204 - val_loss: 0.0126
Epoch 314/500
46s - loss: 0.0166 - val_loss: 0.0154
Epoch 315/500
46s - loss: 0.0197 - val_loss: 0.0131
Epoch 316/500
46s - loss: 0.0145 - val_loss: 0.0105
Epoch 317/500
46s - loss: 0.0163 - val_loss: 0.0113
Epoch 318/500
46s - loss: 0.0170 - val_loss: 0.0102
Epoch 319/500
46s - loss: 0.0183 - val_loss: 0.0138
Epoch 320/500
46s - loss: 0.0230 - val_loss: 0.0125
Epoch 321/500
46s - loss: 0.0161 - val_loss: 0.0176
Epoch 322/500
46s - loss: 0.0147 - val_loss: 0.0201
Epoch 323/500
46s - loss: 0.0215 - val_loss: 0.0099
Epoch 324/500
46s - loss: 0.0150 - val_loss: 0.0191
Epoch 325/500
46s - loss: 0.0156 - val_loss: 0.0133
Epoch 326/500
46s - loss: 0.0189 - val_loss: 0.0136
Epoch 327/500
46s - loss: 0.0204 - val_loss: 0.0158
Epoch 328/500
46s - loss: 0.0172 - val_loss: 0.0135
Epoch 329/500
46s - loss: 0.0147 - val_loss: 0.0150
Epoch 330/500
46s - loss: 0.0175 - val_loss: 0.0109
Epoch 331/500
46s - loss: 0.0173 - val_loss: 0.0123
Epoch 332/500
46s - loss: 0.0119 - val_loss: 0.0125
Epoch 333/500
46s - loss: 0.0181 - val_loss: 0.0140
Epoch 334/500
46s - loss: 0.0173 - val_loss: 0.0137
Epoch 335/500
46s - loss: 0.0157 - val_loss: 0.0159
Epoch 336/500
46s - loss: 0.0126 - val_loss: 0.0204
Epoch 337/500
46s - loss: 0.0129 - val_loss: 0.0133
Epoch 338/500
46s - loss: 0.0157 - val_loss: 0.0121
Epoch 339/500
46s - loss: 0.0120 - val_loss: 0.0170
Epoch 340/500
46s - loss: 0.0149 - val_loss: 0.0149
Epoch 341/500
46s - loss: 0.0145 - val_loss: 0.0146
Epoch 342/500
46s - loss: 0.0147 - val_loss: 0.0132
Epoch 343/500
46s - loss: 0.0207 - val_loss: 0.0099
Epoch 344/500
46s - loss: 0.0135 - val_loss: 0.0144
Epoch 345/500
46s - loss: 0.0112 - val_loss: 0.0087
Epoch 346/500
46s - loss: 0.0230 - val_loss: 0.0147
Epoch 347/500
46s - loss: 0.0137 - val_loss: 0.0198
Epoch 348/500
46s - loss: 0.0178 - val_loss: 0.0142
Epoch 349/500
46s - loss: 0.0210 - val_loss: 0.0096
Epoch 350/500
46s - loss: 0.0125 - val_loss: 0.0165
Epoch 351/500
46s - loss: 0.0173 - val_loss: 0.0145
Epoch 352/500
46s - loss: 0.0158 - val_loss: 0.0097
Epoch 353/500
46s - loss: 0.0121 - val_loss: 0.0119
Epoch 354/500
46s - loss: 0.0218 - val_loss: 0.0108
Epoch 355/500
46s - loss: 0.0161 - val_loss: 0.0132
Epoch 356/500
46s - loss: 0.0145 - val_loss: 0.0123
Epoch 357/500
46s - loss: 0.0165 - val_loss: 0.0161
Epoch 358/500
46s - loss: 0.0139 - val_loss: 0.0155
Epoch 359/500
46s - loss: 0.0154 - val_loss: 0.0139
Epoch 360/500
46s - loss: 0.0137 - val_loss: 0.0134
Epoch 361/500
46s - loss: 0.0127 - val_loss: 0.0138
Epoch 362/500
46s - loss: 0.0146 - val_loss: 0.0117
Epoch 363/500
46s - loss: 0.0147 - val_loss: 0.0129
Epoch 364/500
46s - loss: 0.0176 - val_loss: 0.0104
Epoch 365/500
46s - loss: 0.0135 - val_loss: 0.0183
Epoch 366/500
46s - loss: 0.0213 - val_loss: 0.0128
Epoch 367/500
46s - loss: 0.0178 - val_loss: 0.0144
Epoch 368/500
46s - loss: 0.0119 - val_loss: 0.0162
Epoch 369/500
46s - loss: 0.0136 - val_loss: 0.0131
Epoch 370/500
46s - loss: 0.0139 - val_loss: 0.0158
Epoch 371/500
46s - loss: 0.0156 - val_loss: 0.0099
Epoch 372/500
46s - loss: 0.0187 - val_loss: 0.0115
Epoch 373/500
46s - loss: 0.0132 - val_loss: 0.0131
Epoch 374/500
46s - loss: 0.0152 - val_loss: 0.0154
Epoch 375/500
46s - loss: 0.0172 - val_loss: 0.0115
Epoch 376/500
46s - loss: 0.0160 - val_loss: 0.0193
Epoch 377/500
46s - loss: 0.0180 - val_loss: 0.0144
Epoch 378/500
46s - loss: 0.0128 - val_loss: 0.0118
Epoch 379/500
46s - loss: 0.0122 - val_loss: 0.0135
Epoch 380/500
46s - loss: 0.0139 - val_loss: 0.0102
Epoch 381/500
46s - loss: 0.0162 - val_loss: 0.0114
Epoch 382/500
46s - loss: 0.0137 - val_loss: 0.0122
Epoch 383/500
46s - loss: 0.0129 - val_loss: 0.0223
Epoch 384/500
46s - loss: 0.0171 - val_loss: 0.0142
Epoch 385/500
46s - loss: 0.0195 - val_loss: 0.0250
Epoch 386/500
46s - loss: 0.0175 - val_loss: 0.0101
Epoch 387/500
46s - loss: 0.0139 - val_loss: 0.0099
Epoch 388/500
46s - loss: 0.0166 - val_loss: 0.0134
Epoch 389/500
46s - loss: 0.0153 - val_loss: 0.0160
Epoch 390/500
46s - loss: 0.0158 - val_loss: 0.0197
Epoch 391/500
46s - loss: 0.0199 - val_loss: 0.0122
Epoch 392/500
46s - loss: 0.0122 - val_loss: 0.0134
Epoch 393/500
46s - loss: 0.0167 - val_loss: 0.0144
Epoch 394/500
46s - loss: 0.0125 - val_loss: 0.0090
Epoch 395/500
46s - loss: 0.0168 - val_loss: 0.0115
Epoch 396/500
46s - loss: 0.0151 - val_loss: 0.0145
Epoch 397/500
46s - loss: 0.0141 - val_loss: 0.0104
Epoch 398/500
46s - loss: 0.0157 - val_loss: 0.0140
Epoch 399/500
46s - loss: 0.0165 - val_loss: 0.0167
Epoch 400/500
47s - loss: 0.0183 - val_loss: 0.0118
Epoch 401/500
46s - loss: 0.0145 - val_loss: 0.0064
Epoch 402/500
46s - loss: 0.0163 - val_loss: 0.0087
Epoch 403/500
46s - loss: 0.0164 - val_loss: 0.0110
Epoch 404/500
46s - loss: 0.0136 - val_loss: 0.0083
Epoch 405/500
46s - loss: 0.0127 - val_loss: 0.0093
Epoch 406/500
46s - loss: 0.0167 - val_loss: 0.0097
Epoch 407/500
46s - loss: 0.0173 - val_loss: 0.0124
Epoch 408/500
46s - loss: 0.0117 - val_loss: 0.0133
Epoch 409/500
46s - loss: 0.0144 - val_loss: 0.0189
Epoch 410/500
46s - loss: 0.0173 - val_loss: 0.0115
Epoch 411/500
46s - loss: 0.0130 - val_loss: 0.0116
Epoch 412/500
46s - loss: 0.0119 - val_loss: 0.0179
Epoch 413/500
46s - loss: 0.0172 - val_loss: 0.0122
Epoch 414/500
46s - loss: 0.0133 - val_loss: 0.0166
Epoch 415/500
46s - loss: 0.0127 - val_loss: 0.0140
Epoch 416/500
46s - loss: 0.0149 - val_loss: 0.0146
Epoch 417/500
46s - loss: 0.0130 - val_loss: 0.0122
Epoch 418/500
46s - loss: 0.0127 - val_loss: 0.0092
Epoch 419/500
46s - loss: 0.0091 - val_loss: 0.0137
Epoch 420/500
46s - loss: 0.0130 - val_loss: 0.0144
Epoch 421/500
46s - loss: 0.0147 - val_loss: 0.0091
Epoch 422/500
46s - loss: 0.0146 - val_loss: 0.0087
Epoch 423/500
46s - loss: 0.0140 - val_loss: 0.0110
Epoch 424/500
46s - loss: 0.0151 - val_loss: 0.0081
Epoch 425/500
46s - loss: 0.0177 - val_loss: 0.0106
Epoch 426/500
46s - loss: 0.0147 - val_loss: 0.0134
Epoch 427/500
46s - loss: 0.0144 - val_loss: 0.0114
Epoch 428/500
46s - loss: 0.0136 - val_loss: 0.0141
Epoch 429/500
46s - loss: 0.0172 - val_loss: 0.0105
Epoch 430/500
46s - loss: 0.0143 - val_loss: 0.0177
Epoch 431/500
46s - loss: 0.0132 - val_loss: 0.0103
Epoch 432/500
46s - loss: 0.0125 - val_loss: 0.0110
Epoch 433/500
46s - loss: 0.0136 - val_loss: 0.0112
Epoch 434/500
46s - loss: 0.0134 - val_loss: 0.0110
Epoch 435/500
46s - loss: 0.0147 - val_loss: 0.0107
Epoch 436/500
46s - loss: 0.0181 - val_loss: 0.0095
Epoch 437/500
46s - loss: 0.0127 - val_loss: 0.0198
Epoch 438/500
46s - loss: 0.0224 - val_loss: 0.0131
Epoch 439/500
46s - loss: 0.0126 - val_loss: 0.0111
Epoch 440/500
46s - loss: 0.0161 - val_loss: 0.0081
Epoch 441/500
46s - loss: 0.0134 - val_loss: 0.0097
Epoch 442/500
46s - loss: 0.0130 - val_loss: 0.0111
Epoch 443/500
46s - loss: 0.0149 - val_loss: 0.0137
Epoch 444/500
46s - loss: 0.0130 - val_loss: 0.0158
Epoch 445/500
46s - loss: 0.0130 - val_loss: 0.0101
Epoch 446/500
46s - loss: 0.0134 - val_loss: 0.0127
Epoch 447/500
46s - loss: 0.0166 - val_loss: 0.0100
Epoch 448/500
46s - loss: 0.0138 - val_loss: 0.0109
Epoch 449/500
46s - loss: 0.0188 - val_loss: 0.0116
Epoch 450/500
46s - loss: 0.0140 - val_loss: 0.0159
Epoch 451/500
45s - loss: 0.0169 - val_loss: 0.0093
Epoch 452/500
46s - loss: 0.0131 - val_loss: 0.0102
Epoch 453/500
46s - loss: 0.0175 - val_loss: 0.0154
Epoch 454/500
46s - loss: 0.0130 - val_loss: 0.0094
Epoch 455/500
46s - loss: 0.0112 - val_loss: 0.0091
Epoch 456/500
46s - loss: 0.0127 - val_loss: 0.0120
Epoch 457/500
46s - loss: 0.0170 - val_loss: 0.0128
Epoch 458/500
46s - loss: 0.0115 - val_loss: 0.0128
Epoch 459/500
46s - loss: 0.0176 - val_loss: 0.0096
Epoch 460/500
46s - loss: 0.0113 - val_loss: 0.0135
Epoch 461/500
46s - loss: 0.0113 - val_loss: 0.0128
Epoch 462/500
46s - loss: 0.0143 - val_loss: 0.0142
Epoch 463/500
46s - loss: 0.0161 - val_loss: 0.0112
Epoch 464/500
46s - loss: 0.0163 - val_loss: 0.0100
Epoch 465/500
46s - loss: 0.0163 - val_loss: 0.0282
Epoch 466/500
46s - loss: 0.0206 - val_loss: 0.0111
Epoch 467/500
46s - loss: 0.0140 - val_loss: 0.0123
Epoch 468/500
46s - loss: 0.0157 - val_loss: 0.0111
Epoch 469/500
46s - loss: 0.0143 - val_loss: 0.0132
Epoch 470/500
46s - loss: 0.0137 - val_loss: 0.0086
Epoch 471/500
46s - loss: 0.0140 - val_loss: 0.0094
Epoch 472/500
46s - loss: 0.0175 - val_loss: 0.0137
Epoch 473/500
46s - loss: 0.0145 - val_loss: 0.0113
Epoch 474/500
46s - loss: 0.0160 - val_loss: 0.0106
Epoch 475/500
46s - loss: 0.0151 - val_loss: 0.0104
Epoch 476/500
46s - loss: 0.0166 - val_loss: 0.0119
Epoch 477/500
46s - loss: 0.0190 - val_loss: 0.0131
Epoch 478/500
46s - loss: 0.0154 - val_loss: 0.0094
Epoch 479/500
46s - loss: 0.0143 - val_loss: 0.0086
Epoch 480/500
46s - loss: 0.0117 - val_loss: 0.0083
Epoch 481/500
46s - loss: 0.0131 - val_loss: 0.0101
Epoch 482/500
46s - loss: 0.0146 - val_loss: 0.0119
Epoch 483/500
46s - loss: 0.0122 - val_loss: 0.0119
Epoch 484/500
46s - loss: 0.0186 - val_loss: 0.0114
Epoch 485/500
46s - loss: 0.0141 - val_loss: 0.0093
Epoch 486/500
46s - loss: 0.0153 - val_loss: 0.0117
Epoch 487/500
46s - loss: 0.0171 - val_loss: 0.0105
Epoch 488/500
46s - loss: 0.0133 - val_loss: 0.0082
Epoch 489/500
46s - loss: 0.0134 - val_loss: 0.0130
Epoch 490/500
46s - loss: 0.0130 - val_loss: 0.0136
Epoch 491/500
46s - loss: 0.0127 - val_loss: 0.0106
Epoch 492/500
46s - loss: 0.0140 - val_loss: 0.0120
Epoch 493/500
46s - loss: 0.0127 - val_loss: 0.0102
Epoch 494/500
46s - loss: 0.0142 - val_loss: 0.0069
Epoch 495/500
46s - loss: 0.0139 - val_loss: 0.0081
Epoch 496/500
46s - loss: 0.0136 - val_loss: 0.0092
Epoch 497/500
46s - loss: 0.0114 - val_loss: 0.0122
Epoch 498/500
46s - loss: 0.0146 - val_loss: 0.0082
Epoch 499/500
46s - loss: 0.0154 - val_loss: 0.0089
Epoch 500/500
47s - loss: 0.0111 - val_loss: 0.0119
Out[18]:
<keras.callbacks.History at 0x7f8644454fd0>
In [19]:
model.save('../models/super_resolution_deconvolution.h5')

Now let's visualize the results of the super-resolutions produced by the network. In the following figure, the first, second and third columns contains respectively low resolution image, corresponding high resolution image and result of the super-resolution on the low resolution image.

In [20]:
small_img, big_img = next(valid_data_gen)
big_img_pred       = model.predict(small_img, batch_size = batch_size)
visualize([small_img, big_img_pred, big_img], batch_size)
/home/rodgzilla/Documents/machine_learning/keras/keras/backend/tensorflow_backend.py:2252: UserWarning: Expected no kwargs, you passed 1
kwargs passed to function are ignored with Tensorflow backend
  warnings.warn('\n'.join(msg))