This Python notebook implements a Relativistic average GAN (RaGAN) using custom layers in Keras.
If you are using Google Colab, please restart your runtime after installing pydot and graphviz.
!apt install graphviz
!pip install -q pydot
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
from keras.layers import Input, Dense, Lambda, Conv2D, Conv2DTranspose, Activation, LeakyReLU, Concatenate
from keras.layers import BatchNormalization, GlobalAveragePooling2D, Reshape
import keras.backend as K
from keras.models import Model
from keras.utils import plot_model
from keras.optimizers import Adam
from keras.utils.generic_utils import Progbar
from time import time
import sys
import os
import pickle
Using TensorFlow backend.
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot
EPOCHS = 100
BATCHSIZE = 64
TRAINING_RATIO = 1
DATASET = 'fashion_mnist'  # 'mnist', 'fashion_mnist', 'cifar10'
LOSS = 'BXE'  # 'BXE', 'LS'
GENERATE_ROW_NUM = 10
OPT = Adam(lr=0.0002, beta_1=0.5, beta_2=0.999)
from keras.datasets import mnist, fashion_mnist, cifar10
dataset_loaders = {'mnist': mnist, 'fashion_mnist': fashion_mnist, 'cifar10': cifar10}
(X_train, y_train), (X_test, y_test) = dataset_loaders[DATASET].load_data()
X = np.concatenate((X_train, X_test))
if len(X.shape)==3:
    X = np.expand_dims(X, axis=-1)
X = X/255*2-1
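As a quick sanity check on the rescaling above (a minimal sketch with made-up pixel values), the transform maps 0 to -1 and 255 to 1, which matches the generator's tanh output range:

```python
import numpy as np

# Toy stand-in for uint8 image data in [0, 255] (hypothetical values,
# just to illustrate the X/255*2-1 rescaling used above).
pixels = np.array([0, 127, 255], dtype=np.float32)

scaled = pixels / 255 * 2 - 1  # maps [0, 255] -> [-1, 1]
print(scaled)  # endpoints land exactly on -1 and 1; 127 maps to about -0.0039
```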
def DC_Generator(input_shape=(128,), output_shape=(28,28,1), dc_shape=(7,7,128), name='Generator'):
    layer_num = int(np.log2(output_shape[1]/dc_shape[1]))
    z = Input(shape=input_shape)
    h = Dense(dc_shape[0]*dc_shape[1]*dc_shape[2], activation='relu', kernel_initializer='glorot_uniform')(z)
    h = Reshape(dc_shape)(h)
    for i in range(layer_num):
        h = Conv2DTranspose(int(dc_shape[2]/(2**(i+1))), kernel_size=4, strides=2, padding='same', activation='relu', kernel_initializer='glorot_uniform')(h)
        h = BatchNormalization(momentum=0.9, epsilon=0.00002)(h)
    x = Conv2DTranspose(output_shape[-1], kernel_size=3, strides=1, padding='same', activation='tanh', kernel_initializer='glorot_uniform')(h)
    model = Model(z, x, name=name)
    model.summary()
    return model
def DC_Discriminator(input_shape=(28,28,1), layer_num=2, start_dim=64, name='Discriminator'):
    x = Input(shape=input_shape)
    h = x
    for i in range(layer_num):
        h = Conv2D(start_dim*(2**i), kernel_size=4, strides=2, padding='same', kernel_initializer='glorot_uniform')(h)
        h = LeakyReLU(0.1)(h)
    h = GlobalAveragePooling2D()(h)
    y = Dense(1, kernel_initializer='glorot_uniform')(h)
    model = Model(x, y, name=name)
    model.summary()
    return model
if X.shape[2] == 28:
    dc_shape = (7,7,128)
    dis_layer_num = 2
else:
    dc_shape = (4,4,512)
    dis_layer_num = 4
generator = DC_Generator(output_shape=X.shape[1:], dc_shape=dc_shape)
discriminator = DC_Discriminator(input_shape=X.shape[1:], layer_num=dis_layer_num)
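The number of upsampling layers in the generator follows from the ratio between the target resolution and the starting `dc_shape`: each `Conv2DTranspose` with `strides=2` doubles the spatial size. A small check of that arithmetic for the two configurations chosen above:

```python
import numpy as np

def upsample_layers(output_size, start_size):
    # Each strides=2 Conv2DTranspose doubles the spatial size,
    # so we need log2(output/start) doublings.
    return int(np.log2(output_size / start_size))

print(upsample_layers(28, 7))   # 2 layers: 7 -> 14 -> 28 (mnist / fashion_mnist)
print(upsample_layers(32, 4))   # 3 layers: 4 -> 8 -> 16 -> 32 (cifar10)
```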
Generator summary (Total params: 973,697; Trainable: 973,505; Non-trainable: 192):
  input_1 (InputLayer)           (None, 128)
  dense_1 (Dense)                (None, 6272)         809088
  reshape_1 (Reshape)            (None, 7, 7, 128)
  conv2d_transpose_1             (None, 14, 14, 64)   131136
  batch_normalization_1          (None, 14, 14, 64)   256
  conv2d_transpose_2             (None, 28, 28, 32)   32800
  batch_normalization_2          (None, 28, 28, 32)   128
  conv2d_transpose_3             (None, 28, 28, 1)    289

Discriminator summary (Total params: 132,417; all trainable):
  input_2 (InputLayer)           (None, 28, 28, 1)
  conv2d_1 (Conv2D)              (None, 14, 14, 64)   1088
  leaky_re_lu_1 (LeakyReLU)      (None, 14, 14, 64)
  conv2d_2 (Conv2D)              (None, 7, 7, 128)    131200
  leaky_re_lu_2 (LeakyReLU)      (None, 7, 7, 128)
  global_average_pooling2d_1     (None, 128)
  dense_2 (Dense)                (None, 1)            129
def relativistic_average(input_):
    x_0 = input_[0]
    x_1 = input_[1]
    return x_0 - K.mean(x_1, axis=0)
x_0 = Input(shape=(1,))
x_1 = Input(shape=(1,))
y = Lambda(relativistic_average)([x_0, x_1])
model = Model([x_0,x_1],y)
model.summary()
pre = model.predict([np.array([[9],[4],[8],[7]]), np.array([[5],[5],[6],[6]])])
print('lambda out:', pre)
print('numpy out:', np.array([[9],[4],[8],[7]])-np.mean(np.array([[5],[5],[6],[6]])))
Lambda model summary: input_3 (None, 1) and input_4 (None, 1) -> lambda_1 (None, 1); 0 params.
lambda out: [[ 3.5] [-1.5] [ 2.5] [ 1.5]]
numpy out:  [[ 3.5] [-1.5] [ 2.5] [ 1.5]]
Real_image = Input(shape=X.shape[1:])
Noise_input = Input(shape=(128,))
Fake_image = generator(Noise_input)
Discriminator_real_out = discriminator(Real_image)
Discriminator_fake_out = discriminator(Fake_image)
Real_Fake_relativistic_average_out = Lambda(relativistic_average, name='Real_minus_mean_fake')([Discriminator_real_out, Discriminator_fake_out])
Fake_Real_relativistic_average_out = Lambda(relativistic_average, name='Fake_minus_mean_real')([Discriminator_fake_out, Discriminator_real_out])
These two outputs correspond to $$\tilde{D}(x_{real})=\text{sigmoid}(C(x_{real})-\mathbb{E}_{x_{fake}\sim\mathbb{P}_{fake}}C(x_{fake}))$$ and $$\tilde{D}(x_{fake})=\text{sigmoid}(C(x_{fake})-\mathbb{E}_{x_{real}\sim\mathbb{P}_{real}}C(x_{real}))$$ (the sigmoid is applied in the next cell, and only for the BXE loss).
if LOSS=='BXE':
    Real_Fake_relativistic_average_out = Activation('sigmoid')(Real_Fake_relativistic_average_out)
    Fake_Real_relativistic_average_out = Activation('sigmoid')(Fake_Real_relativistic_average_out)
Discriminator_Relativistic_out = Concatenate()([Real_Fake_relativistic_average_out, Fake_Real_relativistic_average_out])
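The relativistic outputs can be sketched numerically with made-up critic scores (pure NumPy, independent of the Keras graph; `C_real` and `C_fake` are hypothetical values):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Hypothetical critic scores C(x) for a small batch of real and fake samples.
C_real = np.array([2.0, 1.0, 3.0])
C_fake = np.array([-1.0, 0.0, -2.0])

# Relativistic average outputs, mirroring the two Lambda layers above.
D_real = sigmoid(C_real - C_fake.mean())  # real score relative to the average fake
D_fake = sigmoid(C_fake - C_real.mean())  # fake score relative to the average real

print(D_real)  # above 0.5 when reals score higher than the average fake
print(D_fake)  # below 0.5 when fakes score lower than the average real
```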
If we use the original GAN loss (BXE), we use Keras's default 'binary_crossentropy' loss. If we use the Least Squares GAN loss, we use Keras's default 'mean_squared_error' loss.
if LOSS=='BXE':
    LOSS = 'binary_crossentropy'
elif LOSS=='LS':
    LOSS = 'mean_squared_error'
Remember to set the discriminator to non-trainable before compiling the generator-training model.
generator_train = Model([Noise_input, Real_image], Discriminator_Relativistic_out)
discriminator.trainable=False
generator_train.compile(OPT, loss=LOSS)
generator_train.summary()
SVG(model_to_dot(generator_train, show_shapes=True).create(prog='dot', format='svg'))
generator_train summary: input_6 (None, 128) and input_5 (None, 28, 28, 1) feed the shared Generator (973,697 params) and Discriminator (132,417 params), whose two outputs pass through the Real_minus_mean_fake and Fake_minus_mean_real Lambda layers, sigmoid activations, and a Concatenate to give a (None, 2) output. Total params: 1,106,114; Trainable: 973,505 (generator only); Non-trainable: 132,609.
Remember to make the discriminator trainable and the generator non-trainable before compiling the discriminator-training model.
discriminator_train = Model([Noise_input, Real_image],Discriminator_Relativistic_out)
generator.trainable = False
discriminator.trainable=True
discriminator_train.summary()
discriminator_train.compile(OPT, loss=LOSS)
SVG(model_to_dot(discriminator_train, show_shapes=True).create(prog='dot', format='svg'))
discriminator_train summary: same graph as generator_train, with the trainability flipped. Total params: 1,106,114; Trainable: 132,417 (discriminator only); Non-trainable: 973,697.
if LOSS=='binary_crossentropy':
    true_y = np.ones((BATCHSIZE, 1), dtype=np.float32)
    fake_y = np.zeros((BATCHSIZE, 1), dtype=np.float32)
    y_for_dis = np.concatenate((true_y, fake_y), axis=1)
    y_for_gen = np.concatenate((fake_y, true_y), axis=1)
if LOSS=='mean_squared_error':
    true_y = np.ones((BATCHSIZE, 1), dtype=np.float32)
    fake_y = -np.ones((BATCHSIZE, 1), dtype=np.float32)
    y_for_dis = np.concatenate((true_y, fake_y), axis=1)
    y_for_gen = np.concatenate((fake_y, true_y), axis=1)
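For BXE, the discriminator's target row is [1, 0] (push the real-relative output up, the fake-relative output down) and the generator's is the reverse. A minimal NumPy check of the label construction above, using a tiny stand-in batch size instead of 64:

```python
import numpy as np

BATCH = 4  # small stand-in for BATCHSIZE, just for illustration
true_y = np.ones((BATCH, 1), dtype=np.float32)
fake_y = np.zeros((BATCH, 1), dtype=np.float32)

y_for_dis = np.concatenate((true_y, fake_y), axis=1)  # each row is [1, 0]
y_for_gen = np.concatenate((fake_y, true_y), axis=1)  # each row is [0, 1]

print(y_for_dis.shape)  # (4, 2): one 2-column target row per sample
print(y_for_dis[0], y_for_gen[0])
```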
GENERATE_BATCHSIZE = GENERATE_ROW_NUM*GENERATE_ROW_NUM
test_noise = np.random.randn(GENERATE_BATCHSIZE, 128)
discriminator_loss = list()
generator_loss = list()
for epoch in range(EPOCHS):
    np.random.shuffle(X)
    print("epoch {} of {}".format(epoch+1, EPOCHS))
    num_batches = int(X.shape[0] // BATCHSIZE)
    minibatches_size = BATCHSIZE * (TRAINING_RATIO+1)
    print("number of batches: {}".format(int(X.shape[0] // minibatches_size)))
    progress_bar = Progbar(target=int(X.shape[0] // minibatches_size))
    plt.clf()
    start_time = time()
    for index in range(int(X.shape[0] // minibatches_size)):
        progress_bar.update(index)
        iteration_minibatches = X[index * minibatches_size:(index + 1) * minibatches_size]
        # Train the discriminator TRAINING_RATIO times per generator update.
        for j in range(TRAINING_RATIO):
            image_batch = iteration_minibatches[j * BATCHSIZE : (j + 1) * BATCHSIZE]
            noise = np.random.randn(BATCHSIZE, 128).astype(np.float32)
            discriminator.trainable = True
            generator.trainable = False
            discriminator_loss.append(discriminator_train.train_on_batch([noise, image_batch], y_for_dis))
        # Then train the generator on the remaining minibatch.
        image_batch = iteration_minibatches[TRAINING_RATIO*BATCHSIZE : (TRAINING_RATIO + 1) * BATCHSIZE]
        noise = np.random.randn(BATCHSIZE, 128).astype(np.float32)
        discriminator.trainable = False
        generator.trainable = True
        generator_loss.append(generator_train.train_on_batch([noise, image_batch], y_for_gen))
    print('\nepoch time: {}'.format(time()-start_time))
    generated_image = generator.predict(test_noise)
    generated_image = (generated_image+1)/2  # map tanh output back to [0, 1] for plotting
    # Tile the generated samples into a GENERATE_ROW_NUM x GENERATE_ROW_NUM grid.
    for i in range(GENERATE_ROW_NUM):
        if X.shape[3]==1:
            new = generated_image[i*GENERATE_ROW_NUM:(i+1)*GENERATE_ROW_NUM].reshape(X.shape[2]*GENERATE_ROW_NUM, X.shape[2])
        else:
            new = generated_image[i*GENERATE_ROW_NUM:(i+1)*GENERATE_ROW_NUM].reshape(X.shape[2]*GENERATE_ROW_NUM, X.shape[2], 3)
        if i != 0:
            old = np.concatenate((old, new), axis=1)
        else:
            old = new
    print('plot generated_image')
    plt.figure()
    if X.shape[-1]==1:
        plt.imshow(old, cmap='gray')
    else:
        plt.imshow(old)
    print('plot Loss')
    plt.figure()
    plt.plot(discriminator_loss)
    plt.plot(generator_loss)
    plt.legend(['discriminator', 'generator'])
    plt.show()
epoch 1 of 100
number of batches: 546
545/546 [============================>.] - ETA: 0s
epoch time: 16.89719533920288
plot generated_image
plot Loss