Deep Learning Models -- A collection of various deep learning architectures, models, and tips for TensorFlow and PyTorch in Jupyter Notebooks.

In [1]:
%load_ext watermark
%watermark -a 'Sebastian Raschka' -v -p torch
Sebastian Raschka 

CPython 3.7.3
IPython 7.9.0

torch 1.4.0
  • Runs on CPU or GPU (if available)

Deep Convolutional GAN (for CelebA Face Images)

Implementation of a deep convolutional GAN (DCGAN) that generates new faces after being trained on example face images from CelebA (http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html).

This DCGAN architecture is based on Radford et al.'s Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks [1], where the generator consists of

  • transposed convolutional layers
  • BatchNorm
  • ReLU

and the discriminator consists of

  • strided convolutional layers (no maxpooling)
  • BatchNorm
  • Leaky ReLU
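
The spatial sizes annotated in the model code below (4x4 -> 8x8 -> ... -> 64x64 in the generator) follow directly from the ConvTranspose2d output-size formula, out = (in - 1)*stride - 2*padding + kernel_size. A minimal sketch using the kernel/stride/padding settings of this notebook:

def conv_transpose_out(size, kernel_size=4, stride=2, padding=1):
    # output size of ConvTranspose2d (no output_padding, no dilation)
    return (size - 1)*stride - 2*padding + kernel_size

size = 1  # the latent vector z enters as a 1x1 "image" with LATENT_DIM channels
size = conv_transpose_out(size, stride=1, padding=0)  # 1 -> 4
for _ in range(4):
    size = conv_transpose_out(size)  # 4 -> 8 -> 16 -> 32 -> 64
print(size)  # 64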

References

[1] Radford, A., Metz, L., & Chintala, S. (2016). Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks. ICLR 2016. https://arxiv.org/abs/1511.06434

Imports

In [2]:
import time
import os
import numpy as np
import torch
import random

import torch.nn.functional as F
import torch.nn as nn
import torchvision.utils as vutils

from PIL import Image
from torch.utils.data import Dataset
from torch.utils.data import DataLoader
from torchvision import transforms


if torch.cuda.is_available():
    torch.backends.cudnn.deterministic = True
In [3]:
import matplotlib.pyplot as plt
%matplotlib inline

Settings

In [4]:
##########################
### SETTINGS
##########################

# Device
CUDA = 'cuda:0'
DEVICE = torch.device(CUDA if torch.cuda.is_available() else "cpu")

# Hyperparameters
RANDOM_SEED = 42
GENERATOR_LEARNING_RATE = 0.0002
DISCRIMINATOR_LEARNING_RATE = 0.0002
NUM_EPOCHS = 50
BATCH_SIZE = 128
NUM_WORKERS = 4 # workers for data loader

IMAGE_SIZE = (64, 64, 3)

# Size of the latent vector
LATENT_DIM = 100

# Number of feature maps in generator and discriminator
NUM_MAPS_GEN = 64
NUM_MAPS_DIS = 64

# Set random seeds for reproducibility
random.seed(RANDOM_SEED)
np.random.seed(RANDOM_SEED)
torch.manual_seed(RANDOM_SEED);

CelebA

Download the "Align&Cropped Images" dataset from http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html.

Then, unzip the dataset. Since GANs are a method for unsupervised learning, we are only interested in the images (saved as .jpg files in the img_align_celeba subfolder) -- we don't need the class labels here. There are 202,599 images in total.
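
A quick sanity check after unzipping (a small sketch; it assumes the archive was extracted into a celeba/img_align_celeba folder next to this notebook):

import os

img_dir = os.path.join('celeba', 'img_align_celeba')
num_jpgs = len([f for f in os.listdir(img_dir) if f.endswith('.jpg')])
print(num_jpgs)  # should print 202599 if the full dataset was extracted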

In [5]:
img = Image.open(os.path.join('celeba', 'img_align_celeba', '125696.jpg'))
print(np.asarray(img, dtype=np.uint8).shape)
plt.imshow(img);
(218, 178, 3)

Dataloaders

In [6]:
class CelebaDataset(Dataset):
    """Custom Dataset for loading CelebA face images"""

    def __init__(self, img_dir, transform=None):
    
        self.img_dir = img_dir
        
        self.img_names = [i for i in 
                          os.listdir(img_dir) 
                          if i.endswith('.jpg')]

        self.transform = transform

    def __getitem__(self, index):
        img = Image.open(os.path.join(self.img_dir,
                                      self.img_names[index]))
        
        if self.transform is not None:
            img = self.transform(img)
        
        return img

    def __len__(self):
        return len(self.img_names)
In [7]:
data_transforms = {
    'train': transforms.Compose([
        #transforms.RandomRotation(5),
        #transforms.RandomHorizontalFlip(),
        transforms.RandomResizedCrop(IMAGE_SIZE[0], scale=(0.96, 1.0), ratio=(0.95, 1.05)),
        transforms.ToTensor(),
        # normalize images to [-1, 1] range
        transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
    ]),
    'valid': transforms.Compose([
        transforms.Resize([IMAGE_SIZE[0], IMAGE_SIZE[1]]),
        transforms.ToTensor(),
        # normalize images to [-1, 1] range
        transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
    ]),
}


train_dataset = CelebaDataset(img_dir=os.path.join('celeba', 'img_align_celeba'), 
                                transform=data_transforms['train'])

train_loader = DataLoader(dataset=train_dataset, 
                          batch_size=BATCH_SIZE,
                          drop_last=True,
                          num_workers=NUM_WORKERS,
                          shuffle=True)

# We don't need validation and test sets for GANs, which are unsupervised models
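
Since the Normalize transform above maps pixel values from [0, 1] to [-1, 1] via (x - 0.5)/0.5, the real images end up in the same value range as the generator's Tanh output. A quick check (a sketch, assuming the train_loader defined above):

batch = next(iter(train_loader))
print(batch.shape)                             # torch.Size([128, 3, 64, 64])
print(batch.min().item(), batch.max().item())  # roughly -1.0 and 1.0
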
In [8]:
real_batch = next(iter(train_loader))
plt.figure(figsize=(8, 8))
plt.axis("off")
plt.title("Training Images")
plt.imshow(np.transpose(vutils.make_grid(real_batch[:64], 
                                         padding=2, normalize=True),
                        (1, 2, 0)))
Out[8]:
<matplotlib.image.AxesImage at 0x7f5a60376d68>

Model

In [9]:
# Some model code is loosely inspired by 
# https://pytorch.org/tutorials/beginner/dcgan_faces_tutorial.html

def weights_init(module):
    """
    Function that initializes weights according to
    Radford et al.'s DCGAN paper
    """
    classname = module.__class__.__name__
    if classname.find('Conv') != -1:
        nn.init.normal_(module.weight.data, 0.0, 0.02)
    elif classname.find('BatchNorm') != -1:
        nn.init.normal_(module.weight.data, 1.0, 0.02)
        nn.init.constant_(module.bias.data, 0)
In [10]:
##########################
### MODEL
##########################

class DCGAN(torch.nn.Module):

    def __init__(self):
        super(DCGAN, self).__init__()
        
        
        self.generator = nn.Sequential(
            #
            # input size: vector z of size LATENT_DIM
            #
            nn.ConvTranspose2d(LATENT_DIM, NUM_MAPS_GEN*8, 
                               kernel_size=4, stride=1, padding=0,
                               bias=False), # bias is redundant when using BatchNorm
            nn.BatchNorm2d(NUM_MAPS_GEN*8),
            nn.ReLU(True),
            #
            # size: NUM_MAPS_GEN*8 x 4 x 4
            #
            nn.ConvTranspose2d(NUM_MAPS_GEN*8, NUM_MAPS_GEN*4, 
                               kernel_size=4, stride=2, padding=1,
                               bias=False),
            nn.BatchNorm2d(NUM_MAPS_GEN*4),
            nn.ReLU(True),
            #
            # size: NUM_MAPS_GEN*4 x 8 x 8
            #
            nn.ConvTranspose2d(NUM_MAPS_GEN*4, NUM_MAPS_GEN*2, 
                               kernel_size=4, stride=2, padding=1,
                               bias=False),
            nn.BatchNorm2d(NUM_MAPS_GEN*2),
            nn.ReLU(True),
            #
            # size: NUM_MAPS_GEN*2 x 16 x 16
            #
            nn.ConvTranspose2d(NUM_MAPS_GEN*2, NUM_MAPS_GEN, 
                               kernel_size=4, stride=2, padding=1,
                               bias=False),
            nn.BatchNorm2d(NUM_MAPS_GEN),
            nn.ReLU(True),   
            #
            # size: NUM_MAPS_GEN x 32 x 32
            #
            nn.ConvTranspose2d(NUM_MAPS_GEN, IMAGE_SIZE[2], 
                               kernel_size=4, stride=2, padding=1,
                               bias=False),
            #
            # size: IMAGE_SIZE[2] x 64 x 64
            #  
            nn.Tanh()
        )
        
        self.discriminator = nn.Sequential(
            #
            # input size IMAGE_SIZE[2] x IMAGE_SIZE[0] x IMAGE_SIZE[1]
            #
            nn.Conv2d(IMAGE_SIZE[2], NUM_MAPS_DIS,
                      kernel_size=4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            #
            # size: NUM_MAPS_DIS x 32 x 32
            #              
            nn.Conv2d(NUM_MAPS_DIS, NUM_MAPS_DIS*2,
                      kernel_size=4, stride=2, padding=1,
                      bias=False),        
            nn.BatchNorm2d(NUM_MAPS_DIS*2),
            nn.LeakyReLU(0.2, inplace=True),
            #
            # size: NUM_MAPS_DIS*2 x 16 x 16
            #   
            nn.Conv2d(NUM_MAPS_DIS*2, NUM_MAPS_DIS*4,
                      kernel_size=4, stride=2, padding=1,
                      bias=False),        
            nn.BatchNorm2d(NUM_MAPS_DIS*4),
            nn.LeakyReLU(0.2, inplace=True),
            #
            # size: NUM_MAPS_DIS*4 x 8 x 8
            #   
            nn.Conv2d(NUM_MAPS_DIS*4, NUM_MAPS_DIS*8,
                      kernel_size=4, stride=2, padding=1,
                      bias=False),        
            nn.BatchNorm2d(NUM_MAPS_DIS*8),
            nn.LeakyReLU(0.2, inplace=True),
            #
            # size: NUM_MAPS_DIS*8 x 4 x 4
            #   
            nn.Conv2d(NUM_MAPS_DIS*8, 1,
                      kernel_size=4, stride=1, padding=0),
            nn.Sigmoid()
        )

            
    def generator_forward(self, z):
        img = self.generator(z)
        return img
    
    def discriminator_forward(self, img):
        pred = self.discriminator(img)
        return pred
In [11]:
torch.manual_seed(RANDOM_SEED)

loss_function = nn.BCELoss()

real_label = 1
fake_label = 0

# Batch of latent (noise) vectors for
# evaluating / visualizing the training progress
# of the generator
fixed_noise = torch.randn(64, LATENT_DIM, 1, 1, device=DEVICE)

model = DCGAN()
model = model.to(DEVICE)
model.apply(weights_init)

print(model)
DCGAN(
  (generator): Sequential(
    (0): ConvTranspose2d(100, 512, kernel_size=(4, 4), stride=(1, 1), bias=False)
    (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (2): ReLU(inplace=True)
    (3): ConvTranspose2d(512, 256, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (4): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (5): ReLU(inplace=True)
    (6): ConvTranspose2d(256, 128, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (7): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (8): ReLU(inplace=True)
    (9): ConvTranspose2d(128, 64, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (10): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (11): ReLU(inplace=True)
    (12): ConvTranspose2d(64, 3, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (13): Tanh()
  )
  (discriminator): Sequential(
    (0): Conv2d(3, 64, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
    (1): LeakyReLU(negative_slope=0.2, inplace=True)
    (2): Conv2d(64, 128, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (3): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (4): LeakyReLU(negative_slope=0.2, inplace=True)
    (5): Conv2d(128, 256, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (6): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (7): LeakyReLU(negative_slope=0.2, inplace=True)
    (8): Conv2d(256, 512, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (9): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (10): LeakyReLU(negative_slope=0.2, inplace=True)
    (11): Conv2d(512, 1, kernel_size=(4, 4), stride=(1, 1))
    (12): Sigmoid()
  )
)
In [12]:
from torchsummary import summary

# torchsummary can only use default cuda device, which
# causes issues if e.g., cuda:1 is used

with torch.cuda.device(int(CUDA.split(':')[-1])):
    summary(model.generator, input_size=(100, 1, 1), device='cuda')
    summary(model.discriminator, input_size=(IMAGE_SIZE[2], IMAGE_SIZE[0], IMAGE_SIZE[1]), device='cuda')
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
   ConvTranspose2d-1            [-1, 512, 4, 4]         819,200
       BatchNorm2d-2            [-1, 512, 4, 4]           1,024
              ReLU-3            [-1, 512, 4, 4]               0
   ConvTranspose2d-4            [-1, 256, 8, 8]       2,097,152
       BatchNorm2d-5            [-1, 256, 8, 8]             512
              ReLU-6            [-1, 256, 8, 8]               0
   ConvTranspose2d-7          [-1, 128, 16, 16]         524,288
       BatchNorm2d-8          [-1, 128, 16, 16]             256
              ReLU-9          [-1, 128, 16, 16]               0
  ConvTranspose2d-10           [-1, 64, 32, 32]         131,072
      BatchNorm2d-11           [-1, 64, 32, 32]             128
             ReLU-12           [-1, 64, 32, 32]               0
  ConvTranspose2d-13            [-1, 3, 64, 64]           3,072
             Tanh-14            [-1, 3, 64, 64]               0
================================================================
Total params: 3,576,704
Trainable params: 3,576,704
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.00
Forward/backward pass size (MB): 3.00
Params size (MB): 13.64
Estimated Total Size (MB): 16.64
----------------------------------------------------------------
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1           [-1, 64, 32, 32]           3,136
         LeakyReLU-2           [-1, 64, 32, 32]               0
            Conv2d-3          [-1, 128, 16, 16]         131,072
       BatchNorm2d-4          [-1, 128, 16, 16]             256
         LeakyReLU-5          [-1, 128, 16, 16]               0
            Conv2d-6            [-1, 256, 8, 8]         524,288
       BatchNorm2d-7            [-1, 256, 8, 8]             512
         LeakyReLU-8            [-1, 256, 8, 8]               0
            Conv2d-9            [-1, 512, 4, 4]       2,097,152
      BatchNorm2d-10            [-1, 512, 4, 4]           1,024
        LeakyReLU-11            [-1, 512, 4, 4]               0
           Conv2d-12              [-1, 1, 1, 1]           8,193
          Sigmoid-13              [-1, 1, 1, 1]               0
================================================================
Total params: 2,765,633
Trainable params: 2,765,633
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.05
Forward/backward pass size (MB): 2.31
Params size (MB): 10.55
Estimated Total Size (MB): 12.91
----------------------------------------------------------------
In [13]:
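# Note: betas=(0.5, 0.999) follows the DCGAN paper, which recommends lowering
# the Adam beta1 momentum term from its 0.9 default to 0.5 for more stable GAN training.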
optim_gener = torch.optim.Adam(model.generator.parameters(),
                               betas=(0.5, 0.999),
                               lr=GENERATOR_LEARNING_RATE)
optim_discr = torch.optim.Adam(model.discriminator.parameters(),
                               betas=(0.5, 0.999),
                               lr=DISCRIMINATOR_LEARNING_RATE)

Training

In [14]:
start_time = time.time()    

discr_costs = []
gener_costs = []
images_from_noise = []


for epoch in range(NUM_EPOCHS):
    model = model.train()
    for batch_idx, features in enumerate(train_loader):

        
        # --------------------------
        # Train Discriminator
        # --------------------------        
        
        optim_discr.zero_grad()
        
        real_images = features.to(DEVICE)
        num_real = real_images.size(0)
        real_label_vec = torch.full((num_real,), real_label, device=DEVICE)
        
        # get discriminator loss on real images
        discr_pred_real = model.discriminator_forward(real_images).view(-1)
        real_loss = loss_function(discr_pred_real, real_label_vec)
        #real_loss.backward()
        
        # get discriminator loss on fake images
        random_vec = torch.randn(BATCH_SIZE, LATENT_DIM, 1, 1, device=DEVICE)
        fake_images = model.generator_forward(random_vec)
        fake_label_vec = torch.full((num_real,), fake_label, device=DEVICE)
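        # note: .detach() on fake_images below stops gradients from this
        # discriminator update from flowing back into the generator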
        discr_pred_fake = model.discriminator_forward(fake_images.detach()).view(-1)
        fake_loss = loss_function(discr_pred_fake, fake_label_vec)
        #fake_loss.backward()        

        # combined loss; one backward() call on the average is equivalent
        # (up to the 0.5 scaling) to the two commented-out backward() calls above
        discr_loss = 0.5*(real_loss + fake_loss)
        discr_loss.backward()

        optim_discr.step()        
  
        # --------------------------
        # Train Generator
        # --------------------------      

        optim_gener.zero_grad()        
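        # the generator is updated so that the discriminator assigns the "real"
        # label to its fake images, hence real_label_vec is used as the target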
        
        discr_pred_fake = model.discriminator_forward(fake_images).view(-1)
        gener_loss = loss_function(discr_pred_fake, real_label_vec)
        gener_loss.backward()

        optim_gener.step()
        
        # --------------------------
        # Logging
        # --------------------------
        discr_costs.append(discr_loss.item())
        gener_costs.append(gener_loss.item())
        
        
        ### LOGGING
        if not batch_idx % 100:
            print ('Epoch: %03d/%03d | Batch %03d/%03d | Gen/Dis Loss: %.4f/%.4f' 
                   %(epoch+1, NUM_EPOCHS, batch_idx, 
                     len(train_loader), gener_loss, discr_loss))
            
    ### Save images for evaluation
    with torch.no_grad():
        fake_images = model.generator_forward(fixed_noise).detach().cpu()
        images_from_noise.append(
            vutils.make_grid(fake_images, padding=2, normalize=True))
            
    print('Time elapsed: %.2f min' % ((time.time() - start_time)/60))
    
print('Total Training Time: %.2f min' % ((time.time() - start_time)/60))
Epoch: 001/050 | Batch 000/1582 | Gen/Dis Loss: 5.0977/0.8404
Epoch: 001/050 | Batch 100/1582 | Gen/Dis Loss: 33.1598/0.0001
Epoch: 001/050 | Batch 200/1582 | Gen/Dis Loss: 17.1698/0.5191
Epoch: 001/050 | Batch 300/1582 | Gen/Dis Loss: 4.4210/0.5183
Epoch: 001/050 | Batch 400/1582 | Gen/Dis Loss: 1.7246/0.4012
Epoch: 001/050 | Batch 500/1582 | Gen/Dis Loss: 1.3876/0.4557
Epoch: 001/050 | Batch 600/1582 | Gen/Dis Loss: 2.3218/0.4525
Epoch: 001/050 | Batch 700/1582 | Gen/Dis Loss: 3.3357/0.4172
Epoch: 001/050 | Batch 800/1582 | Gen/Dis Loss: 4.8874/0.4200
Epoch: 001/050 | Batch 900/1582 | Gen/Dis Loss: 3.3706/0.3585
Epoch: 001/050 | Batch 1000/1582 | Gen/Dis Loss: 2.2442/0.5014
Epoch: 001/050 | Batch 1100/1582 | Gen/Dis Loss: 4.8756/0.5768
Epoch: 001/050 | Batch 1200/1582 | Gen/Dis Loss: 2.6479/0.4212
Epoch: 001/050 | Batch 1300/1582 | Gen/Dis Loss: 1.8149/0.4404
Epoch: 001/050 | Batch 1400/1582 | Gen/Dis Loss: 2.4050/0.4352
Epoch: 001/050 | Batch 1500/1582 | Gen/Dis Loss: 3.0832/0.3119
Time elapsed: 2.41 min
Epoch: 002/050 | Batch 000/1582 | Gen/Dis Loss: 2.6546/0.2975
Epoch: 002/050 | Batch 100/1582 | Gen/Dis Loss: 2.6883/0.3474
Epoch: 002/050 | Batch 200/1582 | Gen/Dis Loss: 3.4184/0.3272
Epoch: 002/050 | Batch 300/1582 | Gen/Dis Loss: 4.4773/0.3086
Epoch: 002/050 | Batch 400/1582 | Gen/Dis Loss: 3.5770/0.2953
Epoch: 002/050 | Batch 500/1582 | Gen/Dis Loss: 3.4053/0.2438
Epoch: 002/050 | Batch 600/1582 | Gen/Dis Loss: 5.3833/0.6251
Epoch: 002/050 | Batch 700/1582 | Gen/Dis Loss: 2.9281/0.2487
Epoch: 002/050 | Batch 800/1582 | Gen/Dis Loss: 2.9793/0.2472
Epoch: 002/050 | Batch 900/1582 | Gen/Dis Loss: 4.5050/0.3062
Epoch: 002/050 | Batch 1000/1582 | Gen/Dis Loss: 5.1507/0.4747
Epoch: 002/050 | Batch 1100/1582 | Gen/Dis Loss: 2.1651/0.2566
Epoch: 002/050 | Batch 1200/1582 | Gen/Dis Loss: 2.1005/0.2975
Epoch: 002/050 | Batch 1300/1582 | Gen/Dis Loss: 3.9655/0.4226
Epoch: 002/050 | Batch 1400/1582 | Gen/Dis Loss: 3.0551/0.3189
Epoch: 002/050 | Batch 1500/1582 | Gen/Dis Loss: 1.5689/0.3868
Time elapsed: 4.87 min
Epoch: 003/050 | Batch 000/1582 | Gen/Dis Loss: 4.1844/0.5114
Epoch: 003/050 | Batch 100/1582 | Gen/Dis Loss: 1.7956/0.3303
Epoch: 003/050 | Batch 200/1582 | Gen/Dis Loss: 2.4450/0.3053
Epoch: 003/050 | Batch 300/1582 | Gen/Dis Loss: 2.4365/0.3097
Epoch: 003/050 | Batch 400/1582 | Gen/Dis Loss: 1.4114/0.4715
Epoch: 003/050 | Batch 500/1582 | Gen/Dis Loss: 2.7515/0.3554
Epoch: 003/050 | Batch 600/1582 | Gen/Dis Loss: 4.1454/0.4082
Epoch: 003/050 | Batch 700/1582 | Gen/Dis Loss: 2.5645/0.3946
Epoch: 003/050 | Batch 800/1582 | Gen/Dis Loss: 3.2766/0.3802
Epoch: 003/050 | Batch 900/1582 | Gen/Dis Loss: 1.6295/0.4355
Epoch: 003/050 | Batch 1000/1582 | Gen/Dis Loss: 2.6155/0.3245
Epoch: 003/050 | Batch 1100/1582 | Gen/Dis Loss: 2.1656/0.2866
Epoch: 003/050 | Batch 1200/1582 | Gen/Dis Loss: 2.9378/0.2993
Epoch: 003/050 | Batch 1300/1582 | Gen/Dis Loss: 0.4118/0.8424
Epoch: 003/050 | Batch 1400/1582 | Gen/Dis Loss: 2.6158/0.2673
Epoch: 003/050 | Batch 1500/1582 | Gen/Dis Loss: 0.8343/0.5431
Time elapsed: 7.32 min
Epoch: 004/050 | Batch 000/1582 | Gen/Dis Loss: 1.2788/0.5229
Epoch: 004/050 | Batch 100/1582 | Gen/Dis Loss: 3.8973/0.5083
Epoch: 004/050 | Batch 200/1582 | Gen/Dis Loss: 0.8725/0.5823
Epoch: 004/050 | Batch 300/1582 | Gen/Dis Loss: 1.4901/0.3865
Epoch: 004/050 | Batch 400/1582 | Gen/Dis Loss: 1.5818/0.3457
Epoch: 004/050 | Batch 500/1582 | Gen/Dis Loss: 0.7975/0.7507
Epoch: 004/050 | Batch 600/1582 | Gen/Dis Loss: 2.0922/0.3015
Epoch: 004/050 | Batch 700/1582 | Gen/Dis Loss: 3.5414/0.4045
Epoch: 004/050 | Batch 800/1582 | Gen/Dis Loss: 1.1401/0.4408
Epoch: 004/050 | Batch 900/1582 | Gen/Dis Loss: 3.2030/0.4306
Epoch: 004/050 | Batch 1000/1582 | Gen/Dis Loss: 1.1991/0.4849
Epoch: 004/050 | Batch 1100/1582 | Gen/Dis Loss: 2.5690/0.1914
Epoch: 004/050 | Batch 1200/1582 | Gen/Dis Loss: 3.3579/0.5584
Epoch: 004/050 | Batch 1300/1582 | Gen/Dis Loss: 3.2803/0.4280
Epoch: 004/050 | Batch 1400/1582 | Gen/Dis Loss: 2.0197/0.4506
Epoch: 004/050 | Batch 1500/1582 | Gen/Dis Loss: 2.7495/0.4555
Time elapsed: 9.77 min
Epoch: 005/050 | Batch 000/1582 | Gen/Dis Loss: 0.7921/0.3848
Epoch: 005/050 | Batch 100/1582 | Gen/Dis Loss: 1.5024/0.4068
Epoch: 005/050 | Batch 200/1582 | Gen/Dis Loss: 2.0930/0.3167
Epoch: 005/050 | Batch 300/1582 | Gen/Dis Loss: 2.1022/0.2655
Epoch: 005/050 | Batch 400/1582 | Gen/Dis Loss: 1.7729/0.3567
Epoch: 005/050 | Batch 500/1582 | Gen/Dis Loss: 2.6353/0.4587
Epoch: 005/050 | Batch 600/1582 | Gen/Dis Loss: 2.2618/0.3033
Epoch: 005/050 | Batch 700/1582 | Gen/Dis Loss: 1.8241/0.3409
Epoch: 005/050 | Batch 800/1582 | Gen/Dis Loss: 0.9900/0.6584
Epoch: 005/050 | Batch 900/1582 | Gen/Dis Loss: 1.7008/0.3598
Epoch: 005/050 | Batch 1000/1582 | Gen/Dis Loss: 3.0501/0.3585
Epoch: 005/050 | Batch 1100/1582 | Gen/Dis Loss: 3.2090/0.2969
Epoch: 005/050 | Batch 1200/1582 | Gen/Dis Loss: 3.0979/0.3711
Epoch: 005/050 | Batch 1300/1582 | Gen/Dis Loss: 2.8273/0.2484
Epoch: 005/050 | Batch 1400/1582 | Gen/Dis Loss: 2.5707/0.2596
Epoch: 005/050 | Batch 1500/1582 | Gen/Dis Loss: 1.9507/0.3465
Time elapsed: 12.24 min
Epoch: 006/050 | Batch 000/1582 | Gen/Dis Loss: 2.2717/0.2547
Epoch: 006/050 | Batch 100/1582 | Gen/Dis Loss: 2.6437/0.2221
Epoch: 006/050 | Batch 200/1582 | Gen/Dis Loss: 2.4963/0.3049
Epoch: 006/050 | Batch 300/1582 | Gen/Dis Loss: 2.1344/0.2831
Epoch: 006/050 | Batch 400/1582 | Gen/Dis Loss: 2.5161/0.2974
Epoch: 006/050 | Batch 500/1582 | Gen/Dis Loss: 1.3945/0.3307
Epoch: 006/050 | Batch 600/1582 | Gen/Dis Loss: 2.0203/0.2776
Epoch: 006/050 | Batch 700/1582 | Gen/Dis Loss: 2.0314/0.2551
Epoch: 006/050 | Batch 800/1582 | Gen/Dis Loss: 1.5436/0.3365
Epoch: 006/050 | Batch 900/1582 | Gen/Dis Loss: 0.9169/0.5555
Epoch: 006/050 | Batch 1000/1582 | Gen/Dis Loss: 1.6998/0.3473
Epoch: 006/050 | Batch 1100/1582 | Gen/Dis Loss: 3.5575/0.3191
Epoch: 006/050 | Batch 1200/1582 | Gen/Dis Loss: 1.5284/0.3600
Epoch: 006/050 | Batch 1300/1582 | Gen/Dis Loss: 2.1816/0.2663
Epoch: 006/050 | Batch 1400/1582 | Gen/Dis Loss: 4.2007/0.3633
Epoch: 006/050 | Batch 1500/1582 | Gen/Dis Loss: 2.6890/0.2958
Time elapsed: 14.68 min
Epoch: 007/050 | Batch 000/1582 | Gen/Dis Loss: 2.7584/0.2908
Epoch: 007/050 | Batch 100/1582 | Gen/Dis Loss: 4.1284/0.3650
Epoch: 007/050 | Batch 200/1582 | Gen/Dis Loss: 4.4526/0.4167
Epoch: 007/050 | Batch 300/1582 | Gen/Dis Loss: 3.2206/0.2703
Epoch: 007/050 | Batch 400/1582 | Gen/Dis Loss: 2.4905/0.1797
Epoch: 007/050 | Batch 500/1582 | Gen/Dis Loss: 3.1470/0.4093
Epoch: 007/050 | Batch 600/1582 | Gen/Dis Loss: 2.5501/0.2351
Epoch: 007/050 | Batch 700/1582 | Gen/Dis Loss: 2.8835/0.2463
Epoch: 007/050 | Batch 800/1582 | Gen/Dis Loss: 2.1774/0.3351
Epoch: 007/050 | Batch 900/1582 | Gen/Dis Loss: 2.5601/0.2806
Epoch: 007/050 | Batch 1000/1582 | Gen/Dis Loss: 2.0921/0.2292
Epoch: 007/050 | Batch 1100/1582 | Gen/Dis Loss: 3.7335/1.2924
Epoch: 007/050 | Batch 1200/1582 | Gen/Dis Loss: 3.8992/0.2415
Epoch: 007/050 | Batch 1300/1582 | Gen/Dis Loss: 1.7858/0.2452
Epoch: 007/050 | Batch 1400/1582 | Gen/Dis Loss: 2.1010/0.2632
Epoch: 007/050 | Batch 1500/1582 | Gen/Dis Loss: 2.5515/0.2721
Time elapsed: 17.12 min
Epoch: 008/050 | Batch 000/1582 | Gen/Dis Loss: 2.8788/0.2189
Epoch: 008/050 | Batch 100/1582 | Gen/Dis Loss: 2.6680/0.2433
Epoch: 008/050 | Batch 200/1582 | Gen/Dis Loss: 2.1978/0.2612
Epoch: 008/050 | Batch 300/1582 | Gen/Dis Loss: 1.4075/0.3578
Epoch: 008/050 | Batch 400/1582 | Gen/Dis Loss: 2.4919/0.1883
Epoch: 008/050 | Batch 500/1582 | Gen/Dis Loss: 1.2902/0.2953
Epoch: 008/050 | Batch 600/1582 | Gen/Dis Loss: 0.8833/0.4354
Epoch: 008/050 | Batch 700/1582 | Gen/Dis Loss: 4.7740/0.5978
Epoch: 008/050 | Batch 800/1582 | Gen/Dis Loss: 0.6711/0.6106
Epoch: 008/050 | Batch 900/1582 | Gen/Dis Loss: 2.0333/0.2631
Epoch: 008/050 | Batch 1000/1582 | Gen/Dis Loss: 5.3810/0.4634
Epoch: 008/050 | Batch 1100/1582 | Gen/Dis Loss: 1.9523/0.2972
Epoch: 008/050 | Batch 1200/1582 | Gen/Dis Loss: 2.3688/0.2946
Epoch: 008/050 | Batch 1300/1582 | Gen/Dis Loss: 2.2858/0.2170
Epoch: 008/050 | Batch 1400/1582 | Gen/Dis Loss: 2.8013/0.1691
Epoch: 008/050 | Batch 1500/1582 | Gen/Dis Loss: 4.4253/0.4265
Time elapsed: 19.57 min
Epoch: 009/050 | Batch 000/1582 | Gen/Dis Loss: 3.2850/0.3180
Epoch: 009/050 | Batch 100/1582 | Gen/Dis Loss: 3.9528/0.4771
Epoch: 009/050 | Batch 200/1582 | Gen/Dis Loss: 4.9477/0.3345
Epoch: 009/050 | Batch 300/1582 | Gen/Dis Loss: 6.7088/1.1486
Epoch: 009/050 | Batch 400/1582 | Gen/Dis Loss: 1.6133/0.4584
Epoch: 009/050 | Batch 500/1582 | Gen/Dis Loss: 2.5430/0.1632
Epoch: 009/050 | Batch 600/1582 | Gen/Dis Loss: 4.6278/0.3069
Epoch: 009/050 | Batch 700/1582 | Gen/Dis Loss: 2.0141/0.2103
Epoch: 009/050 | Batch 800/1582 | Gen/Dis Loss: 5.6783/1.0691
Epoch: 009/050 | Batch 900/1582 | Gen/Dis Loss: 2.1624/0.1696
Epoch: 009/050 | Batch 1000/1582 | Gen/Dis Loss: 3.1192/0.1491
Epoch: 009/050 | Batch 1100/1582 | Gen/Dis Loss: 3.0938/0.2885
Epoch: 009/050 | Batch 1200/1582 | Gen/Dis Loss: 2.2178/0.2097
Epoch: 009/050 | Batch 1300/1582 | Gen/Dis Loss: 2.0659/0.1727
Epoch: 009/050 | Batch 1400/1582 | Gen/Dis Loss: 4.7766/0.3881
Epoch: 009/050 | Batch 1500/1582 | Gen/Dis Loss: 2.8205/0.1880
Time elapsed: 22.03 min
Epoch: 010/050 | Batch 000/1582 | Gen/Dis Loss: 2.2797/0.3028
Epoch: 010/050 | Batch 100/1582 | Gen/Dis Loss: 3.3219/0.1475
Epoch: 010/050 | Batch 200/1582 | Gen/Dis Loss: 0.6153/1.5521
Epoch: 010/050 | Batch 300/1582 | Gen/Dis Loss: 2.7312/0.2096
Epoch: 010/050 | Batch 400/1582 | Gen/Dis Loss: 4.5733/0.2595
Epoch: 010/050 | Batch 500/1582 | Gen/Dis Loss: 2.6643/0.2227
Epoch: 010/050 | Batch 600/1582 | Gen/Dis Loss: 6.3498/0.4152
Epoch: 010/050 | Batch 700/1582 | Gen/Dis Loss: 2.0462/0.1690
Epoch: 010/050 | Batch 800/1582 | Gen/Dis Loss: 3.6527/0.2396
Epoch: 010/050 | Batch 900/1582 | Gen/Dis Loss: 2.2485/0.2062
Epoch: 010/050 | Batch 1000/1582 | Gen/Dis Loss: 2.8591/0.1599
Epoch: 010/050 | Batch 1100/1582 | Gen/Dis Loss: 2.0696/0.1849
Epoch: 010/050 | Batch 1200/1582 | Gen/Dis Loss: 3.9865/0.1543
Epoch: 010/050 | Batch 1300/1582 | Gen/Dis Loss: 2.6435/0.3248
Epoch: 010/050 | Batch 1400/1582 | Gen/Dis Loss: 2.7184/0.1879
Epoch: 010/050 | Batch 1500/1582 | Gen/Dis Loss: 2.9266/0.1388
Time elapsed: 24.47 min
Epoch: 011/050 | Batch 000/1582 | Gen/Dis Loss: 4.7872/0.4017
Epoch: 011/050 | Batch 100/1582 | Gen/Dis Loss: 1.9369/0.2213
Epoch: 011/050 | Batch 200/1582 | Gen/Dis Loss: 3.1758/0.1217
Epoch: 011/050 | Batch 300/1582 | Gen/Dis Loss: 4.2206/0.1772
Epoch: 011/050 | Batch 400/1582 | Gen/Dis Loss: 2.6228/0.1631
Epoch: 011/050 | Batch 500/1582 | Gen/Dis Loss: 3.3844/0.1193
Epoch: 011/050 | Batch 600/1582 | Gen/Dis Loss: 2.7206/0.2827
Epoch: 011/050 | Batch 700/1582 | Gen/Dis Loss: 3.2443/0.1687
Epoch: 011/050 | Batch 800/1582 | Gen/Dis Loss: 3.9561/0.1805
Epoch: 011/050 | Batch 900/1582 | Gen/Dis Loss: 3.5614/0.0838
Epoch: 011/050 | Batch 1000/1582 | Gen/Dis Loss: 2.7372/0.1273
Epoch: 011/050 | Batch 1100/1582 | Gen/Dis Loss: 3.0349/0.1355
Epoch: 011/050 | Batch 1200/1582 | Gen/Dis Loss: 1.1350/0.5331
Epoch: 011/050 | Batch 1300/1582 | Gen/Dis Loss: 7.1063/0.5201
Epoch: 011/050 | Batch 1400/1582 | Gen/Dis Loss: 3.5726/0.1416
Epoch: 011/050 | Batch 1500/1582 | Gen/Dis Loss: 4.0311/0.1940
Time elapsed: 26.92 min
Epoch: 012/050 | Batch 000/1582 | Gen/Dis Loss: 3.6484/0.1064
Epoch: 012/050 | Batch 100/1582 | Gen/Dis Loss: 2.7725/0.1917
Epoch: 012/050 | Batch 200/1582 | Gen/Dis Loss: 3.2880/0.1067
Epoch: 012/050 | Batch 300/1582 | Gen/Dis Loss: 3.0767/0.1336
Epoch: 012/050 | Batch 400/1582 | Gen/Dis Loss: 0.0274/1.8923
Epoch: 012/050 | Batch 500/1582 | Gen/Dis Loss: 1.4976/0.1859
Epoch: 012/050 | Batch 600/1582 | Gen/Dis Loss: 4.4071/0.2013
Epoch: 012/050 | Batch 700/1582 | Gen/Dis Loss: 3.2998/0.1394
Epoch: 012/050 | Batch 800/1582 | Gen/Dis Loss: 3.8031/0.1650
Epoch: 012/050 | Batch 900/1582 | Gen/Dis Loss: 1.1044/0.9318
Epoch: 012/050 | Batch 1000/1582 | Gen/Dis Loss: 2.9200/0.1812
Epoch: 012/050 | Batch 1100/1582 | Gen/Dis Loss: 4.7578/0.1687
Epoch: 012/050 | Batch 1200/1582 | Gen/Dis Loss: 2.8928/0.1553
Epoch: 012/050 | Batch 1300/1582 | Gen/Dis Loss: 3.1774/0.1590
Epoch: 012/050 | Batch 1400/1582 | Gen/Dis Loss: 0.7832/0.3013
Epoch: 012/050 | Batch 1500/1582 | Gen/Dis Loss: 1.9825/0.1740
Time elapsed: 29.37 min
Epoch: 013/050 | Batch 000/1582 | Gen/Dis Loss: 2.9352/0.2559
Epoch: 013/050 | Batch 100/1582 | Gen/Dis Loss: 1.9057/0.3148
Epoch: 013/050 | Batch 200/1582 | Gen/Dis Loss: 3.2805/0.1143
Epoch: 013/050 | Batch 300/1582 | Gen/Dis Loss: 4.4632/0.6146
Epoch: 013/050 | Batch 400/1582 | Gen/Dis Loss: 5.3593/0.3872
Epoch: 013/050 | Batch 500/1582 | Gen/Dis Loss: 3.8818/0.1371
Epoch: 013/050 | Batch 600/1582 | Gen/Dis Loss: 4.4362/0.1019
Epoch: 013/050 | Batch 700/1582 | Gen/Dis Loss: 2.6236/0.1394
Epoch: 013/050 | Batch 800/1582 | Gen/Dis Loss: 2.6288/0.1342
Epoch: 013/050 | Batch 900/1582 | Gen/Dis Loss: 2.3635/0.0963
Epoch: 013/050 | Batch 1000/1582 | Gen/Dis Loss: 4.1414/0.1340
Epoch: 013/050 | Batch 1100/1582 | Gen/Dis Loss: 4.4927/0.1197
Epoch: 013/050 | Batch 1200/1582 | Gen/Dis Loss: 3.9764/0.0756
Epoch: 013/050 | Batch 1300/1582 | Gen/Dis Loss: 4.3606/0.3232
Epoch: 013/050 | Batch 1400/1582 | Gen/Dis Loss: 4.0256/0.0860
Epoch: 013/050 | Batch 1500/1582 | Gen/Dis Loss: 4.1148/0.0825
Time elapsed: 31.81 min
Epoch: 014/050 | Batch 000/1582 | Gen/Dis Loss: 1.1043/0.9285
Epoch: 014/050 | Batch 100/1582 | Gen/Dis Loss: 3.6425/0.0926
Epoch: 014/050 | Batch 200/1582 | Gen/Dis Loss: 2.8304/0.1877
Epoch: 014/050 | Batch 300/1582 | Gen/Dis Loss: 3.3682/0.1247
Epoch: 014/050 | Batch 400/1582 | Gen/Dis Loss: 3.9780/0.1473
Epoch: 014/050 | Batch 500/1582 | Gen/Dis Loss: 2.9161/0.1648
Epoch: 014/050 | Batch 600/1582 | Gen/Dis Loss: 4.2146/0.1189
Epoch: 014/050 | Batch 700/1582 | Gen/Dis Loss: 3.7289/0.0632
Epoch: 014/050 | Batch 800/1582 | Gen/Dis Loss: 2.1305/0.1692
Epoch: 014/050 | Batch 900/1582 | Gen/Dis Loss: 3.9756/0.0835
Epoch: 014/050 | Batch 1000/1582 | Gen/Dis Loss: 4.6995/0.0916
Epoch: 014/050 | Batch 1100/1582 | Gen/Dis Loss: 3.5199/0.0720
Epoch: 014/050 | Batch 1200/1582 | Gen/Dis Loss: 4.3629/0.0614
Epoch: 014/050 | Batch 1300/1582 | Gen/Dis Loss: 0.7565/0.3985
Epoch: 014/050 | Batch 1400/1582 | Gen/Dis Loss: 3.1447/0.0867
Epoch: 014/050 | Batch 1500/1582 | Gen/Dis Loss: 2.8430/0.3123
Time elapsed: 34.26 min
Epoch: 015/050 | Batch 000/1582 | Gen/Dis Loss: 4.5839/0.0803
Epoch: 015/050 | Batch 100/1582 | Gen/Dis Loss: 2.7709/0.1099
Epoch: 015/050 | Batch 200/1582 | Gen/Dis Loss: 4.5145/0.0403
Epoch: 015/050 | Batch 300/1582 | Gen/Dis Loss: 1.5461/0.2635
Epoch: 015/050 | Batch 400/1582 | Gen/Dis Loss: 4.1678/0.0990
Epoch: 015/050 | Batch 500/1582 | Gen/Dis Loss: 3.3877/0.0644
Epoch: 015/050 | Batch 600/1582 | Gen/Dis Loss: 4.1457/0.0862
Epoch: 015/050 | Batch 700/1582 | Gen/Dis Loss: 8.8815/0.7453
Epoch: 015/050 | Batch 800/1582 | Gen/Dis Loss: 4.4792/0.1020
Epoch: 015/050 | Batch 900/1582 | Gen/Dis Loss: 4.4948/0.1057
Epoch: 015/050 | Batch 1000/1582 | Gen/Dis Loss: 4.3685/0.1219
Epoch: 015/050 | Batch 1100/1582 | Gen/Dis Loss: 5.2142/0.1152
Epoch: 015/050 | Batch 1200/1582 | Gen/Dis Loss: 4.6588/0.0475
Epoch: 015/050 | Batch 1300/1582 | Gen/Dis Loss: 3.7746/0.0867
Epoch: 015/050 | Batch 1400/1582 | Gen/Dis Loss: 4.3930/0.0603
Epoch: 015/050 | Batch 1500/1582 | Gen/Dis Loss: 4.2740/0.1428
Time elapsed: 36.70 min
Epoch: 016/050 | Batch 000/1582 | Gen/Dis Loss: 3.6693/0.0732
Epoch: 016/050 | Batch 100/1582 | Gen/Dis Loss: 4.4327/0.0692
Epoch: 016/050 | Batch 200/1582 | Gen/Dis Loss: 0.5015/0.3050
Epoch: 016/050 | Batch 300/1582 | Gen/Dis Loss: 5.9430/0.1589
Epoch: 016/050 | Batch 400/1582 | Gen/Dis Loss: 1.2633/0.5085
Epoch: 016/050 | Batch 500/1582 | Gen/Dis Loss: 5.9694/0.2277
Epoch: 016/050 | Batch 600/1582 | Gen/Dis Loss: 3.5210/0.0464
Epoch: 016/050 | Batch 700/1582 | Gen/Dis Loss: 6.2908/0.1089
Epoch: 016/050 | Batch 800/1582 | Gen/Dis Loss: 3.5885/0.0453
Epoch: 016/050 | Batch 900/1582 | Gen/Dis Loss: 5.1825/0.0755
Epoch: 016/050 | Batch 1000/1582 | Gen/Dis Loss: 5.1304/0.1356
Epoch: 016/050 | Batch 1100/1582 | Gen/Dis Loss: 3.9108/0.0659
Epoch: 016/050 | Batch 1200/1582 | Gen/Dis Loss: 3.6563/0.0827
Epoch: 016/050 | Batch 1300/1582 | Gen/Dis Loss: 2.7972/0.0743
Epoch: 016/050 | Batch 1400/1582 | Gen/Dis Loss: 4.9236/0.0453
Epoch: 016/050 | Batch 1500/1582 | Gen/Dis Loss: 0.6455/0.8469
Time elapsed: 39.15 min
Epoch: 017/050 | Batch 000/1582 | Gen/Dis Loss: 2.4154/0.1971
Epoch: 017/050 | Batch 100/1582 | Gen/Dis Loss: 2.6671/0.1268
Epoch: 017/050 | Batch 200/1582 | Gen/Dis Loss: 4.3567/0.0615
Epoch: 017/050 | Batch 300/1582 | Gen/Dis Loss: 5.5731/0.1075
Epoch: 017/050 | Batch 400/1582 | Gen/Dis Loss: 4.1623/0.1074
Epoch: 017/050 | Batch 500/1582 | Gen/Dis Loss: 3.8884/0.1199
Epoch: 017/050 | Batch 600/1582 | Gen/Dis Loss: 3.2655/0.0933
Epoch: 017/050 | Batch 700/1582 | Gen/Dis Loss: 3.9366/0.1402
Epoch: 017/050 | Batch 800/1582 | Gen/Dis Loss: 3.7244/0.0487
Epoch: 017/050 | Batch 900/1582 | Gen/Dis Loss: 5.3038/0.0280
Epoch: 017/050 | Batch 1000/1582 | Gen/Dis Loss: 2.5123/0.1224
Epoch: 017/050 | Batch 1100/1582 | Gen/Dis Loss: 1.0286/0.6975
Epoch: 017/050 | Batch 1200/1582 | Gen/Dis Loss: 3.7874/0.0824
Epoch: 017/050 | Batch 1300/1582 | Gen/Dis Loss: 5.1960/0.0653
Epoch: 017/050 | Batch 1400/1582 | Gen/Dis Loss: 4.7890/0.0441
Epoch: 017/050 | Batch 1500/1582 | Gen/Dis Loss: 6.2927/0.0777
Time elapsed: 41.59 min
Epoch: 018/050 | Batch 000/1582 | Gen/Dis Loss: 3.0539/0.1145
Epoch: 018/050 | Batch 100/1582 | Gen/Dis Loss: 2.2086/0.2534
Epoch: 018/050 | Batch 200/1582 | Gen/Dis Loss: 4.6719/0.0531
Epoch: 018/050 | Batch 300/1582 | Gen/Dis Loss: 4.5728/0.0413
Epoch: 018/050 | Batch 400/1582 | Gen/Dis Loss: 3.7756/0.0534
Epoch: 018/050 | Batch 500/1582 | Gen/Dis Loss: 3.8915/0.1871
Epoch: 018/050 | Batch 600/1582 | Gen/Dis Loss: 3.2599/0.1262
Epoch: 018/050 | Batch 700/1582 | Gen/Dis Loss: 3.2453/0.2047
Epoch: 018/050 | Batch 800/1582 | Gen/Dis Loss: 4.8517/0.0808
Epoch: 018/050 | Batch 900/1582 | Gen/Dis Loss: 4.7489/0.0394
Epoch: 018/050 | Batch 1000/1582 | Gen/Dis Loss: 4.1476/0.0650
Epoch: 018/050 | Batch 1100/1582 | Gen/Dis Loss: 4.7929/0.0404
Epoch: 018/050 | Batch 1200/1582 | Gen/Dis Loss: 3.5185/0.0980
Epoch: 018/050 | Batch 1300/1582 | Gen/Dis Loss: 4.4956/0.0402
Epoch: 018/050 | Batch 1400/1582 | Gen/Dis Loss: 4.4323/0.0590
Epoch: 018/050 | Batch 1500/1582 | Gen/Dis Loss: 3.4464/0.0679
Time elapsed: 44.03 min
Epoch: 019/050 | Batch 000/1582 | Gen/Dis Loss: 4.1268/0.0767
Epoch: 019/050 | Batch 100/1582 | Gen/Dis Loss: 4.6126/0.0393
Epoch: 019/050 | Batch 200/1582 | Gen/Dis Loss: 5.5232/0.0526
Epoch: 019/050 | Batch 300/1582 | Gen/Dis Loss: 0.9774/1.0839
Epoch: 019/050 | Batch 400/1582 | Gen/Dis Loss: 3.4379/0.1391
Epoch: 019/050 | Batch 500/1582 | Gen/Dis Loss: 4.3749/0.1466
Epoch: 019/050 | Batch 600/1582 | Gen/Dis Loss: 4.9807/0.0555
Epoch: 019/050 | Batch 700/1582 | Gen/Dis Loss: 2.2483/0.5447
Epoch: 019/050 | Batch 800/1582 | Gen/Dis Loss: 3.8043/0.0719
Epoch: 019/050 | Batch 900/1582 | Gen/Dis Loss: 5.6987/0.0340
Epoch: 019/050 | Batch 1000/1582 | Gen/Dis Loss: 4.5018/0.1072
Epoch: 019/050 | Batch 1100/1582 | Gen/Dis Loss: 4.7758/0.0345
Epoch: 019/050 | Batch 1200/1582 | Gen/Dis Loss: 4.6211/0.0271
Epoch: 019/050 | Batch 1300/1582 | Gen/Dis Loss: 4.3455/0.0339
Epoch: 019/050 | Batch 1400/1582 | Gen/Dis Loss: 4.5755/0.0291
Epoch: 019/050 | Batch 1500/1582 | Gen/Dis Loss: 4.3068/0.0998
Time elapsed: 46.48 min
Epoch: 020/050 | Batch 000/1582 | Gen/Dis Loss: 4.1104/0.0649
Epoch: 020/050 | Batch 100/1582 | Gen/Dis Loss: 4.0590/0.0962
Epoch: 020/050 | Batch 200/1582 | Gen/Dis Loss: 4.7994/0.0422
Epoch: 020/050 | Batch 300/1582 | Gen/Dis Loss: 2.3004/0.2807
Epoch: 020/050 | Batch 400/1582 | Gen/Dis Loss: 4.8004/0.0420
Epoch: 020/050 | Batch 500/1582 | Gen/Dis Loss: 4.3600/0.0489
Epoch: 020/050 | Batch 600/1582 | Gen/Dis Loss: 5.0067/0.0243
Epoch: 020/050 | Batch 700/1582 | Gen/Dis Loss: 4.4948/0.1586
Epoch: 020/050 | Batch 800/1582 | Gen/Dis Loss: 4.0925/0.0469
Epoch: 020/050 | Batch 900/1582 | Gen/Dis Loss: 4.5186/0.1317
Epoch: 020/050 | Batch 1000/1582 | Gen/Dis Loss: 2.9005/0.2559
Epoch: 020/050 | Batch 1100/1582 | Gen/Dis Loss: 3.0729/0.1302
Epoch: 020/050 | Batch 1200/1582 | Gen/Dis Loss: 4.6651/0.0788
Epoch: 020/050 | Batch 1300/1582 | Gen/Dis Loss: 5.1099/0.0250
Epoch: 020/050 | Batch 1400/1582 | Gen/Dis Loss: 4.1965/0.0343
Epoch: 020/050 | Batch 1500/1582 | Gen/Dis Loss: 4.7827/0.0639
Time elapsed: 48.94 min
Epoch: 021/050 | Batch 000/1582 | Gen/Dis Loss: 2.3703/0.2827
Epoch: 021/050 | Batch 100/1582 | Gen/Dis Loss: 2.3348/0.1936
Epoch: 021/050 | Batch 200/1582 | Gen/Dis Loss: 4.8067/0.0359
Epoch: 021/050 | Batch 300/1582 | Gen/Dis Loss: 5.3462/0.0197
Epoch: 021/050 | Batch 400/1582 | Gen/Dis Loss: 4.3397/0.0335
Epoch: 021/050 | Batch 500/1582 | Gen/Dis Loss: 5.1778/0.0180
Epoch: 021/050 | Batch 600/1582 | Gen/Dis Loss: 2.5495/0.3667
Epoch: 021/050 | Batch 700/1582 | Gen/Dis Loss: 4.8889/0.0454
Epoch: 021/050 | Batch 800/1582 | Gen/Dis Loss: 4.4369/0.0611
Epoch: 021/050 | Batch 900/1582 | Gen/Dis Loss: 7.7600/0.1938
Epoch: 021/050 | Batch 1000/1582 | Gen/Dis Loss: 4.5032/0.0444
Epoch: 021/050 | Batch 1100/1582 | Gen/Dis Loss: 4.5510/0.0389
Epoch: 021/050 | Batch 1200/1582 | Gen/Dis Loss: 4.0619/0.0450
Epoch: 021/050 | Batch 1300/1582 | Gen/Dis Loss: 5.3057/0.0235
Epoch: 021/050 | Batch 1400/1582 | Gen/Dis Loss: 5.0934/0.0192
Epoch: 021/050 | Batch 1500/1582 | Gen/Dis Loss: 5.0066/0.0531
Time elapsed: 51.38 min
Epoch: 022/050 | Batch 000/1582 | Gen/Dis Loss: 5.5475/0.0353
Epoch: 022/050 | Batch 100/1582 | Gen/Dis Loss: 5.5323/0.0167
Epoch: 022/050 | Batch 200/1582 | Gen/Dis Loss: 5.7709/0.0250
Epoch: 022/050 | Batch 300/1582 | Gen/Dis Loss: 4.3576/0.1076
Epoch: 022/050 | Batch 400/1582 | Gen/Dis Loss: 11.4555/3.5824
Epoch: 022/050 | Batch 500/1582 | Gen/Dis Loss: 4.9599/0.1072
Epoch: 022/050 | Batch 600/1582 | Gen/Dis Loss: 4.5137/0.0428
Epoch: 022/050 | Batch 700/1582 | Gen/Dis Loss: 4.9481/0.0423
Epoch: 022/050 | Batch 800/1582 | Gen/Dis Loss: 5.2620/0.0231
Epoch: 022/050 | Batch 900/1582 | Gen/Dis Loss: 3.1689/0.1021
Epoch: 022/050 | Batch 1000/1582 | Gen/Dis Loss: 4.8002/0.0655
Epoch: 022/050 | Batch 1100/1582 | Gen/Dis Loss: 4.8881/0.0403
Epoch: 022/050 | Batch 1200/1582 | Gen/Dis Loss: 4.2580/0.0240
Epoch: 022/050 | Batch 1300/1582 | Gen/Dis Loss: 4.5626/0.0424
Epoch: 022/050 | Batch 1400/1582 | Gen/Dis Loss: 5.5846/0.0414
Epoch: 022/050 | Batch 1500/1582 | Gen/Dis Loss: 3.6022/0.0491
Time elapsed: 53.83 min
Epoch: 023/050 | Batch 000/1582 | Gen/Dis Loss: 5.3894/0.0133
Epoch: 023/050 | Batch 100/1582 | Gen/Dis Loss: 5.0222/0.0286
Epoch: 023/050 | Batch 200/1582 | Gen/Dis Loss: 4.9922/0.0188
Epoch: 023/050 | Batch 300/1582 | Gen/Dis Loss: 6.0378/0.0440
Epoch: 023/050 | Batch 400/1582 | Gen/Dis Loss: 4.6773/0.0619
Epoch: 023/050 | Batch 500/1582 | Gen/Dis Loss: 5.2353/0.0381
Epoch: 023/050 | Batch 600/1582 | Gen/Dis Loss: 3.1907/0.2050
Epoch: 023/050 | Batch 700/1582 | Gen/Dis Loss: 3.1499/0.3823
Epoch: 023/050 | Batch 800/1582 | Gen/Dis Loss: 3.4689/0.0972
Epoch: 023/050 | Batch 900/1582 | Gen/Dis Loss: 4.8332/0.0593
Epoch: 023/050 | Batch 1000/1582 | Gen/Dis Loss: 5.3042/0.0586
Epoch: 023/050 | Batch 1100/1582 | Gen/Dis Loss: 4.5839/0.0267
Epoch: 023/050 | Batch 1200/1582 | Gen/Dis Loss: 2.1627/0.3667
Epoch: 023/050 | Batch 1300/1582 | Gen/Dis Loss: 4.7130/0.1389
Epoch: 023/050 | Batch 1400/1582 | Gen/Dis Loss: 4.9289/0.0466
Epoch: 023/050 | Batch 1500/1582 | Gen/Dis Loss: 3.6571/0.1729
Time elapsed: 56.26 min
Epoch: 024/050 | Batch 000/1582 | Gen/Dis Loss: 1.9833/0.2262
Epoch: 024/050 | Batch 100/1582 | Gen/Dis Loss: 5.2528/0.0278
Epoch: 024/050 | Batch 200/1582 | Gen/Dis Loss: 5.6972/0.0203
Epoch: 024/050 | Batch 300/1582 | Gen/Dis Loss: 5.4641/0.0113
Epoch: 024/050 | Batch 400/1582 | Gen/Dis Loss: 4.8718/0.0968
Epoch: 024/050 | Batch 500/1582 | Gen/Dis Loss: 4.8778/0.0382
Epoch: 024/050 | Batch 600/1582 | Gen/Dis Loss: 5.7183/0.0289
Epoch: 024/050 | Batch 700/1582 | Gen/Dis Loss: 4.6323/0.0350
Epoch: 024/050 | Batch 800/1582 | Gen/Dis Loss: 5.0967/0.1114
Epoch: 024/050 | Batch 900/1582 | Gen/Dis Loss: 0.6931/1.0311
Epoch: 024/050 | Batch 1000/1582 | Gen/Dis Loss: 5.9580/0.0358
Epoch: 024/050 | Batch 1100/1582 | Gen/Dis Loss: 5.2827/0.0338
Epoch: 024/050 | Batch 1200/1582 | Gen/Dis Loss: 4.2460/0.0401
Epoch: 024/050 | Batch 1300/1582 | Gen/Dis Loss: 5.2427/0.0309
Epoch: 024/050 | Batch 1400/1582 | Gen/Dis Loss: 4.6742/0.0858
Epoch: 024/050 | Batch 1500/1582 | Gen/Dis Loss: 4.8516/0.1018
Time elapsed: 58.71 min
Epoch: 025/050 | Batch 000/1582 | Gen/Dis Loss: 6.0134/0.0175
Epoch: 025/050 | Batch 100/1582 | Gen/Dis Loss: 5.1292/0.0333
Epoch: 025/050 | Batch 200/1582 | Gen/Dis Loss: 5.2934/0.0199
Epoch: 025/050 | Batch 300/1582 | Gen/Dis Loss: 1.7002/0.4246
Epoch: 025/050 | Batch 400/1582 | Gen/Dis Loss: 4.0120/0.0462
Epoch: 025/050 | Batch 500/1582 | Gen/Dis Loss: 5.1448/0.0167
Epoch: 025/050 | Batch 600/1582 | Gen/Dis Loss: 5.3695/0.0194
Epoch: 025/050 | Batch 700/1582 | Gen/Dis Loss: 5.4673/0.1447
Epoch: 025/050 | Batch 800/1582 | Gen/Dis Loss: 3.3572/0.4049
Epoch: 025/050 | Batch 900/1582 | Gen/Dis Loss: 5.1075/0.1243
Epoch: 025/050 | Batch 1000/1582 | Gen/Dis Loss: 5.4846/0.0711
Epoch: 025/050 | Batch 1100/1582 | Gen/Dis Loss: 5.1564/0.0220
Epoch: 025/050 | Batch 1200/1582 | Gen/Dis Loss: 4.8444/0.1825
Epoch: 025/050 | Batch 1300/1582 | Gen/Dis Loss: 6.6722/0.0429
Epoch: 025/050 | Batch 1400/1582 | Gen/Dis Loss: 5.0143/0.0337
Epoch: 025/050 | Batch 1500/1582 | Gen/Dis Loss: 2.5430/0.1417
Time elapsed: 61.16 min
Epoch: 026/050 | Batch 000/1582 | Gen/Dis Loss: 4.6523/0.0839
Epoch: 026/050 | Batch 100/1582 | Gen/Dis Loss: 5.3957/0.0178
Epoch: 026/050 | Batch 200/1582 | Gen/Dis Loss: 5.2479/0.0201
Epoch: 026/050 | Batch 300/1582 | Gen/Dis Loss: 5.2473/0.0266
Epoch: 026/050 | Batch 400/1582 | Gen/Dis Loss: 6.1226/0.0410
Epoch: 026/050 | Batch 500/1582 | Gen/Dis Loss: 5.8653/0.0878
Epoch: 026/050 | Batch 600/1582 | Gen/Dis Loss: 6.0915/0.0189
Epoch: 026/050 | Batch 700/1582 | Gen/Dis Loss: 5.8512/0.0471
Epoch: 026/050 | Batch 800/1582 | Gen/Dis Loss: 0.9134/1.7851
Epoch: 026/050 | Batch 900/1582 | Gen/Dis Loss: 5.1830/0.0241
Epoch: 026/050 | Batch 1000/1582 | Gen/Dis Loss: 5.9265/0.0391
Epoch: 026/050 | Batch 1100/1582 | Gen/Dis Loss: 4.7412/0.1742
Epoch: 026/050 | Batch 1200/1582 | Gen/Dis Loss: 6.1503/0.0148
Epoch: 026/050 | Batch 1300/1582 | Gen/Dis Loss: 6.4479/0.0097
Epoch: 026/050 | Batch 1400/1582 | Gen/Dis Loss: 5.6819/0.0237
Epoch: 026/050 | Batch 1500/1582 | Gen/Dis Loss: 3.8954/0.1257
Time elapsed: 63.60 min
Epoch: 027/050 | Batch 000/1582 | Gen/Dis Loss: 6.7038/0.0148
Epoch: 027/050 | Batch 100/1582 | Gen/Dis Loss: 5.0880/0.0228
Epoch: 027/050 | Batch 200/1582 | Gen/Dis Loss: 4.8876/0.0536
Epoch: 027/050 | Batch 300/1582 | Gen/Dis Loss: 1.0262/0.6385
Epoch: 027/050 | Batch 400/1582 | Gen/Dis Loss: 5.3424/0.0216
Epoch: 027/050 | Batch 500/1582 | Gen/Dis Loss: 1.7988/0.2676
Epoch: 027/050 | Batch 600/1582 | Gen/Dis Loss: 5.1085/0.0333
Epoch: 027/050 | Batch 700/1582 | Gen/Dis Loss: 5.2286/0.0207
Epoch: 027/050 | Batch 800/1582 | Gen/Dis Loss: 6.0626/0.0098
Epoch: 027/050 | Batch 900/1582 | Gen/Dis Loss: 5.3478/0.0228
Epoch: 027/050 | Batch 1000/1582 | Gen/Dis Loss: 6.3787/0.0149
Epoch: 027/050 | Batch 1100/1582 | Gen/Dis Loss: 4.8965/0.0571
Epoch: 027/050 | Batch 1200/1582 | Gen/Dis Loss: 4.7348/0.1258
Epoch: 027/050 | Batch 1300/1582 | Gen/Dis Loss: 4.8652/0.0435
Epoch: 027/050 | Batch 1400/1582 | Gen/Dis Loss: 5.8077/0.0237
Epoch: 027/050 | Batch 1500/1582 | Gen/Dis Loss: 1.6066/0.2670
Time elapsed: 66.06 min
Epoch: 028/050 | Batch 000/1582 | Gen/Dis Loss: 5.2867/0.0192
Epoch: 028/050 | Batch 100/1582 | Gen/Dis Loss: 6.6656/0.0111
Epoch: 028/050 | Batch 200/1582 | Gen/Dis Loss: 6.0257/0.0174
Epoch: 028/050 | Batch 300/1582 | Gen/Dis Loss: 5.8953/0.0112
Epoch: 028/050 | Batch 400/1582 | Gen/Dis Loss: 5.7179/0.0191
Epoch: 028/050 | Batch 500/1582 | Gen/Dis Loss: 4.4501/0.2658
Epoch: 028/050 | Batch 600/1582 | Gen/Dis Loss: 7.3415/0.0120
Epoch: 028/050 | Batch 700/1582 | Gen/Dis Loss: 5.6033/0.0638
Epoch: 028/050 | Batch 800/1582 | Gen/Dis Loss: 4.8410/0.0327
Epoch: 028/050 | Batch 900/1582 | Gen/Dis Loss: 5.4292/0.0183
Epoch: 028/050 | Batch 1000/1582 | Gen/Dis Loss: 4.9808/0.0188
Epoch: 028/050 | Batch 1100/1582 | Gen/Dis Loss: 6.0032/0.0160
Epoch: 028/050 | Batch 1200/1582 | Gen/Dis Loss: 7.7352/0.1132
Epoch: 028/050 | Batch 1300/1582 | Gen/Dis Loss: 3.6423/0.1839
Epoch: 028/050 | Batch 1400/1582 | Gen/Dis Loss: 6.1671/0.0433
Epoch: 028/050 | Batch 1500/1582 | Gen/Dis Loss: 5.7572/0.0158
Time elapsed: 68.50 min
Epoch: 029/050 | Batch 000/1582 | Gen/Dis Loss: 1.8663/0.1841
Epoch: 029/050 | Batch 100/1582 | Gen/Dis Loss: 6.6766/0.0204
Epoch: 029/050 | Batch 200/1582 | Gen/Dis Loss: 5.4332/0.0192
Epoch: 029/050 | Batch 300/1582 | Gen/Dis Loss: 2.3104/0.4955
Epoch: 029/050 | Batch 400/1582 | Gen/Dis Loss: 7.2644/0.1680
Epoch: 029/050 | Batch 500/1582 | Gen/Dis Loss: 5.4169/0.0388
Epoch: 029/050 | Batch 600/1582 | Gen/Dis Loss: 6.1557/0.0111
Epoch: 029/050 | Batch 700/1582 | Gen/Dis Loss: 0.0364/0.6043
Epoch: 029/050 | Batch 800/1582 | Gen/Dis Loss: 5.1974/0.0945
Epoch: 029/050 | Batch 900/1582 | Gen/Dis Loss: 5.0138/0.0327
Epoch: 029/050 | Batch 1000/1582 | Gen/Dis Loss: 2.7399/0.2710
Epoch: 029/050 | Batch 1100/1582 | Gen/Dis Loss: 8.8934/0.1888
Epoch: 029/050 | Batch 1200/1582 | Gen/Dis Loss: 5.2410/0.0308
Epoch: 029/050 | Batch 1300/1582 | Gen/Dis Loss: 4.3340/0.0482
Epoch: 029/050 | Batch 1400/1582 | Gen/Dis Loss: 4.0296/0.0369
Epoch: 029/050 | Batch 1500/1582 | Gen/Dis Loss: 5.3129/0.0334
Time elapsed: 70.95 min
Epoch: 030/050 | Batch 000/1582 | Gen/Dis Loss: 6.1805/0.0135
Epoch: 030/050 | Batch 100/1582 | Gen/Dis Loss: 5.5793/0.0205
Epoch: 030/050 | Batch 200/1582 | Gen/Dis Loss: 4.0595/0.1394
Epoch: 030/050 | Batch 300/1582 | Gen/Dis Loss: 5.2896/0.0574
Epoch: 030/050 | Batch 400/1582 | Gen/Dis Loss: 6.0667/0.0165
Epoch: 030/050 | Batch 500/1582 | Gen/Dis Loss: 6.7009/0.0363
Epoch: 030/050 | Batch 600/1582 | Gen/Dis Loss: 5.6573/0.0137
Epoch: 030/050 | Batch 700/1582 | Gen/Dis Loss: 4.5025/0.0385
Epoch: 030/050 | Batch 800/1582 | Gen/Dis Loss: 2.8865/0.2654
Epoch: 030/050 | Batch 900/1582 | Gen/Dis Loss: 3.4805/0.0438
Epoch: 030/050 | Batch 1000/1582 | Gen/Dis Loss: 5.5590/0.0336
Epoch: 030/050 | Batch 1100/1582 | Gen/Dis Loss: 5.0269/0.0567
Epoch: 030/050 | Batch 1200/1582 | Gen/Dis Loss: 5.8014/0.0117
Epoch: 030/050 | Batch 1300/1582 | Gen/Dis Loss: 5.9323/0.0078
Epoch: 030/050 | Batch 1400/1582 | Gen/Dis Loss: 6.8289/0.0253
Epoch: 030/050 | Batch 1500/1582 | Gen/Dis Loss: 6.0800/0.0136
Time elapsed: 73.40 min
Epoch: 031/050 | Batch 000/1582 | Gen/Dis Loss: 6.0010/0.0072
Epoch: 031/050 | Batch 100/1582 | Gen/Dis Loss: 5.9420/0.0108
Epoch: 031/050 | Batch 200/1582 | Gen/Dis Loss: 4.3021/0.0474
Epoch: 031/050 | Batch 300/1582 | Gen/Dis Loss: 6.3305/0.0174
Epoch: 031/050 | Batch 400/1582 | Gen/Dis Loss: 5.2712/0.0952
Epoch: 031/050 | Batch 500/1582 | Gen/Dis Loss: 4.0086/0.1346
Epoch: 031/050 | Batch 600/1582 | Gen/Dis Loss: 4.6696/0.0593
Epoch: 031/050 | Batch 700/1582 | Gen/Dis Loss: 5.3198/0.0269
Epoch: 031/050 | Batch 800/1582 | Gen/Dis Loss: 5.5776/0.0330
Epoch: 031/050 | Batch 900/1582 | Gen/Dis Loss: 6.4492/0.1061
Epoch: 031/050 | Batch 1000/1582 | Gen/Dis Loss: 5.1193/0.0196
Epoch: 031/050 | Batch 1100/1582 | Gen/Dis Loss: 3.3789/0.1705
Epoch: 031/050 | Batch 1200/1582 | Gen/Dis Loss: 6.0538/0.0168
Epoch: 031/050 | Batch 1300/1582 | Gen/Dis Loss: 6.0455/0.0106
Epoch: 031/050 | Batch 1400/1582 | Gen/Dis Loss: 3.7110/0.1136
Epoch: 031/050 | Batch 1500/1582 | Gen/Dis Loss: 6.9264/0.0350
Time elapsed: 75.84 min
Epoch: 032/050 | Batch 000/1582 | Gen/Dis Loss: 0.8650/0.5780
Epoch: 032/050 | Batch 100/1582 | Gen/Dis Loss: 5.7331/0.0266
Epoch: 032/050 | Batch 200/1582 | Gen/Dis Loss: 5.7171/0.0309
Epoch: 032/050 | Batch 300/1582 | Gen/Dis Loss: 5.2264/0.0214
Epoch: 032/050 | Batch 400/1582 | Gen/Dis Loss: 6.2854/0.0228
Epoch: 032/050 | Batch 500/1582 | Gen/Dis Loss: 6.3046/0.0131
Epoch: 032/050 | Batch 600/1582 | Gen/Dis Loss: 5.9976/0.0110
Epoch: 032/050 | Batch 700/1582 | Gen/Dis Loss: 3.6490/0.1606
Epoch: 032/050 | Batch 800/1582 | Gen/Dis Loss: 4.5044/0.0372
Epoch: 032/050 | Batch 900/1582 | Gen/Dis Loss: 5.8151/0.0309
Epoch: 032/050 | Batch 1000/1582 | Gen/Dis Loss: 6.1373/0.0478
Epoch: 032/050 | Batch 1100/1582 | Gen/Dis Loss: 6.3456/0.0205
Epoch: 032/050 | Batch 1200/1582 | Gen/Dis Loss: 6.0097/0.0080
Epoch: 032/050 | Batch 1300/1582 | Gen/Dis Loss: 6.3631/0.0216
Epoch: 032/050 | Batch 1400/1582 | Gen/Dis Loss: 6.3053/0.0099
Epoch: 032/050 | Batch 1500/1582 | Gen/Dis Loss: 13.2360/1.3253
Time elapsed: 78.29 min
Epoch: 033/050 | Batch 000/1582 | Gen/Dis Loss: 5.6050/0.0153
Epoch: 033/050 | Batch 100/1582 | Gen/Dis Loss: 7.0881/0.0359
Epoch: 033/050 | Batch 200/1582 | Gen/Dis Loss: 3.7740/0.1014
Epoch: 033/050 | Batch 300/1582 | Gen/Dis Loss: 5.9804/0.2011
Epoch: 033/050 | Batch 400/1582 | Gen/Dis Loss: 4.4562/0.2763
Epoch: 033/050 | Batch 500/1582 | Gen/Dis Loss: 6.2417/0.0271
Epoch: 033/050 | Batch 600/1582 | Gen/Dis Loss: 6.2067/0.0263
Epoch: 033/050 | Batch 700/1582 | Gen/Dis Loss: 2.2468/0.1897
Epoch: 033/050 | Batch 800/1582 | Gen/Dis Loss: 5.6435/0.0300
Epoch: 033/050 | Batch 900/1582 | Gen/Dis Loss: 5.7382/0.0205
Epoch: 033/050 | Batch 1000/1582 | Gen/Dis Loss: 6.3039/0.0145
Epoch: 033/050 | Batch 1100/1582 | Gen/Dis Loss: 5.9467/0.0111
Epoch: 033/050 | Batch 1200/1582 | Gen/Dis Loss: 5.5233/0.0173
Epoch: 033/050 | Batch 1300/1582 | Gen/Dis Loss: 5.4247/0.0138
Epoch: 033/050 | Batch 1400/1582 | Gen/Dis Loss: 6.3367/0.0099
Epoch: 033/050 | Batch 1500/1582 | Gen/Dis Loss: 6.0143/0.0129
Time elapsed: 80.74 min
Epoch: 034/050 | Batch 000/1582 | Gen/Dis Loss: 5.0762/0.0932
Epoch: 034/050 | Batch 100/1582 | Gen/Dis Loss: 6.6954/0.0144
Epoch: 034/050 | Batch 200/1582 | Gen/Dis Loss: 7.1748/0.0691
Epoch: 034/050 | Batch 300/1582 | Gen/Dis Loss: 7.5106/0.0745
Epoch: 034/050 | Batch 400/1582 | Gen/Dis Loss: 4.3071/0.0781
Epoch: 034/050 | Batch 500/1582 | Gen/Dis Loss: 6.0902/0.0210
Epoch: 034/050 | Batch 600/1582 | Gen/Dis Loss: 5.1856/0.1309
Epoch: 034/050 | Batch 700/1582 | Gen/Dis Loss: 4.9114/0.0273
Epoch: 034/050 | Batch 800/1582 | Gen/Dis Loss: 5.8739/0.0096
Epoch: 034/050 | Batch 900/1582 | Gen/Dis Loss: 6.5104/0.0103
Epoch: 034/050 | Batch 1000/1582 | Gen/Dis Loss: 7.2543/0.0364
Epoch: 034/050 | Batch 1100/1582 | Gen/Dis Loss: 6.4442/0.0144
Epoch: 034/050 | Batch 1200/1582 | Gen/Dis Loss: 6.3123/0.0110
Epoch: 034/050 | Batch 1300/1582 | Gen/Dis Loss: 6.8601/0.0088
Epoch: 034/050 | Batch 1400/1582 | Gen/Dis Loss: 3.9485/0.1375
Epoch: 034/050 | Batch 1500/1582 | Gen/Dis Loss: 4.1981/0.0856
Time elapsed: 83.19 min
Epoch: 035/050 | Batch 000/1582 | Gen/Dis Loss: 5.9095/0.0405
Epoch: 035/050 | Batch 100/1582 | Gen/Dis Loss: 6.0299/0.0153
Epoch: 035/050 | Batch 200/1582 | Gen/Dis Loss: 3.5901/0.1777
Epoch: 035/050 | Batch 300/1582 | Gen/Dis Loss: 2.7746/0.2926
Epoch: 035/050 | Batch 400/1582 | Gen/Dis Loss: 7.0399/0.0331
Epoch: 035/050 | Batch 500/1582 | Gen/Dis Loss: 5.9150/0.0234
Epoch: 035/050 | Batch 600/1582 | Gen/Dis Loss: 4.5815/0.0254
Epoch: 035/050 | Batch 700/1582 | Gen/Dis Loss: 6.2178/0.0178
Epoch: 035/050 | Batch 800/1582 | Gen/Dis Loss: 6.1513/0.0108
Epoch: 035/050 | Batch 900/1582 | Gen/Dis Loss: 5.0363/0.2522
Epoch: 035/050 | Batch 1000/1582 | Gen/Dis Loss: 2.4822/0.2586
Epoch: 035/050 | Batch 1100/1582 | Gen/Dis Loss: 6.0145/0.0119
Epoch: 035/050 | Batch 1200/1582 | Gen/Dis Loss: 6.7920/0.0423
Epoch: 035/050 | Batch 1300/1582 | Gen/Dis Loss: 4.6653/0.0857
Epoch: 035/050 | Batch 1400/1582 | Gen/Dis Loss: 5.4006/0.0390
Epoch: 035/050 | Batch 1500/1582 | Gen/Dis Loss: 4.4711/0.0544
Time elapsed: 85.64 min
Epoch: 036/050 | Batch 000/1582 | Gen/Dis Loss: 4.5065/0.0846
Epoch: 036/050 | Batch 100/1582 | Gen/Dis Loss: 5.7673/0.0291
Epoch: 036/050 | Batch 200/1582 | Gen/Dis Loss: 5.4545/0.0120
Epoch: 036/050 | Batch 300/1582 | Gen/Dis Loss: 5.4334/0.0180
Epoch: 036/050 | Batch 400/1582 | Gen/Dis Loss: 5.2544/0.0343
Epoch: 036/050 | Batch 500/1582 | Gen/Dis Loss: 6.4659/0.0044
Epoch: 036/050 | Batch 600/1582 | Gen/Dis Loss: 6.2612/0.0157
Epoch: 036/050 | Batch 700/1582 | Gen/Dis Loss: 6.9814/0.0239
Epoch: 036/050 | Batch 800/1582 | Gen/Dis Loss: 6.5577/0.0089
Epoch: 036/050 | Batch 900/1582 | Gen/Dis Loss: 7.2245/0.0095
Epoch: 036/050 | Batch 1000/1582 | Gen/Dis Loss: 4.9265/0.1003
Epoch: 036/050 | Batch 1100/1582 | Gen/Dis Loss: 5.7495/0.0181
Epoch: 036/050 | Batch 1200/1582 | Gen/Dis Loss: 3.4029/0.1435
Epoch: 036/050 | Batch 1300/1582 | Gen/Dis Loss: 5.5204/0.0301
Epoch: 036/050 | Batch 1400/1582 | Gen/Dis Loss: 5.9284/0.0340
Epoch: 036/050 | Batch 1500/1582 | Gen/Dis Loss: 5.7977/0.0567
Time elapsed: 88.09 min
Epoch: 037/050 | Batch 000/1582 | Gen/Dis Loss: 4.4827/0.0983
Epoch: 037/050 | Batch 100/1582 | Gen/Dis Loss: 4.8477/0.0490
Epoch: 037/050 | Batch 200/1582 | Gen/Dis Loss: 6.4570/0.0138
Epoch: 037/050 | Batch 300/1582 | Gen/Dis Loss: 7.1059/0.0195
Epoch: 037/050 | Batch 400/1582 | Gen/Dis Loss: 6.1184/0.0091
Epoch: 037/050 | Batch 500/1582 | Gen/Dis Loss: 7.4339/0.0127
Epoch: 037/050 | Batch 600/1582 | Gen/Dis Loss: 5.9236/0.0141
Epoch: 037/050 | Batch 700/1582 | Gen/Dis Loss: 6.5181/0.0060
Epoch: 037/050 | Batch 800/1582 | Gen/Dis Loss: 6.9213/0.0104
Epoch: 037/050 | Batch 900/1582 | Gen/Dis Loss: 0.2208/2.0700
Epoch: 037/050 | Batch 1000/1582 | Gen/Dis Loss: 4.6870/0.0295
Epoch: 037/050 | Batch 1100/1582 | Gen/Dis Loss: 4.4261/0.0984
Epoch: 037/050 | Batch 1200/1582 | Gen/Dis Loss: 6.7367/0.0825
Epoch: 037/050 | Batch 1300/1582 | Gen/Dis Loss: 7.5870/0.0572
Epoch: 037/050 | Batch 1400/1582 | Gen/Dis Loss: 5.0440/0.0302
Epoch: 037/050 | Batch 1500/1582 | Gen/Dis Loss: 5.9439/0.0706
Time elapsed: 90.54 min
Epoch: 038/050 | Batch 000/1582 | Gen/Dis Loss: 6.3843/0.0183
Epoch: 038/050 | Batch 100/1582 | Gen/Dis Loss: 6.2960/0.0209
Epoch: 038/050 | Batch 200/1582 | Gen/Dis Loss: 5.6029/0.0176
Epoch: 038/050 | Batch 300/1582 | Gen/Dis Loss: 6.6377/0.0055
Epoch: 038/050 | Batch 400/1582 | Gen/Dis Loss: 6.5151/0.0177
Epoch: 038/050 | Batch 500/1582 | Gen/Dis Loss: 7.2962/0.0473
Epoch: 038/050 | Batch 600/1582 | Gen/Dis Loss: 1.7832/0.3439
Epoch: 038/050 | Batch 700/1582 | Gen/Dis Loss: 5.2883/0.0382
Epoch: 038/050 | Batch 800/1582 | Gen/Dis Loss: 5.0870/0.0360
Epoch: 038/050 | Batch 900/1582 | Gen/Dis Loss: 5.9886/0.0269
Epoch: 038/050 | Batch 1000/1582 | Gen/Dis Loss: 7.8415/0.0205
Epoch: 038/050 | Batch 1100/1582 | Gen/Dis Loss: 5.5182/0.0333
Epoch: 038/050 | Batch 1200/1582 | Gen/Dis Loss: 5.1848/0.0410
Epoch: 038/050 | Batch 1300/1582 | Gen/Dis Loss: 9.5418/0.1805
Epoch: 038/050 | Batch 1400/1582 | Gen/Dis Loss: 7.3106/0.0154
Epoch: 038/050 | Batch 1500/1582 | Gen/Dis Loss: 2.7334/0.1831
Time elapsed: 92.98 min
Epoch: 039/050 | Batch 000/1582 | Gen/Dis Loss: 5.3955/0.0416
Epoch: 039/050 | Batch 100/1582 | Gen/Dis Loss: 6.2430/0.0226
Epoch: 039/050 | Batch 200/1582 | Gen/Dis Loss: 6.0323/0.0089
Epoch: 039/050 | Batch 300/1582 | Gen/Dis Loss: 6.8587/0.0131
Epoch: 039/050 | Batch 400/1582 | Gen/Dis Loss: 5.8336/0.0094
Epoch: 039/050 | Batch 500/1582 | Gen/Dis Loss: 6.6405/0.0053
Epoch: 039/050 | Batch 600/1582 | Gen/Dis Loss: 7.3175/0.0251
Epoch: 039/050 | Batch 700/1582 | Gen/Dis Loss: 3.9456/0.3295
Epoch: 039/050 | Batch 800/1582 | Gen/Dis Loss: 5.7911/0.0465
Epoch: 039/050 | Batch 900/1582 | Gen/Dis Loss: 6.2065/0.0844
Epoch: 039/050 | Batch 1000/1582 | Gen/Dis Loss: 6.2923/0.0058
Epoch: 039/050 | Batch 1100/1582 | Gen/Dis Loss: 6.2191/0.0234
Epoch: 039/050 | Batch 1200/1582 | Gen/Dis Loss: 6.9780/0.0189
Epoch: 039/050 | Batch 1300/1582 | Gen/Dis Loss: 4.3855/0.1290
Epoch: 039/050 | Batch 1400/1582 | Gen/Dis Loss: 7.2525/0.0183
Epoch: 039/050 | Batch 1500/1582 | Gen/Dis Loss: 6.6123/0.0266
Time elapsed: 95.44 min
Epoch: 040/050 | Batch 000/1582 | Gen/Dis Loss: 5.7872/0.0321
Epoch: 040/050 | Batch 100/1582 | Gen/Dis Loss: 7.1974/0.0317
Epoch: 040/050 | Batch 200/1582 | Gen/Dis Loss: 6.4420/0.0095
Epoch: 040/050 | Batch 300/1582 | Gen/Dis Loss: 3.5331/0.1466
Epoch: 040/050 | Batch 400/1582 | Gen/Dis Loss: 5.5956/0.0155
Epoch: 040/050 | Batch 500/1582 | Gen/Dis Loss: 3.3049/0.1202
Epoch: 040/050 | Batch 600/1582 | Gen/Dis Loss: 5.7216/0.0181
Epoch: 040/050 | Batch 700/1582 | Gen/Dis Loss: 7.4099/0.0143
Epoch: 040/050 | Batch 800/1582 | Gen/Dis Loss: 2.6694/0.2057
Epoch: 040/050 | Batch 900/1582 | Gen/Dis Loss: 4.5261/0.0766
Epoch: 040/050 | Batch 1000/1582 | Gen/Dis Loss: 6.5555/0.0195
Epoch: 040/050 | Batch 1100/1582 | Gen/Dis Loss: 5.2850/0.0262
Epoch: 040/050 | Batch 1200/1582 | Gen/Dis Loss: 6.7921/0.0076
Epoch: 040/050 | Batch 1300/1582 | Gen/Dis Loss: 6.4736/0.0105
Epoch: 040/050 | Batch 1400/1582 | Gen/Dis Loss: 5.6351/0.0213
Epoch: 040/050 | Batch 1500/1582 | Gen/Dis Loss: 6.4784/0.0170
Time elapsed: 97.87 min
Epoch: 041/050 | Batch 000/1582 | Gen/Dis Loss: 6.4215/0.0115
Epoch: 041/050 | Batch 100/1582 | Gen/Dis Loss: 6.7611/0.0042
Epoch: 041/050 | Batch 200/1582 | Gen/Dis Loss: 5.4773/0.0155
Epoch: 041/050 | Batch 300/1582 | Gen/Dis Loss: 5.3107/0.1071
Epoch: 041/050 | Batch 400/1582 | Gen/Dis Loss: 4.9638/0.0478
Epoch: 041/050 | Batch 500/1582 | Gen/Dis Loss: 7.3731/0.0138
Epoch: 041/050 | Batch 600/1582 | Gen/Dis Loss: 7.5458/0.1103
Epoch: 041/050 | Batch 700/1582 | Gen/Dis Loss: 6.3871/0.0233
Epoch: 041/050 | Batch 800/1582 | Gen/Dis Loss: 5.8460/0.1494
Epoch: 041/050 | Batch 900/1582 | Gen/Dis Loss: 5.0105/0.0748
Epoch: 041/050 | Batch 1000/1582 | Gen/Dis Loss: 6.4955/0.0148
Epoch: 041/050 | Batch 1100/1582 | Gen/Dis Loss: 6.2726/0.0132
Epoch: 041/050 | Batch 1200/1582 | Gen/Dis Loss: 4.0179/0.0891
Epoch: 041/050 | Batch 1300/1582 | Gen/Dis Loss: 5.8027/0.0608
Epoch: 041/050 | Batch 1400/1582 | Gen/Dis Loss: 5.7328/0.0221
Epoch: 041/050 | Batch 1500/1582 | Gen/Dis Loss: 4.2242/0.1641
Time elapsed: 100.32 min
Epoch: 042/050 | Batch 000/1582 | Gen/Dis Loss: 5.8151/0.0141
Epoch: 042/050 | Batch 100/1582 | Gen/Dis Loss: 2.3969/0.1349
Epoch: 042/050 | Batch 200/1582 | Gen/Dis Loss: 5.1363/0.0346
Epoch: 042/050 | Batch 300/1582 | Gen/Dis Loss: 5.8609/0.0198
Epoch: 042/050 | Batch 400/1582 | Gen/Dis Loss: 5.7023/0.0123
Epoch: 042/050 | Batch 500/1582 | Gen/Dis Loss: 6.4460/0.0071
Epoch: 042/050 | Batch 600/1582 | Gen/Dis Loss: 6.9499/0.0158
Epoch: 042/050 | Batch 700/1582 | Gen/Dis Loss: 6.7048/0.0073
Epoch: 042/050 | Batch 800/1582 | Gen/Dis Loss: 5.9881/0.0084
Epoch: 042/050 | Batch 900/1582 | Gen/Dis Loss: 6.0304/0.0076
Epoch: 042/050 | Batch 1000/1582 | Gen/Dis Loss: 4.3048/0.1250
Epoch: 042/050 | Batch 1100/1582 | Gen/Dis Loss: 4.6075/0.1132
Epoch: 042/050 | Batch 1200/1582 | Gen/Dis Loss: 7.4926/0.0241
Epoch: 042/050 | Batch 1300/1582 | Gen/Dis Loss: 6.1920/0.0412
Epoch: 042/050 | Batch 1400/1582 | Gen/Dis Loss: 6.5437/0.0145
Epoch: 042/050 | Batch 1500/1582 | Gen/Dis Loss: 7.2353/0.0137
Time elapsed: 102.76 min
Epoch: 043/050 | Batch 000/1582 | Gen/Dis Loss: 3.5005/0.0511
Epoch: 043/050 | Batch 100/1582 | Gen/Dis Loss: 4.3836/0.0771
Epoch: 043/050 | Batch 200/1582 | Gen/Dis Loss: 6.1994/0.0256
Epoch: 043/050 | Batch 300/1582 | Gen/Dis Loss: 5.4481/0.0163
Epoch: 043/050 | Batch 400/1582 | Gen/Dis Loss: 7.1767/0.0200
Epoch: 043/050 | Batch 500/1582 | Gen/Dis Loss: 5.1836/0.0626
Epoch: 043/050 | Batch 600/1582 | Gen/Dis Loss: 6.1915/0.0164
Epoch: 043/050 | Batch 700/1582 | Gen/Dis Loss: 6.9366/0.0181
Epoch: 043/050 | Batch 800/1582 | Gen/Dis Loss: 6.0139/0.0213
Epoch: 043/050 | Batch 900/1582 | Gen/Dis Loss: 2.5311/0.2242
Epoch: 043/050 | Batch 1000/1582 | Gen/Dis Loss: 6.5078/0.0169
Epoch: 043/050 | Batch 1100/1582 | Gen/Dis Loss: 5.6311/0.0420
Epoch: 043/050 | Batch 1200/1582 | Gen/Dis Loss: 6.3984/0.0073
Epoch: 043/050 | Batch 1300/1582 | Gen/Dis Loss: 5.2437/0.0415
Epoch: 043/050 | Batch 1400/1582 | Gen/Dis Loss: 6.6940/0.5485
Epoch: 043/050 | Batch 1500/1582 | Gen/Dis Loss: 4.2813/0.2127
Time elapsed: 105.20 min
Epoch: 044/050 | Batch 000/1582 | Gen/Dis Loss: 4.8672/0.0373
Epoch: 044/050 | Batch 100/1582 | Gen/Dis Loss: 5.9904/0.0276
Epoch: 044/050 | Batch 200/1582 | Gen/Dis Loss: 6.3593/0.0118
Epoch: 044/050 | Batch 300/1582 | Gen/Dis Loss: 6.0497/0.0180
Epoch: 044/050 | Batch 400/1582 | Gen/Dis Loss: 6.4911/0.0059
Epoch: 044/050 | Batch 500/1582 | Gen/Dis Loss: 6.3909/0.0105
Epoch: 044/050 | Batch 600/1582 | Gen/Dis Loss: 7.1409/0.0116
Epoch: 044/050 | Batch 700/1582 | Gen/Dis Loss: 7.2195/0.0567
Epoch: 044/050 | Batch 800/1582 | Gen/Dis Loss: 5.6320/0.0358
Epoch: 044/050 | Batch 900/1582 | Gen/Dis Loss: 5.8120/0.0121
Epoch: 044/050 | Batch 1000/1582 | Gen/Dis Loss: 7.0075/0.0098
Epoch: 044/050 | Batch 1100/1582 | Gen/Dis Loss: 6.7760/0.0157
Epoch: 044/050 | Batch 1200/1582 | Gen/Dis Loss: 5.6765/0.0265
Epoch: 044/050 | Batch 1300/1582 | Gen/Dis Loss: 4.1986/0.1981
Epoch: 044/050 | Batch 1400/1582 | Gen/Dis Loss: 8.1159/0.0402
Epoch: 044/050 | Batch 1500/1582 | Gen/Dis Loss: 8.0663/0.0097
Time elapsed: 107.65 min
Epoch: 045/050 | Batch 000/1582 | Gen/Dis Loss: 5.8820/0.0474
Epoch: 045/050 | Batch 100/1582 | Gen/Dis Loss: 7.1296/0.0083
Epoch: 045/050 | Batch 200/1582 | Gen/Dis Loss: 7.2789/0.0077
Epoch: 045/050 | Batch 300/1582 | Gen/Dis Loss: 7.7761/0.0178
Epoch: 045/050 | Batch 400/1582 | Gen/Dis Loss: 7.0312/0.0159
Epoch: 045/050 | Batch 500/1582 | Gen/Dis Loss: 5.1721/0.0495
Epoch: 045/050 | Batch 600/1582 | Gen/Dis Loss: 7.1418/0.0110
Epoch: 045/050 | Batch 700/1582 | Gen/Dis Loss: 6.1093/0.0292
Epoch: 045/050 | Batch 800/1582 | Gen/Dis Loss: 4.6533/0.0825
Epoch: 045/050 | Batch 900/1582 | Gen/Dis Loss: 6.7539/0.0068
Epoch: 045/050 | Batch 1000/1582 | Gen/Dis Loss: 5.1218/0.0452
Epoch: 045/050 | Batch 1100/1582 | Gen/Dis Loss: 8.1557/0.0148
Epoch: 045/050 | Batch 1200/1582 | Gen/Dis Loss: 5.6853/0.0395
Epoch: 045/050 | Batch 1300/1582 | Gen/Dis Loss: 5.0322/0.0359
Epoch: 045/050 | Batch 1400/1582 | Gen/Dis Loss: 3.2788/0.1475
Epoch: 045/050 | Batch 1500/1582 | Gen/Dis Loss: 6.1631/0.0163
Time elapsed: 110.10 min
Epoch: 046/050 | Batch 000/1582 | Gen/Dis Loss: 4.0055/0.1013
Epoch: 046/050 | Batch 100/1582 | Gen/Dis Loss: 6.4872/0.0138
Epoch: 046/050 | Batch 200/1582 | Gen/Dis Loss: 5.8815/0.0137
Epoch: 046/050 | Batch 300/1582 | Gen/Dis Loss: 6.6904/0.0207
Epoch: 046/050 | Batch 400/1582 | Gen/Dis Loss: 6.1291/0.0131
Epoch: 046/050 | Batch 500/1582 | Gen/Dis Loss: 7.1279/0.0071
Epoch: 046/050 | Batch 600/1582 | Gen/Dis Loss: 5.1027/0.0598
Epoch: 046/050 | Batch 700/1582 | Gen/Dis Loss: 5.5336/0.0215
Epoch: 046/050 | Batch 800/1582 | Gen/Dis Loss: 0.2062/0.1871
Epoch: 046/050 | Batch 900/1582 | Gen/Dis Loss: 6.2193/0.0484
Epoch: 046/050 | Batch 1000/1582 | Gen/Dis Loss: 2.9839/0.2445
Epoch: 046/050 | Batch 1100/1582 | Gen/Dis Loss: 5.8552/0.0195
Epoch: 046/050 | Batch 1200/1582 | Gen/Dis Loss: 3.6481/0.0782
Epoch: 046/050 | Batch 1300/1582 | Gen/Dis Loss: 5.6223/0.0311
Epoch: 046/050 | Batch 1400/1582 | Gen/Dis Loss: 5.8726/0.0085
Epoch: 046/050 | Batch 1500/1582 | Gen/Dis Loss: 7.4263/0.0241
Time elapsed: 112.55 min
Epoch: 047/050 | Batch 000/1582 | Gen/Dis Loss: 8.8129/0.0464
Epoch: 047/050 | Batch 100/1582 | Gen/Dis Loss: 6.0695/0.0155
Epoch: 047/050 | Batch 200/1582 | Gen/Dis Loss: 6.1803/0.0225
Epoch: 047/050 | Batch 300/1582 | Gen/Dis Loss: 5.0649/0.0681
Epoch: 047/050 | Batch 400/1582 | Gen/Dis Loss: 6.7654/0.0088
Epoch: 047/050 | Batch 500/1582 | Gen/Dis Loss: 3.7208/0.3658
Epoch: 047/050 | Batch 600/1582 | Gen/Dis Loss: 8.3161/0.0697
Epoch: 047/050 | Batch 700/1582 | Gen/Dis Loss: 7.5471/0.0305
Epoch: 047/050 | Batch 800/1582 | Gen/Dis Loss: 6.1861/0.0115
Epoch: 047/050 | Batch 900/1582 | Gen/Dis Loss: 5.9707/0.0321
Epoch: 047/050 | Batch 1000/1582 | Gen/Dis Loss: 2.7419/0.1369
Epoch: 047/050 | Batch 1100/1582 | Gen/Dis Loss: 6.0946/0.0108
Epoch: 047/050 | Batch 1200/1582 | Gen/Dis Loss: 5.2454/0.0413
Epoch: 047/050 | Batch 1300/1582 | Gen/Dis Loss: 4.6562/0.0529
Epoch: 047/050 | Batch 1400/1582 | Gen/Dis Loss: 5.8049/0.0123
Epoch: 047/050 | Batch 1500/1582 | Gen/Dis Loss: 6.0520/0.0157
Time elapsed: 114.99 min
Epoch: 048/050 | Batch 000/1582 | Gen/Dis Loss: 4.9745/0.2188
Epoch: 048/050 | Batch 100/1582 | Gen/Dis Loss: 5.3769/0.0305
Epoch: 048/050 | Batch 200/1582 | Gen/Dis Loss: 8.2124/0.0940
Epoch: 048/050 | Batch 300/1582 | Gen/Dis Loss: 6.0560/0.1842
Epoch: 048/050 | Batch 400/1582 | Gen/Dis Loss: 6.3704/0.0153
Epoch: 048/050 | Batch 500/1582 | Gen/Dis Loss: 6.5244/0.0064
Epoch: 048/050 | Batch 600/1582 | Gen/Dis Loss: 6.4808/0.0041
Epoch: 048/050 | Batch 700/1582 | Gen/Dis Loss: 5.0122/0.0871
Epoch: 048/050 | Batch 800/1582 | Gen/Dis Loss: 6.1364/0.0170
Epoch: 048/050 | Batch 900/1582 | Gen/Dis Loss: 5.2422/0.0389
Epoch: 048/050 | Batch 1000/1582 | Gen/Dis Loss: 7.1602/0.0166
Epoch: 048/050 | Batch 1100/1582 | Gen/Dis Loss: 11.5645/1.8945
Epoch: 048/050 | Batch 1200/1582 | Gen/Dis Loss: 5.6504/0.0170
Epoch: 048/050 | Batch 1300/1582 | Gen/Dis Loss: 5.9460/0.0232
Epoch: 048/050 | Batch 1400/1582 | Gen/Dis Loss: 6.1778/0.0118
Epoch: 048/050 | Batch 1500/1582 | Gen/Dis Loss: 6.6627/0.0058
Time elapsed: 117.44 min
Epoch: 049/050 | Batch 000/1582 | Gen/Dis Loss: 6.6040/0.0105
Epoch: 049/050 | Batch 100/1582 | Gen/Dis Loss: 6.7983/0.0093
Epoch: 049/050 | Batch 200/1582 | Gen/Dis Loss: 6.3327/0.0133
Epoch: 049/050 | Batch 300/1582 | Gen/Dis Loss: 7.1165/0.0038
Epoch: 049/050 | Batch 400/1582 | Gen/Dis Loss: 7.4795/0.0045
Epoch: 049/050 | Batch 500/1582 | Gen/Dis Loss: 7.4721/0.0470
Epoch: 049/050 | Batch 600/1582 | Gen/Dis Loss: 7.4102/0.0043
Epoch: 049/050 | Batch 700/1582 | Gen/Dis Loss: 5.2226/0.0359
Epoch: 049/050 | Batch 800/1582 | Gen/Dis Loss: 8.4378/0.0125
Epoch: 049/050 | Batch 900/1582 | Gen/Dis Loss: 6.5429/0.0093
Epoch: 049/050 | Batch 1000/1582 | Gen/Dis Loss: 4.6924/0.1834
Epoch: 049/050 | Batch 1100/1582 | Gen/Dis Loss: 4.9575/0.1008
Epoch: 049/050 | Batch 1200/1582 | Gen/Dis Loss: 6.3343/0.1021
Epoch: 049/050 | Batch 1300/1582 | Gen/Dis Loss: 6.7143/0.0139
Epoch: 049/050 | Batch 1400/1582 | Gen/Dis Loss: 5.4402/0.0673
Epoch: 049/050 | Batch 1500/1582 | Gen/Dis Loss: 6.6034/0.0168
Time elapsed: 119.88 min
Epoch: 050/050 | Batch 000/1582 | Gen/Dis Loss: 6.2819/0.0147
Epoch: 050/050 | Batch 100/1582 | Gen/Dis Loss: 6.5657/0.0147
Epoch: 050/050 | Batch 200/1582 | Gen/Dis Loss: 6.8208/0.0146
Epoch: 050/050 | Batch 300/1582 | Gen/Dis Loss: 9.2332/0.0466
Epoch: 050/050 | Batch 400/1582 | Gen/Dis Loss: 4.3463/0.0593
Epoch: 050/050 | Batch 500/1582 | Gen/Dis Loss: 6.1466/0.0311
Epoch: 050/050 | Batch 600/1582 | Gen/Dis Loss: 6.5712/0.0120
Epoch: 050/050 | Batch 700/1582 | Gen/Dis Loss: 3.2893/0.0847
Epoch: 050/050 | Batch 800/1582 | Gen/Dis Loss: 5.8033/0.0270
Epoch: 050/050 | Batch 900/1582 | Gen/Dis Loss: 2.7801/0.4407
Epoch: 050/050 | Batch 1000/1582 | Gen/Dis Loss: 6.0085/0.0193
Epoch: 050/050 | Batch 1100/1582 | Gen/Dis Loss: 6.0947/0.0362
Epoch: 050/050 | Batch 1200/1582 | Gen/Dis Loss: 4.6609/0.0700
Epoch: 050/050 | Batch 1300/1582 | Gen/Dis Loss: 6.9875/0.0234
Epoch: 050/050 | Batch 1400/1582 | Gen/Dis Loss: 5.4024/0.0244
Epoch: 050/050 | Batch 1500/1582 | Gen/Dis Loss: 5.1951/0.1657
Time elapsed: 122.33 min
Total Training Time: 122.33 min
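Before moving on to evaluation, it can be convenient to save the trained weights so the plots below can be reproduced later without retraining. This is a minimal sketch, assuming the generator and discriminator submodules are reachable as model.generator and model.discriminator (hypothetical names -- adapt them to however the networks are actually stored above):

import torch

# save the learned parameters of both networks
# (hypothetical attribute names `model.generator` / `model.discriminator`)
torch.save(model.generator.state_dict(), 'dcgan_celeba_generator.pt')
torch.save(model.discriminator.state_dict(), 'dcgan_celeba_discriminator.pt')

# to restore later, re-instantiate the architecture first, then:
# model.generator.load_state_dict(
#     torch.load('dcgan_celeba_generator.pt', map_location=DEVICE))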

Evaluation

In [15]:
%matplotlib inline
import matplotlib.pyplot as plt
In [16]:
ax1 = plt.subplot(1, 1, 1)
ax1.plot(range(len(gener_costs)), gener_costs, label='Generator loss')
ax1.plot(range(len(discr_costs)), discr_costs, label='Discriminator loss')
ax1.set_xlabel('Iterations')
ax1.set_ylabel('Loss')
ax1.legend()

###################
# Set second x-axis (epochs)
ax2 = ax1.twiny()
newlabel = list(range(NUM_EPOCHS+1))
iter_per_epoch = len(train_loader)
newpos = [e*iter_per_epoch for e in newlabel]

ax2.set_xticks(newpos[::10])
ax2.set_xticklabels(newlabel[::10])

ax2.xaxis.set_ticks_position('bottom')
ax2.xaxis.set_label_position('bottom')
ax2.spines['bottom'].set_position(('outward', 45))
ax2.set_xlabel('Epochs')
ax2.set_xlim(ax1.get_xlim())
###################

plt.show()
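The raw per-iteration GAN losses are quite noisy; a running mean makes the trend easier to read. The sketch below only uses the gener_costs and discr_costs lists already plotted above (the window of 100 iterations is an arbitrary choice):

import numpy as np
import matplotlib.pyplot as plt

def running_mean(x, window=100):
    # simple moving average via a cumulative sum
    x = np.asarray(x, dtype=np.float64)
    cumsum = np.cumsum(np.insert(x, 0, 0.0))
    return (cumsum[window:] - cumsum[:-window]) / window

plt.plot(running_mean(gener_costs), label='Generator loss (smoothed)')
plt.plot(running_mean(discr_costs), label='Discriminator loss (smoothed)')
plt.xlabel('Iterations')
plt.ylabel('Loss')
plt.legend()
plt.show()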
In [21]:
##########################
### VISUALIZATION
##########################

# display the generator samples recorded during training, every 5th epoch
for i in range(0, NUM_EPOCHS, 5):
    plt.imshow(np.transpose(images_from_noise[i], (1, 2, 0)))
    plt.show()
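The loop above replays generator samples that were recorded during training. To draw entirely new faces from the trained generator, fresh latent vectors can be sampled instead. A minimal sketch, assuming the generator module is available as model.generator (hypothetical name) and accepts a (batch, LATENT_DIM, 1, 1) noise tensor, as is typical for DCGAN generators:

import torch
import numpy as np
import matplotlib.pyplot as plt
import torchvision.utils as vutils

with torch.no_grad():
    # sample 64 new latent vectors and run them through the generator
    z = torch.randn(64, LATENT_DIM, 1, 1, device=DEVICE)
    fake = model.generator(z).detach().cpu()

# undo the [-1, 1] normalization for display and arrange the samples in a grid
grid = vutils.make_grid(fake, nrow=8, padding=2, normalize=True)
plt.figure(figsize=(8, 8))
plt.axis('off')
plt.imshow(np.transpose(grid.numpy(), (1, 2, 0)))
plt.show()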