Deep Learning Models -- A collection of various deep learning architectures, models, and tips for TensorFlow and PyTorch in Jupyter Notebooks.

In [1]:
%load_ext watermark
%watermark -a 'Sebastian Raschka' -v -p torch
Sebastian Raschka 

CPython 3.6.8
IPython 7.2.0

torch 1.1.0
Model Zoo -- Ordinal Regression CNN -- Niu et al. 2016

  • Runs on CPU or GPU (if available)

Implementation of the ordinal regression method by Niu et al. [1], applied to predicting age from face images in the AFAD (Asian Face Age Dataset) [1] using a simple AlexNet-style convolutional neural network.

Note that in order to reduce training time, only a subset of AFAD (AFAD-Lite) is being used.
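
In this approach, age estimation with K ordinal ranks is cast as K-1 binary classification subtasks, where task k asks whether the true rank exceeds k. The short sketch below (assuming K = 22 ranks, as used later in this notebook) shows how a single ordinal label is encoded into its binary "levels" vector; the dataset class further down uses exactly this encoding.

import torch

NUM_CLASSES = 22  # 22 age groups (ages 18-39), matching the settings used below

def label_to_levels(label, num_classes=NUM_CLASSES):
    # Task k is 1 if the true rank exceeds k, else 0 (Niu et al. 2016)
    levels = [1]*label + [0]*(num_classes - 1 - label)
    return torch.tensor(levels, dtype=torch.float32)

print(label_to_levels(3))  # three 1s followed by eighteen 0s -> 21 binary tasks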

Imports

In [2]:
import time
import numpy as np
import pandas as pd
import os

import torch.nn as nn
import torch.nn.functional as F
import torch

from torch.utils.data import Dataset
from torch.utils.data import DataLoader
from torchvision import transforms
from PIL import Image



if torch.cuda.is_available():
    torch.backends.cudnn.deterministic = True

Downloading the Dataset

In [3]:
!git clone https://github.com/afad-dataset/tarball-lite.git
Cloning into 'tarball-lite'...
remote: Enumerating objects: 37, done.
remote: Total 37 (delta 0), reused 0 (delta 0), pack-reused 37
Unpacking objects: 100% (37/37), done.
Checking out files: 100% (30/30), done.
In [4]:
!cat tarball-lite/AFAD-Lite.tar.xz* > tarball-lite/AFAD-Lite.tar.xz
In [5]:
!tar xf tarball-lite/AFAD-Lite.tar.xz
In [6]:
rootDir = 'AFAD-Lite'

# Collect the relative paths of all .jpg face images under AFAD-Lite
files = [os.path.relpath(os.path.join(dirpath, file), rootDir)
         for (dirpath, dirnames, filenames) in os.walk(rootDir)
         for file in filenames if file.endswith('.jpg')]
In [7]:
len(files)
Out[7]:
59344
In [8]:
d = {}

d['age'] = []
d['gender'] = []
d['file'] = []
d['path'] = []

for f in files:
    # AFAD directory layout: <age>/<gender id>/<filename>, where 111 = male and 112 = female
    age, gender, fname = f.split('/')
    if gender == '111':
        gender = 'male'
    else:
        gender = 'female'
        
    d['age'].append(age)
    d['gender'].append(gender)
    d['file'].append(fname)
    d['path'].append(f)
In [9]:
df = pd.DataFrame.from_dict(d)
df.head()
Out[9]:
age gender file path
0 39 female 474596-0.jpg 39/112/474596-0.jpg
1 39 female 397477-0.jpg 39/112/397477-0.jpg
2 39 female 576466-0.jpg 39/112/576466-0.jpg
3 39 female 399405-0.jpg 39/112/399405-0.jpg
4 39 female 410524-0.jpg 39/112/410524-0.jpg
In [10]:
df['age'].min()
Out[10]:
'18'
In [11]:
# Shift ages so that the youngest age (18) maps to label 0
df['age'] = df['age'].values.astype(int) - 18
In [12]:
np.random.seed(123)
# Random ~80/20 train/test split
msk = np.random.rand(len(df)) < 0.8
df_train = df[msk]
df_test = df[~msk]
In [13]:
df_train.set_index('file', inplace=True)
df_train.to_csv('training_set_lite.csv')
In [14]:
df_test.set_index('file', inplace=True)
df_test.to_csv('test_set_lite.csv')
In [15]:
num_ages = np.unique(df['age'].values).shape[0]
print(num_ages)
22
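
The 22 distinct ages in AFAD-Lite are contiguous (18 through 39), so after the shift above the labels run from 0 to 21; NUM_CLASSES is set to 22 accordingly in the next section. A quick sanity check (a sketch, assuming contiguous ages and the df built above):

# Sketch: shifted labels should span 0 .. num_ages-1, i.e. ages 18 .. 39
assert df['age'].min() == 0
assert df['age'].max() == num_ages - 1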

Settings

In [16]:
##########################
### SETTINGS
##########################

# Device
DEVICE = torch.device("cuda:1" if torch.cuda.is_available() else "cpu")

NUM_WORKERS = 8

NUM_CLASSES = 22
BATCH_SIZE = 512
NUM_EPOCHS = 150
LEARNING_RATE = 0.0005
RANDOM_SEED = 123

TRAIN_CSV_PATH = 'training_set_lite.csv'
TEST_CSV_PATH = 'test_set_lite.csv'
IMAGE_PATH = 'AFAD-Lite'

Dataset Loaders

In [17]:
class AFADDatasetAge(Dataset):
    """Custom Dataset for loading AFAD face images"""

    def __init__(self, csv_path, img_dir, transform=None):

        df = pd.read_csv(csv_path, index_col=0)
        self.img_dir = img_dir
        self.csv_path = csv_path
        self.img_paths = df['path']
        self.y = df['age'].values
        self.transform = transform

    def __getitem__(self, index):
        img = Image.open(os.path.join(self.img_dir,
                                      self.img_paths[index]))

        if self.transform is not None:
            img = self.transform(img)

        label = self.y[index]
        # Encode the ordinal label as NUM_CLASSES-1 binary 'levels' (Niu et al. 2016)
        levels = [1]*label + [0]*(NUM_CLASSES - 1 - label)
        levels = torch.tensor(levels, dtype=torch.float32)

        return img, label, levels

    def __len__(self):
        return self.y.shape[0]


custom_transform = transforms.Compose([transforms.Resize((128, 128)),
                                       transforms.RandomCrop((120, 120)),
                                       transforms.ToTensor()])

train_dataset = AFADDatasetAge(csv_path=TRAIN_CSV_PATH,
                               img_dir=IMAGE_PATH,
                               transform=custom_transform)


custom_transform2 = transforms.Compose([transforms.Resize((128, 128)),
                                        transforms.CenterCrop((120, 120)),
                                        transforms.ToTensor()])

test_dataset = AFADDatasetAge(csv_path=TEST_CSV_PATH,
                              img_dir=IMAGE_PATH,
                              transform=custom_transform2)


train_loader = DataLoader(dataset=train_dataset,
                          batch_size=BATCH_SIZE,
                          shuffle=True,
                          num_workers=NUM_WORKERS)

test_loader = DataLoader(dataset=test_dataset,
                         batch_size=BATCH_SIZE,
                         shuffle=False,
                         num_workers=NUM_WORKERS)
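
Before training, it can be useful to draw one batch from the training loader and confirm the tensor shapes: after the random crop each image is 3x120x120, and every example carries NUM_CLASSES - 1 = 21 binary levels. A minimal sketch, assuming the loaders defined above:

images, labels, levels = next(iter(train_loader))
print(images.shape)  # torch.Size([512, 3, 120, 120]) with BATCH_SIZE = 512
print(labels.shape)  # torch.Size([512])
print(levels.shape)  # torch.Size([512, 21]) -> one column per binary task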

Model

In [18]:
##########################
# MODEL
##########################

class AlexNet(nn.Module):

    def __init__(self, num_classes):
        super(AlexNet, self).__init__()
        self.num_classes = num_classes
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(64, 192, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(192, 384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        self.avgpool = nn.AdaptiveAvgPool2d((6, 6))
        self.classifier = nn.Sequential(
            nn.Dropout(0.5),
            nn.Linear(256 * 6 * 6, 4096),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5),
            nn.Linear(4096, 4096),
            nn.ReLU(inplace=True),
        )

        # Output layer: one pair of logits for each of the (num_classes - 1) binary tasks
        self.fc = nn.Linear(4096, (self.num_classes-1)*2)

    def forward(self, x):
        x = self.features(x)
        x = self.avgpool(x)
        x = x.view(x.size(0), 256 * 6 * 6)
        x = self.classifier(x)

        logits = self.fc(x)
        logits = logits.view(-1, (self.num_classes-1), 2)
        # probas[:, k] = predicted probability that the age rank exceeds task k
        probas = F.softmax(logits, dim=2)[:, :, 1]
        return logits, probas
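
The network emits one pair of logits per binary task, and probas holds the per-task probabilities; the predicted age label is obtained by counting how many tasks exceed the 0.5 threshold (this is also how compute_mae_and_mse works below). A shape check on untrained weights, as a sketch:

net = AlexNet(num_classes=NUM_CLASSES)
dummy_batch = torch.randn(4, 3, 120, 120)    # 4 fake RGB crops of the training size
logits, probas = net(dummy_batch)
print(logits.shape)                           # torch.Size([4, 21, 2])
print(probas.shape)                           # torch.Size([4, 21])
print(torch.sum(probas > 0.5, dim=1).shape)   # torch.Size([4]) -> predicted labels
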
In [19]:
###########################################
# Initialize Cost, Model, and Optimizer
###########################################

def cost_fn(logits, levels):
    """Ordinal regression loss (Niu et al. 2016): binary cross-entropy summed
    over the num_classes-1 tasks and averaged over the batch."""
    val = (-torch.sum((F.log_softmax(logits, dim=2)[:, :, 1]*levels
                      + F.log_softmax(logits, dim=2)[:, :, 0]*(1-levels)), dim=1))
    return torch.mean(val)


torch.manual_seed(RANDOM_SEED)
torch.cuda.manual_seed(RANDOM_SEED)
model = AlexNet(NUM_CLASSES)

model.to(DEVICE)
optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)
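
cost_fn is the sum over the 21 binary tasks of a standard binary cross-entropy (computed from the softmax over each task's two logits), averaged over the batch. The small sketch below verifies this equivalence numerically on random inputs; it assumes cost_fn and NUM_CLASSES from above:

fake_logits = torch.randn(8, NUM_CLASSES - 1, 2)
fake_levels = torch.randint(0, 2, (8, NUM_CLASSES - 1)).float()

task_probas = F.softmax(fake_logits, dim=2)[:, :, 1]           # P(task k = 1)
bce = F.binary_cross_entropy(task_probas, fake_levels,
                             reduction='none').sum(dim=1).mean()
print(torch.allclose(cost_fn(fake_logits, fake_levels), bce))  # expected: True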

Training

In [20]:
def compute_mae_and_mse(model, data_loader, device):
    mae, mse, num_examples = 0, 0, 0
    for i, (features, targets, levels) in enumerate(data_loader):

        features = features.to(device)
        targets = targets.to(device)

        logits, probas = model(features)
        # Predicted label = number of binary tasks with probability > 0.5
        predict_levels = probas > 0.5
        predicted_labels = torch.sum(predict_levels, dim=1)
        num_examples += targets.size(0)
        mae += torch.sum(torch.abs(predicted_labels - targets))
        mse += torch.sum((predicted_labels - targets)**2)
    mae = mae.float() / num_examples
    mse = mse.float() / num_examples
    return mae, mse


start_time = time.time()
for epoch in range(NUM_EPOCHS):

    model.train()
    for batch_idx, (features, targets, levels) in enumerate(train_loader):

        features = features.to(DEVICE)
        targets = targets.to(DEVICE)
        levels = levels.to(DEVICE)

        # FORWARD AND BACK PROP
        logits, probas = model(features)
        cost = cost_fn(logits, levels)
        optimizer.zero_grad()

        cost.backward()

        # UPDATE MODEL PARAMETERS
        optimizer.step()

        # LOGGING
        if not batch_idx % 150:
            s = ('Epoch: %03d/%03d | Batch %04d/%04d | Cost: %.4f'
                 % (epoch+1, NUM_EPOCHS, batch_idx,
                     len(train_dataset)//BATCH_SIZE, cost))
            print(s)

    s = 'Time elapsed: %.2f min' % ((time.time() - start_time)/60)
    print(s)
Epoch: 001/150 | Batch 0000/0092 | Cost: 14.5346
Time elapsed: 0.47 min
Epoch: 002/150 | Batch 0000/0092 | Cost: 10.1990
Time elapsed: 0.93 min
Epoch: 003/150 | Batch 0000/0092 | Cost: 9.3695
Time elapsed: 1.41 min
Epoch: 004/150 | Batch 0000/0092 | Cost: 8.7663
Time elapsed: 1.88 min
Epoch: 005/150 | Batch 0000/0092 | Cost: 8.5821
Time elapsed: 2.36 min
Epoch: 006/150 | Batch 0000/0092 | Cost: 8.8506
Time elapsed: 2.80 min
Epoch: 007/150 | Batch 0000/0092 | Cost: 8.1522
Time elapsed: 3.25 min
Epoch: 008/150 | Batch 0000/0092 | Cost: 9.3045
Time elapsed: 3.74 min
Epoch: 009/150 | Batch 0000/0092 | Cost: 8.2951
Time elapsed: 4.21 min
Epoch: 010/150 | Batch 0000/0092 | Cost: 8.1094
Time elapsed: 4.67 min
Epoch: 011/150 | Batch 0000/0092 | Cost: 8.3870
Time elapsed: 5.15 min
Epoch: 012/150 | Batch 0000/0092 | Cost: 8.1078
Time elapsed: 5.62 min
Epoch: 013/150 | Batch 0000/0092 | Cost: 7.6846
Time elapsed: 6.11 min
Epoch: 014/150 | Batch 0000/0092 | Cost: 7.7015
Time elapsed: 6.56 min
Epoch: 015/150 | Batch 0000/0092 | Cost: 7.5693
Time elapsed: 7.02 min
Epoch: 016/150 | Batch 0000/0092 | Cost: 7.9339
Time elapsed: 7.51 min
Epoch: 017/150 | Batch 0000/0092 | Cost: 7.7916
Time elapsed: 7.97 min
Epoch: 018/150 | Batch 0000/0092 | Cost: 7.1257
Time elapsed: 8.46 min
Epoch: 019/150 | Batch 0000/0092 | Cost: 6.8873
Time elapsed: 8.89 min
Epoch: 020/150 | Batch 0000/0092 | Cost: 7.1145
Time elapsed: 9.36 min
Epoch: 021/150 | Batch 0000/0092 | Cost: 7.3858
Time elapsed: 9.78 min
Epoch: 022/150 | Batch 0000/0092 | Cost: 7.5949
Time elapsed: 10.25 min
Epoch: 023/150 | Batch 0000/0092 | Cost: 6.9696
Time elapsed: 10.68 min
Epoch: 024/150 | Batch 0000/0092 | Cost: 7.1473
Time elapsed: 11.15 min
Epoch: 025/150 | Batch 0000/0092 | Cost: 6.6147
Time elapsed: 11.61 min
Epoch: 026/150 | Batch 0000/0092 | Cost: 6.3201
Time elapsed: 12.12 min
Epoch: 027/150 | Batch 0000/0092 | Cost: 6.5789
Time elapsed: 12.53 min
Epoch: 028/150 | Batch 0000/0092 | Cost: 6.4532
Time elapsed: 12.99 min
Epoch: 029/150 | Batch 0000/0092 | Cost: 6.5590
Time elapsed: 13.44 min
Epoch: 030/150 | Batch 0000/0092 | Cost: 6.4204
Time elapsed: 13.91 min
Epoch: 031/150 | Batch 0000/0092 | Cost: 6.1485
Time elapsed: 14.37 min
Epoch: 032/150 | Batch 0000/0092 | Cost: 6.6225
Time elapsed: 14.87 min
Epoch: 033/150 | Batch 0000/0092 | Cost: 5.9400
Time elapsed: 15.32 min
Epoch: 034/150 | Batch 0000/0092 | Cost: 6.0526
Time elapsed: 15.82 min
Epoch: 035/150 | Batch 0000/0092 | Cost: 5.9100
Time elapsed: 16.28 min
Epoch: 036/150 | Batch 0000/0092 | Cost: 5.8563
Time elapsed: 16.75 min
Epoch: 037/150 | Batch 0000/0092 | Cost: 5.4942
Time elapsed: 17.20 min
Epoch: 038/150 | Batch 0000/0092 | Cost: 5.3825
Time elapsed: 17.69 min
Epoch: 039/150 | Batch 0000/0092 | Cost: 5.4557
Time elapsed: 18.15 min
Epoch: 040/150 | Batch 0000/0092 | Cost: 5.4534
Time elapsed: 18.66 min
Epoch: 041/150 | Batch 0000/0092 | Cost: 5.2443
Time elapsed: 19.08 min
Epoch: 042/150 | Batch 0000/0092 | Cost: 5.2351
Time elapsed: 19.57 min
Epoch: 043/150 | Batch 0000/0092 | Cost: 5.1354
Time elapsed: 20.02 min
Epoch: 044/150 | Batch 0000/0092 | Cost: 5.1245
Time elapsed: 20.50 min
Epoch: 045/150 | Batch 0000/0092 | Cost: 5.0352
Time elapsed: 20.96 min
Epoch: 046/150 | Batch 0000/0092 | Cost: 4.7361
Time elapsed: 21.47 min
Epoch: 047/150 | Batch 0000/0092 | Cost: 4.6973
Time elapsed: 21.93 min
Epoch: 048/150 | Batch 0000/0092 | Cost: 4.6416
Time elapsed: 22.44 min
Epoch: 049/150 | Batch 0000/0092 | Cost: 4.6076
Time elapsed: 22.89 min
Epoch: 050/150 | Batch 0000/0092 | Cost: 4.5119
Time elapsed: 23.37 min
Epoch: 051/150 | Batch 0000/0092 | Cost: 4.2692
Time elapsed: 23.82 min
Epoch: 052/150 | Batch 0000/0092 | Cost: 4.2506
Time elapsed: 24.33 min
Epoch: 053/150 | Batch 0000/0092 | Cost: 4.2682
Time elapsed: 24.79 min
Epoch: 054/150 | Batch 0000/0092 | Cost: 4.7041
Time elapsed: 25.30 min
Epoch: 055/150 | Batch 0000/0092 | Cost: 3.9781
Time elapsed: 25.75 min
Epoch: 056/150 | Batch 0000/0092 | Cost: 4.4825
Time elapsed: 26.23 min
Epoch: 057/150 | Batch 0000/0092 | Cost: 3.8956
Time elapsed: 26.70 min
Epoch: 058/150 | Batch 0000/0092 | Cost: 3.8620
Time elapsed: 27.17 min
Epoch: 059/150 | Batch 0000/0092 | Cost: 4.3340
Time elapsed: 27.64 min
Epoch: 060/150 | Batch 0000/0092 | Cost: 3.6278
Time elapsed: 28.13 min
Epoch: 061/150 | Batch 0000/0092 | Cost: 3.8466
Time elapsed: 28.61 min
Epoch: 062/150 | Batch 0000/0092 | Cost: 4.0128
Time elapsed: 29.06 min
Epoch: 063/150 | Batch 0000/0092 | Cost: 3.6341
Time elapsed: 29.54 min
Epoch: 064/150 | Batch 0000/0092 | Cost: 3.7518
Time elapsed: 29.99 min
Epoch: 065/150 | Batch 0000/0092 | Cost: 3.4808
Time elapsed: 30.47 min
Epoch: 066/150 | Batch 0000/0092 | Cost: 3.8870
Time elapsed: 30.94 min
Epoch: 067/150 | Batch 0000/0092 | Cost: 3.3818
Time elapsed: 31.42 min
Epoch: 068/150 | Batch 0000/0092 | Cost: 3.2848
Time elapsed: 31.90 min
Epoch: 069/150 | Batch 0000/0092 | Cost: 3.3755
Time elapsed: 32.40 min
Epoch: 070/150 | Batch 0000/0092 | Cost: 3.3649
Time elapsed: 32.87 min
Epoch: 071/150 | Batch 0000/0092 | Cost: 3.3467
Time elapsed: 33.35 min
Epoch: 072/150 | Batch 0000/0092 | Cost: 2.9973
Time elapsed: 33.82 min
Epoch: 073/150 | Batch 0000/0092 | Cost: 3.0707
Time elapsed: 34.29 min
Epoch: 074/150 | Batch 0000/0092 | Cost: 3.3438
Time elapsed: 34.76 min
Epoch: 075/150 | Batch 0000/0092 | Cost: 3.0139
Time elapsed: 35.23 min
Epoch: 076/150 | Batch 0000/0092 | Cost: 3.0865
Time elapsed: 35.70 min
Epoch: 077/150 | Batch 0000/0092 | Cost: 3.1229
Time elapsed: 36.16 min
Epoch: 078/150 | Batch 0000/0092 | Cost: 2.9919
Time elapsed: 36.61 min
Epoch: 079/150 | Batch 0000/0092 | Cost: 3.0086
Time elapsed: 37.09 min
Epoch: 080/150 | Batch 0000/0092 | Cost: 2.8564
Time elapsed: 37.55 min
Epoch: 081/150 | Batch 0000/0092 | Cost: 2.9466
Time elapsed: 38.00 min
Epoch: 082/150 | Batch 0000/0092 | Cost: 2.7265
Time elapsed: 38.45 min
Epoch: 083/150 | Batch 0000/0092 | Cost: 2.8810
Time elapsed: 38.91 min
Epoch: 084/150 | Batch 0000/0092 | Cost: 2.7061
Time elapsed: 39.37 min
Epoch: 085/150 | Batch 0000/0092 | Cost: 2.6701
Time elapsed: 39.87 min
Epoch: 086/150 | Batch 0000/0092 | Cost: 2.6754
Time elapsed: 40.30 min
Epoch: 087/150 | Batch 0000/0092 | Cost: 2.6844
Time elapsed: 40.79 min
Epoch: 088/150 | Batch 0000/0092 | Cost: 2.6610
Time elapsed: 41.23 min
Epoch: 089/150 | Batch 0000/0092 | Cost: 2.9059
Time elapsed: 41.72 min
Epoch: 090/150 | Batch 0000/0092 | Cost: 2.6932
Time elapsed: 42.16 min
Epoch: 091/150 | Batch 0000/0092 | Cost: 2.4559
Time elapsed: 42.64 min
Epoch: 092/150 | Batch 0000/0092 | Cost: 2.6640
Time elapsed: 43.11 min
Epoch: 093/150 | Batch 0000/0092 | Cost: 2.5860
Time elapsed: 43.59 min
Epoch: 094/150 | Batch 0000/0092 | Cost: 2.6070
Time elapsed: 44.06 min
Epoch: 095/150 | Batch 0000/0092 | Cost: 2.4515
Time elapsed: 44.52 min
Epoch: 096/150 | Batch 0000/0092 | Cost: 2.5023
Time elapsed: 44.97 min
Epoch: 097/150 | Batch 0000/0092 | Cost: 2.3886
Time elapsed: 45.45 min
Epoch: 098/150 | Batch 0000/0092 | Cost: 2.6237
Time elapsed: 45.90 min
Epoch: 099/150 | Batch 0000/0092 | Cost: 2.3156
Time elapsed: 46.37 min
Epoch: 100/150 | Batch 0000/0092 | Cost: 2.2186
Time elapsed: 46.85 min
Epoch: 101/150 | Batch 0000/0092 | Cost: 2.4205
Time elapsed: 47.32 min
Epoch: 102/150 | Batch 0000/0092 | Cost: 2.4243
Time elapsed: 47.79 min
Epoch: 103/150 | Batch 0000/0092 | Cost: 2.4262
Time elapsed: 48.23 min
Epoch: 104/150 | Batch 0000/0092 | Cost: 2.4243
Time elapsed: 48.69 min
Epoch: 105/150 | Batch 0000/0092 | Cost: 2.1756
Time elapsed: 49.15 min
Epoch: 106/150 | Batch 0000/0092 | Cost: 2.1816
Time elapsed: 49.61 min
Epoch: 107/150 | Batch 0000/0092 | Cost: 2.3446
Time elapsed: 50.07 min
Epoch: 108/150 | Batch 0000/0092 | Cost: 2.2174
Time elapsed: 50.51 min
Epoch: 109/150 | Batch 0000/0092 | Cost: 2.2063
Time elapsed: 50.99 min
Epoch: 110/150 | Batch 0000/0092 | Cost: 2.3621
Time elapsed: 51.47 min
Epoch: 111/150 | Batch 0000/0092 | Cost: 2.2048
Time elapsed: 51.93 min
Epoch: 112/150 | Batch 0000/0092 | Cost: 1.9002
Time elapsed: 52.39 min
Epoch: 113/150 | Batch 0000/0092 | Cost: 2.3146
Time elapsed: 52.88 min
Epoch: 114/150 | Batch 0000/0092 | Cost: 2.2218
Time elapsed: 53.32 min
Epoch: 115/150 | Batch 0000/0092 | Cost: 2.5772
Time elapsed: 53.79 min
Epoch: 116/150 | Batch 0000/0092 | Cost: 1.9954
Time elapsed: 54.22 min
Epoch: 117/150 | Batch 0000/0092 | Cost: 2.2189
Time elapsed: 54.72 min
Epoch: 118/150 | Batch 0000/0092 | Cost: 2.0534
Time elapsed: 55.17 min
Epoch: 119/150 | Batch 0000/0092 | Cost: 2.1909
Time elapsed: 55.67 min
Epoch: 120/150 | Batch 0000/0092 | Cost: 1.9588
Time elapsed: 56.10 min
Epoch: 121/150 | Batch 0000/0092 | Cost: 1.8609
Time elapsed: 56.57 min
Epoch: 122/150 | Batch 0000/0092 | Cost: 2.2825
Time elapsed: 57.04 min
Epoch: 123/150 | Batch 0000/0092 | Cost: 2.2175
Time elapsed: 57.55 min
Epoch: 124/150 | Batch 0000/0092 | Cost: 2.0279
Time elapsed: 57.98 min
Epoch: 125/150 | Batch 0000/0092 | Cost: 1.9375
Time elapsed: 58.46 min
Epoch: 126/150 | Batch 0000/0092 | Cost: 2.0164
Time elapsed: 58.89 min
Epoch: 127/150 | Batch 0000/0092 | Cost: 2.1515
Time elapsed: 59.39 min
Epoch: 128/150 | Batch 0000/0092 | Cost: 1.9873
Time elapsed: 59.83 min
Epoch: 129/150 | Batch 0000/0092 | Cost: 1.8686
Time elapsed: 60.30 min
Epoch: 130/150 | Batch 0000/0092 | Cost: 1.9796
Time elapsed: 60.73 min
Epoch: 131/150 | Batch 0000/0092 | Cost: 1.7672
Time elapsed: 61.23 min
Epoch: 132/150 | Batch 0000/0092 | Cost: 1.9022
Time elapsed: 61.70 min
Epoch: 133/150 | Batch 0000/0092 | Cost: 1.8617
Time elapsed: 62.19 min
Epoch: 134/150 | Batch 0000/0092 | Cost: 1.7341
Time elapsed: 62.66 min
Epoch: 135/150 | Batch 0000/0092 | Cost: 1.7973
Time elapsed: 63.17 min
Epoch: 136/150 | Batch 0000/0092 | Cost: 1.7751
Time elapsed: 63.63 min
Epoch: 137/150 | Batch 0000/0092 | Cost: 1.9271
Time elapsed: 64.14 min
Epoch: 138/150 | Batch 0000/0092 | Cost: 1.6380
Time elapsed: 64.59 min
Epoch: 139/150 | Batch 0000/0092 | Cost: 1.7169
Time elapsed: 65.07 min
Epoch: 140/150 | Batch 0000/0092 | Cost: 1.8063
Time elapsed: 65.53 min
Epoch: 141/150 | Batch 0000/0092 | Cost: 1.8708
Time elapsed: 66.03 min
Epoch: 142/150 | Batch 0000/0092 | Cost: 1.4449
Time elapsed: 66.48 min
Epoch: 143/150 | Batch 0000/0092 | Cost: 1.8047
Time elapsed: 66.95 min
Epoch: 144/150 | Batch 0000/0092 | Cost: 1.9332
Time elapsed: 67.40 min
Epoch: 145/150 | Batch 0000/0092 | Cost: 1.8951
Time elapsed: 67.85 min
Epoch: 146/150 | Batch 0000/0092 | Cost: 1.6895
Time elapsed: 68.33 min
Epoch: 147/150 | Batch 0000/0092 | Cost: 1.7324
Time elapsed: 68.77 min
Epoch: 148/150 | Batch 0000/0092 | Cost: 1.6677
Time elapsed: 69.24 min
Epoch: 149/150 | Batch 0000/0092 | Cost: 1.6663
Time elapsed: 69.71 min
Epoch: 150/150 | Batch 0000/0092 | Cost: 1.7063
Time elapsed: 70.18 min

Evaluation

In [21]:
model.eval()
with torch.set_grad_enabled(False):  # save memory during inference

    train_mae, train_mse = compute_mae_and_mse(model, train_loader,
                                               device=DEVICE)
    test_mae, test_mse = compute_mae_and_mse(model, test_loader,
                                             device=DEVICE)

    s = 'MAE/RMSE: | Train: %.2f/%.2f | Test: %.2f/%.2f' % (
        train_mae, torch.sqrt(train_mse), test_mae, torch.sqrt(test_mse))
    print(s)

s = 'Total Training Time: %.2f min' % ((time.time() - start_time)/60)
print(s)
MAE/RMSE: | Train: 0.65/1.13 | Test: 3.91/5.40
Total Training Time: 70.77 min
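
To get an age prediction in years for an individual image, the predicted label has to be shifted back by 18 (the offset applied when the labels were created). A minimal sketch, assuming the trained model, test_dataset, and DEVICE from above:

model.eval()
with torch.no_grad():
    img, label, _ = test_dataset[0]
    logits, probas = model(img.unsqueeze(0).to(DEVICE))
    predicted_label = torch.sum(probas > 0.5, dim=1).item()
print('predicted age: %d | true age: %d' % (predicted_label + 18, label + 18))
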
In [22]:
%watermark -iv
numpy       1.15.4
pandas      0.23.4
torch       1.1.0
PIL.Image   5.3.0