Open In Colab

Fast AI Image Classification using Mixed Precision and Progressive Resizing


This notebook is based on a Jupyter notebook by Aayush Agrawal, modified to run in Colab, a free platform for deep learning research and education.

This uses the Plant Village dataset to try to achieve world-class results as quickly as possible using fastai's mixed precision capabilities as well as the progressive resizing techniques suggested by Jeremy Howard in the Practical Deep Learning for Coders forum and class discussions.

The dataset used is the PlantVillage Dataset.

The PlantVillage dataset contains images of plant leaves covering 38 disease classes commonly found on crops, plus one background class from Stanford's open dataset of background images - DAGS.

The dataset was downloaded from the links given in this GitHub repo, as the original notebook suggested.

The actual preprocessed dataset can be downloaded from here.

It might also be useful to compare the results achieved here with the other results collated by the original author, Marko Arsenovic, for the book Deep Learning for Plant Diseases: Detection and Saliency Map Visualisation.

Conclusions (TL;DR)

  • Possible State of the Art Accuracy Record (as of March 2019) : 99.8001 percent (see this for comparison)
  • Low number of training epochs: 36
  • Low memory requirements: Max GPU Memory 11 GB, Average 3-4 GB for first 2 stages
  • Average to Medium Training Time on low-end hardware (Nvidia K80): 6.36 hrs
  • Platform: Colab with GPU
  • Yes, you can get SOTA results on free DL platforms using FastAI, mixed precision, progressive resizing and simple fine tuning using the fastai learning rate finder.

  • I have not been able to find updates to the published accuracy records since 2017, so this record still needs to be verified

Some Notes about the Notebook

  • In order to be able to reproduce the steps quickly, the data and the intermediate models were backed up into and restored from Google Drive (using Colab's integration with google drive)
  • Due to Colab's memory resource allocation limits, the kernel was always restarted after every training run, after running the Learning Rate finder, and even after running the classification interpretation.
  • Also, since the notebook was run over several sessions, the files (data and intermediate models) were deleted when each session was terminated, so before each session (or even after every training run), the intermediate models were backed up into Google Drive.
  • In addition to .pth formats, the intermediate and final models were also exported as .pkl files and backed up into Google Drive so they can be used for inference later.

Setup Colab Environment for Fastai

Install latest fastai version, create standard directories (/content/data and /content/models) and delete the default sample_data directory created by Colab.


In [0]:
# !curl | bash
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   321  100   321    0     0   2018      0 --:--:-- --:--:-- --:--:--  2018
Updating fastai...
featuretools 0.4.1 has requirement pandas>=0.23.0, but you'll have pandas 0.22.0 which is incompatible.
datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.
albumentations 0.1.12 has requirement imgaug<0.2.7,>=0.2.5, but you'll have imgaug 0.2.8 which is incompatible.

Run magic directives to autoreload and inline matplots

In [0]:
# %reload_ext autoreload
# %autoreload 2
# %matplotlib inline

Connect Colab to Google

Since I already created a shared file link (using Google Drive) to the original dataset, I can use this link to later copy the data into my Colab data directory.

The nice thing about the shared file link is that it doesn't use up my own personal Google Drive quota (as the dataset is quite large even in compressed format).

In [0]:
# from google.colab import drive
# drive.mount('/content/gdrive',force_remount=True)

Importing Fast AI library

In [0]:
from fastai import *
from fastai.vision import *
from fastai.metrics import error_rate, accuracy

Set some useful variables and utility functions

I store my backup data and models in the folders /fastai_v3/data and /fastai_v3/models so that I can reproduce my work and also sometimes reduce the need to rebuild the data and models I am studying.

In [0]:
gdrive = Path('/content/gdrive/My Drive/fastai_v3')
In [0]:
escdrive = lambda x : x.as_posix().replace(' ','\ ') # useful utility to escape spaces
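
As a quick standalone check (using the same gdrive path defined above), the helper turns the space in Colab's mounted-drive path into an escaped space, so the path survives interpolation into a ! shell command:

```python
from pathlib import Path

# Same utility as above: escape spaces so the path can be passed to shell commands
escdrive = lambda x: x.as_posix().replace(' ', '\\ ')

p = Path('/content/gdrive/My Drive/fastai_v3')
print(escdrive(p))  # /content/gdrive/My\ Drive/fastai_v3
```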

Load the dataset

In [0]:
dataset = 'PlantVillage'

Copy compressed dataset into the work data directory

In [0]:
# !cp {escdrive(gdrive/'data'/(dataset + '.tar.gz'))} {Config.data_path()}

Decompress the data (omit tar's verbose option, as the file listing can be very large and might cause the page to hang)

In [0]:
# !tar xzf {(Config.data_path()/(dataset + '.tar.gz')).as_posix()} -C {Config.data_path()}

Setup Default Metrics, Path, Image Size and Transforms

Set metrics for tracking stats while training the model

In [0]:
metrics = [error_rate, accuracy]

Set the path of the dataset

In [0]:
## Declaring path of dataset
path_img = Config.data_path()/dataset; path_img

Set default image size to 224

In [0]:
default_size = 224
Setup transforms

In [0]:
ds_tfms = get_transforms()

Prepare Labeled Datasets

Setup labelled datasets

In [0]:
## Loading data and Normalizing data based on Image net parameters
src  = (ImageList.from_folder(path_img)
        .split_by_folder(train='train', valid='val')
        .label_from_folder())

Skip the following Databunch setup section if you are doubling the image size after a kernel restart

Setup Databunch

Set batchsize

In [0]:
bs = 256 # 256 might be possible with mixed precision

Set image size (progressive resizing means training initially with small image sizes)

In [0]:
size = default_size // 2 # 112 as initial size
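
The progressive-resizing schedule used in this notebook (train at 112, then retrain at 224) can be written as a small helper; `make_size_schedule` is a name of my own, not a fastai function:

```python
# Hypothetical helper sketching the progressive-resizing schedule:
# start at a fraction of the target size and double until reaching it.
def make_size_schedule(final_size, n_stages=2):
    """Return image sizes from smallest to largest, ending at final_size."""
    return [final_size // (2 ** (n_stages - 1 - i)) for i in range(n_stages)]

print(make_size_schedule(224))  # [112, 224]
```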
In [0]:
data = (src
        .transform(tfms=ds_tfms, size=size)
        .databunch(bs=bs)
        .normalize(imagenet_stats))

Show Data

In [0]:
data.show_batch(rows=3, figsize=(10,8))

Data Classes

In [0]:
data.classes
['Apple___Apple_scab', 'Apple___Black_rot', 'Apple___Cedar_apple_rust', 'Apple___healthy', 'Blueberry___healthy', 'Cherry_(including_sour)___Powdery_mildew', 'Cherry_(including_sour)___healthy', 'Corn_(maize)___Cercospora_leaf_spot Gray_leaf_spot', 'Corn_(maize)___Common_rust_', 'Corn_(maize)___Northern_Leaf_Blight', 'Corn_(maize)___healthy', 'Grape___Black_rot', 'Grape___Esca_(Black_Measles)', 'Grape___Leaf_blight_(Isariopsis_Leaf_Spot)', 'Grape___healthy', 'Orange___Haunglongbing_(Citrus_greening)', 'Peach___Bacterial_spot', 'Peach___healthy', 'Pepper,_bell___Bacterial_spot', 'Pepper,_bell___healthy', 'Potato___Early_blight', 'Potato___Late_blight', 'Potato___healthy', 'Raspberry___healthy', 'Soybean___healthy', 'Squash___Powdery_mildew', 'Strawberry___Leaf_scorch', 'Strawberry___healthy', 'Tomato___Bacterial_spot', 'Tomato___Early_blight', 'Tomato___Late_blight', 'Tomato___Leaf_Mold', 'Tomato___Septoria_leaf_spot', 'Tomato___Spider_mites Two-spotted_spider_mite', 'Tomato___Target_Spot', 'Tomato___Tomato_Yellow_Leaf_Curl_Virus', 'Tomato___Tomato_mosaic_virus', 'Tomato___healthy', 'background']
In [0]:
len(data.classes), data.c
(39, 39)

Create Model

Create model with ResNET50

In [0]:
## To create a ResNET 50 with pretrained weights
learn = cnn_learner(data, models.resnet50, metrics=metrics)
Downloading: "" to /root/.torch/models/resnet50-19c8e357.pth
102502400it [00:01, 99988865.92it/s]

Convert to use mixed precision

In [0]:
## convert to use mixed precision
learn = learn.to_fp16()
In [0]:
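
Half precision stores each value in 2 bytes instead of 4, which is where the memory savings come from; the cost is a 10-bit mantissa. This plain-Python illustration uses the struct module's IEEE-754 half-precision format and is independent of fastai:

```python
import struct

# fp16 takes 2 bytes per value, fp32 takes 4 - roughly halving activation memory
print(struct.calcsize('e'), struct.calcsize('f'))  # 2 4

# The trade-off: 0.1 cannot be represented exactly in a 10-bit mantissa
half = struct.unpack('e', struct.pack('e', 0.1))[0]
print(half)  # close to, but not exactly, 0.1
```

In practice fastai also keeps an fp32 master copy of the weights and scales the loss, so training remains numerically stable despite the reduced precision.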

Train Stage 1 Model Using Frozen Model and Default Max LR

Train using 5 epochs, max learning = 1e-3 (default)

In [0]:
learn.fit_one_cycle(5, max_lr=1e-3)
Total time: 29:58

epoch train_loss valid_loss error_rate accuracy time
0 0.580583 0.231167 0.070611 0.929389 06:12
1 0.219099 0.114973 0.039622 0.960378 05:51
2 0.129548 0.073774 0.023083 0.976918 05:56
3 0.083857 0.053838 0.018720 0.981280 05:58
4 0.068667 0.050738 0.017085 0.982915 06:00

Backup Stage 1 Model

Having a consistent naming scheme for the model makes it easy to version and review what it contains and how it was built.
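
The naming scheme encodes dataset, stage, precision, image size, batch size, architecture, learning rate, and cycle count. A hypothetical helper (my own sketch, not part of the notebook's code) that assembles names in this format:

```python
# Hypothetical helper assembling the model-name convention used in this notebook
def build_model_name(dataset, stage, size, bs, arch, lr, cycles):
    return f'{dataset}-mixedprecision-{stage}-fp16-sz{size}-bs{bs}-{arch}-{lr}-cycle{cycles}'

name = build_model_name('plant-vintage', 'stage1', 112, 256, 'resnet50', 'lr1e3', 5)
print(name)  # plant-vintage-mixedprecision-stage1-fp16-sz112-bs256-resnet50-lr1e3-cycle5
```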

In [0]:
model_name = 'plant-vintage-mixedprecision-stage1-fp16-sz112-bs256-resnet50-lr1e3-cycle5'
In [0]:
learn.save(model_name)
In [0]:
# !cp {(path_img/'models'/(model_name + '.pth')).as_posix()} {escdrive(gdrive/'models')}

Study Stage 1 Model

Review training stats

In [0]:
learn.recorder.plot_losses()

Convert to fp32 for interpretation

In [0]:
learn = learn.to_fp32()

Create interpreter

In [0]:
interp = ClassificationInterpretation.from_learner(learn)
In [0]:
interp.plot_top_losses(4, figsize=(9,9))

See the top-right image - the model may be activating on the background, not the leaf itself.

It may make sense to review the quality of the training data to prevent bias since the model is starting to "learn" the wrong parts for classification.

In [0]:
interp.plot_confusion_matrix(figsize=(20,20), dpi=60)
In [0]:
interp.most_confused(min_val=2)
[('Corn_(maize)___Cercospora_leaf_spot Gray_leaf_spot', …),
 ('Tomato___Target_Spot', 'Tomato___Spider_mites Two-spotted_spider_mite', 9),
 (…, 'Corn_(maize)___Cercospora_leaf_spot Gray_leaf_spot', …),
 ('Tomato___Spider_mites Two-spotted_spider_mite', 'Tomato___Target_Spot', 8),
 ('Grape___Black_rot', 'Grape___Esca_(Black_Measles)', 7),
 ('Tomato___Early_blight', 'Tomato___Target_Spot', 7),
 ('Tomato___Late_blight', 'Tomato___Early_blight', 7),
 ('Apple___healthy', 'Soybean___healthy', 5),
 ('Tomato___Early_blight', 'Tomato___Late_blight', 5),
 ('Tomato___Early_blight', 'Tomato___Bacterial_spot', 4),
 ('Tomato___Late_blight', 'Potato___Late_blight', 4),
 ('Tomato___Target_Spot', 'Tomato___Septoria_leaf_spot', 4),
 ('Apple___healthy', 'Blueberry___healthy', 3),
 ('Grape___Esca_(Black_Measles)', 'Grape___Black_rot', 3),
 ('Tomato___Bacterial_spot', 'Tomato___Late_blight', 3),
 ('Tomato___Bacterial_spot', 'Tomato___Target_Spot', 3),
 ('Tomato___Early_blight', 'Tomato___Spider_mites Two-spotted_spider_mite', 3),
 ('Tomato___Late_blight', 'Tomato___Target_Spot', 3),
 ('Tomato___Leaf_Mold', 'Tomato___Spider_mites Two-spotted_spider_mite', 3),
 ('Tomato___Septoria_leaf_spot', 'Tomato___Target_Spot', 3),
 (…, 'Tomato___Spider_mites Two-spotted_spider_mite', …),
 ('Peach___Bacterial_spot', 'Peach___healthy', 2),
 ('Pepper,_bell___Bacterial_spot', 'Pepper,_bell___healthy', 2),
 ('Pepper,_bell___healthy', 'Orange___Haunglongbing_(Citrus_greening)', 2),
 ('Potato___Late_blight', 'Tomato___Late_blight', 2),
 ('Tomato___Bacterial_spot', 'Soybean___healthy', 2),
 ('Tomato___Bacterial_spot', 'Tomato___Tomato_Yellow_Leaf_Curl_Virus', 2),
 ('Tomato___Early_blight', 'Potato___Early_blight', 2),
 ('Tomato___Late_blight', 'Tomato___Leaf_Mold', 2),
 ('Tomato___Late_blight', 'Tomato___Spider_mites Two-spotted_spider_mite', 2),
 ('Tomato___Septoria_leaf_spot', 'Tomato___Leaf_Mold', 2),
 ('Tomato___Spider_mites Two-spotted_spider_mite', 'Tomato___Leaf_Mold', 2),
 ('Tomato___Target_Spot', 'Tomato___Bacterial_spot', 2),
 ('Tomato___healthy', 'Tomato___Spider_mites Two-spotted_spider_mite', 2),
 ('Tomato___healthy', 'Tomato___Target_Spot', 2)]
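
Most of the remaining confusion is between diseases of the same crop, tomato in particular. A quick tally over a few (actual, predicted, count) entries copied from the output above (just a sketch with collections.Counter, not part of the original notebook):

```python
from collections import Counter

# A few (actual, predicted, count) entries copied from the most-confused output
confused = [
    ('Tomato___Target_Spot', 'Tomato___Spider_mites Two-spotted_spider_mite', 9),
    ('Tomato___Spider_mites Two-spotted_spider_mite', 'Tomato___Target_Spot', 8),
    ('Grape___Black_rot', 'Grape___Esca_(Black_Measles)', 7),
    ('Tomato___Early_blight', 'Tomato___Target_Spot', 7),
    ('Apple___healthy', 'Soybean___healthy', 5),
]

# Group errors by the crop prefix of the actual class (the part before '___')
by_crop = Counter()
for actual, predicted, n in confused:
    by_crop[actual.split('___')[0]] += n

print(by_crop.most_common())  # tomato classes dominate this sample
```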

Continue Training Model

Convert back to mixed precision for learning rate finding and further training

In [0]:
learn = learn.to_fp16()

Prepare Stage 2 (Unfreeze All Layers)

Reload model

In [0]:
learn = learn.load(model_name)

Run LR Finder

In [0]:
learn.lr_find()
LR Finder is complete, type {learner_name}.recorder.plot() to see the graph.
In [0]:
learn.recorder.plot()

Unfreeze the Model for Stage 2

In [0]:
learn.unfreeze()

Train Unfrozen Model (Stage 2) at Half-size (112x112)

In [0]:
learn.fit_one_cycle(4, max_lr=slice(1e-6,5e-4))
Total time: 25:22

epoch train_loss valid_loss error_rate accuracy time
0 0.062152 0.043785 0.013631 0.986369 06:21
1 0.043953 0.032484 0.009724 0.990276 06:22
2 0.030082 0.024337 0.007361 0.992639 06:24
3 0.023557 0.023675 0.007179 0.992821 06:14

Backup Unfrozen Halfsize Model (Stage 2)

In [0]:
model_name = (model_name
              .replace('stage1', 'stage2')
              .replace('lr1e3-cycle5', 'lrs1e6s5e4-cycle4'))
In [0]:
learn.save(model_name)
In [0]:
!cp {(path_img/'models'/(model_name + '.pth')).as_posix()} {escdrive(gdrive/'models')}
In [0]:
learn.export(model_name + '.pkl')
In [0]:
!cp {(path_img/(model_name + '.pkl')).as_posix()} {escdrive(gdrive/'models')}

Review training stats

In [0]:
learn.recorder.plot_losses()

RESTART KERNEL and Reload local variables

Resize Image to Full and Reduce Batch Size

In [0]:
model_name = 'plant-vintage-mixedprecision-stage2-fp16-sz112-bs256-resnet50-lrs1e6s5e4-cycle4'

Copy from gdrive

In [0]:
# !cp {escdrive(gdrive/'models'/(model_name + '.pth'))}  {(path_img/'models').as_posix()}

Setup Databunch with Image Size Full (Default Data)

Reduce batchsize

In [0]:
bs = 128 # half of 256 to fit the larger full-size images in memory
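
Why halve the batch size? Doubling the image side quadruples the pixels per image, so even at bs=128 the activation memory roughly doubles relative to the earlier stages. A back-of-the-envelope sketch (the linear-scaling assumption is mine, not a fastai measurement):

```python
# Rough relative activation memory, assuming it scales linearly with
# pixels per image and with batch size (a simplification).
def rel_activation_memory(size, bs, base_size=112, base_bs=256):
    return (size / base_size) ** 2 * (bs / base_bs)

print(rel_activation_memory(112, 256))  # 1.0 - the stage 1/2 baseline
print(rel_activation_memory(224, 128))  # 2.0 - 4x the pixels at half the batch
```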

Set image size to default (progressive resizing means training initially with small images, then retraining the model on larger ones)

In [0]:
size = default_size  # 224 as progressive size
In [0]: