Fighting Wildfires Using a Cloud-based Supercomputer

Let's use the Descartes Labs platform to assess and quantify the degree of damage a wildfire inflicts upon the land. This demo focuses on the 2016 Soberanes Fire on California's beautiful Monterey coast near Big Sur, at the time the most expensive wildfire suppression effort in U.S. history.

In [1]:
# Let's start with importing all the toolboxes we will need for both analysis and visualization
from IPython.display import display, Image
from mpl_toolkits.axes_grid1 import make_axes_locatable
from copy import deepcopy
from skimage import measure
from skimage.morphology import dilation #, erosion, opening, closing, white_tophat
from skimage.morphology import disk
from pprint import pprint
from pylab import *
import descarteslabs as dl
import numpy as np
import matplotlib.pyplot as plt

%matplotlib inline

Retrieve imagery

In [2]:
# First let's define the AOI as the county in which the Soberanes fire occurred

# Find potential AOI matches
matches = dl.places.find('california_monterey')
# The first one looks good, so let's make that our area of interest.
aoi = matches[0]
shape = dl.places.shape(aoi['slug'], geom='low')
pprint(matches)
[{u'bbox': [-122.051878, 35.789276, -120.213979, 36.919683],
  u'id': 102080859,
  u'name': u'Monterey',
  u'path': u'continent:north-america_country:united-states_region:california_district:central-coast_county:monterey',
  u'placetype': u'county',
  u'slug': u'north-america_united-states_california_central-coast_monterey'}]
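The place record above is a plain dict, so its fields can be inspected directly. As a quick sanity check (not part of the original demo), the `bbox` entry, ordered `[min_lon, min_lat, max_lon, max_lat]`, gives the extent of Monterey county in degrees:

```python
# The bbox from the Places result above: [min_lon, min_lat, max_lon, max_lat]
bbox = [-122.051878, 35.789276, -120.213979, 36.919683]

# Extent of the AOI in degrees of longitude and latitude
width_deg = bbox[2] - bbox[0]
height_deg = bbox[3] - bbox[1]
```

At this latitude that works out to roughly 165 km east-west, a reasonable size for a county-scale AOI.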
In [3]:
# Check for imagery around the fire's start date of July 22nd

feature_collection = dl.metadata.search(const_id='L8', start_time='2016-07-22', end_time='2016-07-31',
                                        limit=10, place=aoi['slug'])
# As the variable name implies, this returns a FeatureCollection GeoJSON dictionary.
# Its 'features' are the available scenes.

# The 'id' associated with each feature is a unique identifier into our imagery database.
# In this case there are 4 L8 scenes from adjoining WRS rows.
print([f['id'] for f in feature_collection['features']])

# Now check for imagery in late October, i.e., towards the end of the fire
feature_collection = dl.metadata.search(const_id='L8', start_time='2016-10-15', end_time='2016-10-31',
                                        limit=10, place=aoi['slug'])

print([f['id'] for f in feature_collection['features']])

# Let's print out all the available bands we have for Landsat 8
L8_bands = dl.raster.get_bands_by_constellation("L8").keys()
# Even though the 'bai' listed here stands for Burn Area Index, we need a normalized version of this index.
# We get the NBR (normalized burn ratio) by using the swir2 and nir bands.
[u'meta_LC80430342016204_v1', u'meta_LC80430352016204_v1', u'meta_LC80440342016211_v1', u'meta_LC80440352016211_v1']
[u'meta_LC80440342016291_v1', u'meta_LC80440352016291_v1', u'meta_LC80420352016293_v1', u'meta_LC80430342016300_v1', u'meta_LC80430352016300_v1']
[u'thermal', u'ndvi', u'cloud', u'blue', u'qa_water', u'visual_cloud_mask', u'ndwi2', u'ndwi1', u'qa_snow', u'qa_cirrus', u'aerosol', u'red', u'rsqrt', u'nir', u'alpha', u'ndwi', u'evi', u'swir1', u'swir2', u'bright', u'green', u'qa_cloud', u'cirrus', u'bai']
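The normalized burn ratio mentioned above is a simple band arithmetic: NBR = (nir - swir2) / (nir + swir2). Healthy vegetation reflects strongly in the near-infrared and weakly in shortwave infrared, so it scores high; freshly burned land does the opposite. A minimal sketch with hypothetical reflectance values standing in for the real Landsat 8 bands:

```python
import numpy as np

# Hypothetical reflectance arrays standing in for the 'nir' and 'swir2' bands
# (in the demo, the real values come out of dl.raster.ndarray).
nir = np.array([[0.45, 0.40],
                [0.10, 0.05]])
swir2 = np.array([[0.10, 0.12],
                  [0.30, 0.35]])

# Normalized Burn Ratio: high for healthy vegetation, low/negative for burn scars.
nbr = (nir - swir2) / (nir + swir2)
```

Differencing the pre-fire and post-fire NBR rasters (dNBR) is the standard way to quantify burn severity, which is why we pulled scenes from both July and October.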
In [4]:
# Collect the id's for each feature
ids = [f['id'] for f in feature_collection['features']]
# Rasterize the features.
#  * Select red, green, blue, alpha
#  * Scale the incoming data so that [0, 2048] maps onto the full byte range
#  * Choose an output type of "Byte" (uint8)
#  * Choose 60m resolution
#  * Apply a cutline of Monterey county
arr, meta = dl.raster.ndarray(
    ids,
    bands=['red', 'green', 'blue', 'alpha'],
    scales=[[0, 2048], [0, 2048], [0, 2048], None],
    data_type='Byte',
    resolution=60,
    cutline=shape['geometry'],
)

# Note: A value of 1 in the alpha channel signifies where there is valid data.
# We use this throughout the majority of our imagery as a standard way of specifying
# valid or nodata regions. This is particularly helpful if a value of 0 in a particular
# band has meaning, rather than specifying a lack of data.
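The point in the comment above is worth making concrete: because 0 can be a legitimate pixel value, the alpha band, not the data bands, tells you where the raster is valid. A toy sketch (the 2x2 RGBA stack below is made up, but mirrors the layout `dl.raster.ndarray` returns):

```python
import numpy as np

# Toy 2x2 RGBA stack mimicking the rasterized output.
# alpha == 1 marks valid pixels; alpha == 0 marks nodata (e.g. outside the cutline).
arr = np.array([
    [[10, 20, 30, 1], [0, 0, 0, 1]],     # second pixel: genuinely black, but valid
    [[99, 99, 99, 0], [40, 50, 60, 1]],  # first pixel: nodata, whatever its band values
], dtype=np.uint8)

alpha = arr[..., 3]
valid = alpha == 1

# Mask a band with the alpha channel so nodata pixels drop out of any statistics.
masked_red = np.ma.masked_array(arr[..., 0], mask=~valid)
```

Note that the all-zero pixel survives the mask while the nodata pixel is excluded, which is exactly the distinction the alpha band exists to preserve.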
In [5]:
# We'll use matplotlib to make a quick plot of the image.
plt.figure(figsize=[16, 16])
plt.axis('off')
plt.imshow(arr)