Currently, two spatial entropy measures have been implemented.
Here we compare Shannon's entropy with these spatial entropy measures.
Given a random variable $X$, with possible outcomes $x_i$, each with probability $P(x_i)$, the entropy $H(X)$ of $X$ is as follows:

$$H(X) = -\sum_{i=1}^{n} P(x_i)\log\big(P(x_i)\big)$$

%config InlineBackend.figure_format = 'retina'
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from spatialentropy import leibovici_entropy, altieri_entropy
from utils import plot_points, random_data, cluster_data
Define a window where we want to put our points; it's a 100 × 100 square.
window = [(0,0),(100,0),(100,100),(0,100)]
# generate random types
types = np.random.choice(["lion", "tiger", "rabbit"], 200)
# completely random data
pp = random_data(window, 200)
# cluster data
cpp1, cpp1_types = cluster_data(window, 200, 10, 10, types)
# more cluster data
cpp2, cpp2_types = cluster_data(window, 200, 3, 5, types)
fig, (ax1, ax2, ax3) = plt.subplots(1,3, figsize=(15,5))
plot_points(pp, types, ax=ax1, title="Random")
plot_points(cpp1, cpp1_types, ax=ax2, title="Cluster")
plot_points(cpp2, cpp2_types, ax=ax3, title="More cluster")
[Figure: three scatter plots of the point patterns — Random, Cluster, More cluster]
Calculate Shannon's entropy
from scipy.stats import entropy
from collections import Counter
c = np.asarray(list(Counter(types).values()))
c = c / c.sum()
shan_ent = entropy(c)
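As a quick sanity check against the formula above, the same value can be computed directly from the normalized counts (scipy's `entropy` uses the natural logarithm by default):

# manual check: H(X) = -sum p * log(p), natural log
manual_shan = -np.sum(c * np.log(c))
print(shan_ent, manual_shan)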
Calculate Leibovici's entropy
lb_ent1 = leibovici_entropy(pp, types, d=10)
lb_ent2 = leibovici_entropy(cpp1, cpp1_types, d=10)
lb_ent3 = leibovici_entropy(cpp2, cpp2_types, d=10)
print("Entropy:", lb_ent1.entropy, lb_ent2.entropy, lb_ent3.entropy)
Entropy: 1.7767668644374706 1.4681168019374684 1.4196408288471043
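The Leibovici entropy depends on the distance threshold `d`. As an illustrative check (not part of the original comparison, and with arbitrary threshold values), you can recompute it at a few thresholds to see how sensitive the result is:

# recompute Leibovici entropy on the random pattern at several thresholds
for d in (5, 10, 20):
    print(d, leibovici_entropy(pp, types, d=d).entropy)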
Calculate Altieri's entropy
ae1 = altieri_entropy(pp, types, cut=10)
ae2 = altieri_entropy(cpp1, cpp1_types, cut=10)
ae3 = altieri_entropy(cpp2, cpp2_types, cut=10)
print("Entropy:", ae1.entropy, ae2.entropy, ae3.entropy)
print("Mutual_info:", ae1.mutual_info, ae2.mutual_info, ae3.mutual_info)
print("Residue:", ae1.residue, ae2.residue, ae3.residue)
Entropy: 2.186109277106866 1.9671075741966284 1.5818311588856502
Mutual_info: 0.028982719745068133 0.4586378752332529 0.524962330283742
Residue: 2.157126557361798 1.5084696989633755 1.0568688286019081
fig, (ax1, ax2, ax3) = plt.subplots(1,3, figsize=(15,5))
plot_points(pp, types, ax=ax1,
title=f"Shannon entropy: {shan_ent:.3f}\nLeibovici entropy: {lb_ent1.entropy:.3f}\nAltieri entropy: {ae1.entropy:.3f}")
plot_points(cpp1, cpp1_types, ax=ax2,
title=f"Shannon entropy: {shan_ent:.3f}\nLeibovici entropy: {lb_ent2.entropy:.3f}\nAltieri entropy: {ae2.entropy:.3f}")
plot_points(cpp2, cpp2_types, ax=ax3,
title=f"Shannon entropy: {shan_ent:.3f}\nLeibovici entropy: {lb_ent3.entropy:.3f}\nAltieri entropy: {ae3.entropy:.3f}")
[Figure: the three point patterns annotated with their entropy values — the Shannon entropy is 1.096 for all three, while the Leibovici and Altieri entropies decrease as the points become more clustered]
If you are interested in the details, feel free to read on.
I try to explain it more simply than the papers do, but you should refer to the papers for the full details.
A new variable $Z$ is introduced, defined as the co-occurrences of types across space.
For example, suppose we have $I$ types of cells. A co-occurrence of any two types is a pair $(x_i, x_{i'})$, and the number of all possible pairs is denoted $R$.
If order is preserved, $R = P^2_I = I^2$; if the pairs are unordered, $R = C^2_I = (I^2 + I)/2$.
In the more general case of co-occurrences of $m$ cells, $R = P^m_I = I^m$ or $R = C^m_I$.
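For instance, with the three types used above (lion, tiger, rabbit) and pairs ($m = 2$), these counts work out as follows (illustrative only):

# I = 3 types, pairs (m = 2)
I = 3
R_ordered = I ** 2               # ordered pairs with repetition -> 9
R_unordered = (I ** 2 + I) // 2  # unordered pairs with repetition -> 6
print(R_ordered, R_unordered)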
At a user-defined distance $d$, only co-occurrences within distance $d$ are taken into consideration.

$$H(Z \mid d) = \sum_{r=1}^{I^m} p(z_r \mid d)\log\left(\frac{1}{p(z_r \mid d)}\right)$$

Reference:
Leibovici, D. G., Claramunt, C., Le Guyader, D., & Brosset, D. (2014). Local and global spatio-temporal entropy indices based on distance-ratios and co-occurrences distributions. International Journal of Geographical Information Science, 28(5), 1061-1084. link
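To make the definition concrete, here is a minimal, simplified sketch of the $m = 2$ case: count unordered type pairs among points closer than $d$, then take the Shannon entropy of their distribution. This is a didactic sketch only, not the `spatialentropy` implementation; the library's conventions (ordered vs. unordered pairs, log base, distance handling) may differ.

from itertools import combinations

def leibovici_sketch(points, point_types, d):
    # count unordered type co-occurrences among point pairs within distance d
    points = np.asarray(points, dtype=float)
    cooccur = Counter()
    for i, j in combinations(range(len(points)), 2):
        if np.linalg.norm(points[i] - points[j]) <= d:
            cooccur[tuple(sorted((point_types[i], point_types[j])))] += 1
    p = np.array(list(cooccur.values()), dtype=float)
    p = p / p.sum()
    # Shannon entropy of the co-occurrence distribution
    return float(-(p * np.log(p)).sum())

print(leibovici_sketch(pp, types, d=10))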
This introduces another new variable $W$. Each $w_k$ represents a distance interval, e.g. $[0,2], [2,4], [4,10], [10,\dots]$, for $k = 1, \dots, K$.
The purpose of this entropy is to decompose $H(Z)$ into spatial mutual information $MI(Z, W)$ and spatial residual entropy $H(Z)_W$:

$$H(Z) = \sum_{r=1}^{R} p(z_r)\log\left(\frac{1}{p(z_r)}\right) = MI(Z, W) + H(Z)_W$$

Reference:
Altieri, L., Cocchi, D., & Roli, G. (2018). A new approach to spatial entropy measures. Environmental and ecological statistics, 25(1), 95-110. link
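A quick way to see this decomposition with the objects computed above is to check that the mutual information and the residual entropy add back up to the total entropy (attribute names as printed earlier in this notebook):

# H(Z) = MI(Z, W) + H(Z)_W, up to floating-point error
print(np.isclose(ae1.entropy, ae1.mutual_info + ae1.residue))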