In [2]:
import numpy as np
import numpy.random as nr
import matplotlib.pyplot as pl
%matplotlib inline

# This notebook is based on an excellent tutorial by Kostis Gourgoulias (http://kgourgou.me/)

# Specify size of plot
pl.rcParams['figure.figsize'] = (12.0, 10.0)

Playing around with the linear perceptron algorithm

The linear perceptron algorithm can be used to classify data points according to pre-selected classes. The idea is to find a curve (or hyperplane) that separates points of different classes. Once we have the curve, we can use it to decide whether a future point belongs to class A or class B based on where it lies with respect to the curve (above or below it).

Now, let's generate a collection of points and paint them according to a line: points above the line are blue, points below are green.
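The decision rule just described can be sketched in a few lines. This is a minimal, hypothetical `classify` helper (not part of the notebook's code), using the same line y = 0.8x + 0.2 that the next cell picks and the same label convention (below the line is 1, above is -1):

```python
# Classify a point by which side of a fixed line it falls on.
a, b = 0.8, 0.2
f = lambda x: a * x + b

def classify(point):
    """Return 1 if the point lies below the line y = a*x + b, -1 if above."""
    x, y = point
    return 1 if f(x) > y else -1

print(classify((0.5, 0.1)))   # f(0.5) = 0.6 > 0.1, point is below -> 1
print(classify((0.1, 0.9)))   # f(0.1) = 0.28 < 0.9, point is above -> -1
```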

In [3]:
# Generate some points
N = 100
xn = nr.rand(N,2)

x = np.linspace(0,1)

# Pick a line 
#a, b = nr.rand(), nr.rand()
a, b = 0.8, 0.2
f = lambda x : a*x + b

fig = pl.figure()
figa = pl.gca()

pl.plot(xn[:,0],xn[:,1],'bo')
pl.plot(x,f(x),'r')

# Linearly separate the points by the line
yn = np.zeros([N,1])

for i in range(N):
    if f(xn[i, 0]) > xn[i, 1]:
        # Point is below the line
        yn[i] = 1
        pl.plot(xn[i, 0], xn[i, 1], 'go')
    else:
        # Point is above the line
        yn[i] = -1
        
        
pl.legend(['Above','Separator','Below'],loc=0)
pl.title('Selected points with their separating line.')
#figa.axes.get_xaxis().set_visible(False)
#figa.axes.get_yaxis().set_visible(False)
Out[3]:
<matplotlib.text.Text at 0x7f5d96b7e050>

The curve naturally separates the space into two regions, one of green points and one of blue points. Thus, if I am given a new point, I can assign it a color based on where it is with respect to the curve. It is really that simple.

What is not so simple is to find the curve given the points. However, if the points are linearly separable, i.e. if a line exists that does the job, then I can just move a line around until I get it to the correct position. This is what the linear perceptron algorithm is doing.
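One such "move the line" adjustment can be made concrete with the standard perceptron update rule, w ← w + y·(1, x₁, x₂), applied to a misclassified point. The names below are illustrative, and this is a single-step sketch rather than the full training loop:

```python
import numpy as np

# One perceptron update on a misclassified point, using the augmented
# weight vector w = (bias, w1, w2) and augmented features (1, x1, x2).
w = np.zeros(3)
x = np.array([1.0, 0.5, 0.9])   # augmented point (1, x1, x2)
y = -1                           # true label of the point

predict = lambda w, x: np.sign(w @ x)

if predict(w, x) != y:
    # Nudge the boundary toward classifying x correctly.
    w = w + y * x

print(w)                 # -> [-1.  -0.5 -0.9]
print(predict(w, x))     # the point is now classified correctly: -1.0
```

After the update, w·x = -(1 + 0.25 + 0.81) < 0, so the point lands on the correct side; each such step rotates/shifts the line toward fixing one mistake.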

In [16]:
def perceptron(xn, yn, max_iter=1000, w=None):
    '''
        A very simple implementation of the perceptron algorithm for two-dimensional data.

        Given points (x,y) with x in R^{2} and y in {-1,1}, the perceptron learning algorithm searches for the best
        line that separates the data points according to the different classes defined in y.

        Input:
            xn : Data points, an Nx2 array.
            yn : Classification of the previous data points, an Nx1 array with entries in {-1,1}.
            max_iter : Maximum number of iterations (optional).
            w  : Initial vector of parameters (optional).

        Output:
            w : Parameters of the best line, y = a*x + b, that linearly separates the data.

        Note:
            Convergence will be slower than expected, since this implementation picks points
            to update without a specific plan (randomly). This is enough for a demonstration, not
            so good for actual work.
    '''
    if w is None:
        # Avoid a mutable default argument: a shared np.zeros(3) default would
        # carry state over between calls.
        w = np.zeros(3)

    N = xn.shape[0]

    # Separating curve
    f = lambda x: np.sign(w[0] + w[1] * x[0] + w[2] * x[1])

    for _ in range(max_iter):
        i = nr.randint(N)  # try a random sample from the dataset
        print(i, xn[i, 0], xn[i, 1], f(xn[i, :]), yn[i])
        if yn[i] != f(xn[i, :]):  # If not classified correctly, adjust the line to account for that point.
            w[0] = w[0] + yn[i]  # the first weight is effectively the bias
            w[1] = w[1] + yn[i] * xn[i, 0]
            w[2] = w[2] + yn[i] * xn[i, 1]

    return w

w = perceptron(xn, yn, max_iter=5)
30 0.879833009026 0.8233558434 0.0 [ 1.]
19 0.174161776555 0.597111911245 1.0 [-1.]
97 0.045875351594 0.423093427763 1.0 [-1.]
97 0.045875351594 0.423093427763 -1.0 [-1.]
21 0.840331401342 0.629225436542 -1.0 [ 1.]

Now that we have an implementation, let's see how close it gets.

In [15]:
w = perceptron(xn, yn)

# Using weights w to compute a,b for a line y=a*x+b
bnew = -w[0]/w[2];
anew = -w[1]/w[2];
y = lambda x: anew * x + bnew;

# Computing the colors for the points
sep_color = (yn+1)/2.0;

pl.figure();
figa = pl.gca()

pl.scatter(xn[:,0],xn[:,1],c=sep_color, s=30)
pl.plot(x,y(x),'b--',label='Line from perceptron implementation.')
pl.plot(x,f(x),'r',label='Original line.')
pl.legend()

pl.title('Comparison between the linear separator and the perceptron approximation.')
[per-iteration debug output from the print call inside perceptron, truncated]
56 0.353738600475 0.00756391261736 1.0 [ 1.]
25 0.952509705642 0.0608161332277 1.0 [ 1.]
58 0.843473715833 0.142090447691 1.0 [ 1.]
34 0.771153929748 0.790737142338 -1.0 [ 1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
89 0.576238788847 0.30669210164 1.0 [ 1.]
90 0.306303132227 0.508438341563 1.0 [-1.]
81 0.381810773963 0.0490773641201 1.0 [ 1.]
78 0.776495471284 0.404143870584 1.0 [ 1.]
31 0.353749480424 0.222712938048 1.0 [ 1.]
60 0.683969845028 0.409171762867 1.0 [ 1.]
16 0.88702319713 0.0768101145928 1.0 [ 1.]
10 0.184646086615 0.258210259342 1.0 [ 1.]
14 0.45751773429 0.168761153349 1.0 [ 1.]
46 0.605808326697 0.870041821958 -1.0 [-1.]
33 0.794230305109 0.591895838 1.0 [ 1.]
58 0.843473715833 0.142090447691 1.0 [ 1.]
56 0.353738600475 0.00756391261736 1.0 [ 1.]
80 0.848160387085 0.581767032246 1.0 [ 1.]
72 0.642924042961 0.274031947171 1.0 [ 1.]
51 0.163476432452 0.201643499134 1.0 [ 1.]
95 0.00496862368636 0.979399577186 -1.0 [-1.]
78 0.776495471284 0.404143870584 1.0 [ 1.]
91 0.0890732533084 0.00244232163674 1.0 [ 1.]
19 0.174161776555 0.597111911245 -1.0 [-1.]
25 0.952509705642 0.0608161332277 1.0 [ 1.]
21 0.840331401342 0.629225436542 1.0 [ 1.]
56 0.353738600475 0.00756391261736 1.0 [ 1.]
78 0.776495471284 0.404143870584 1.0 [ 1.]
80 0.848160387085 0.581767032246 1.0 [ 1.]
22 0.999495269627 0.373928049906 1.0 [ 1.]
48 0.371056092459 0.163945871083 1.0 [ 1.]
13 0.734323610514 0.59395724502 1.0 [ 1.]
53 0.548329869102 0.0624212455486 1.0 [ 1.]
52 0.521511147846 0.474250194048 1.0 [ 1.]
55 0.0247822099079 0.791406591631 -1.0 [-1.]
63 0.575638390701 0.934658331896 -1.0 [-1.]
65 0.724016071598 0.868408371425 -1.0 [-1.]
57 0.293664761481 0.729828763353 -1.0 [-1.]
97 0.045875351594 0.423093427763 -1.0 [-1.]
23 0.513922567866 0.152461236624 1.0 [ 1.]
97 0.045875351594 0.423093427763 -1.0 [-1.]
50 0.991234656298 0.976972742406 -1.0 [ 1.]
8 0.8265617107 0.852185434141 1.0 [ 1.]
53 0.548329869102 0.0624212455486 1.0 [ 1.]
92 0.779426089377 0.893706883543 1.0 [-1.]
11 0.239075832595 0.37766247145 -1.0 [ 1.]
49 0.0107333010178 0.855353886455 -1.0 [-1.]
83 0.143100616452 0.47401273593 -1.0 [-1.]
57 0.293664761481 0.729828763353 -1.0 [-1.]
97 0.045875351594 0.423093427763 -1.0 [-1.]
94 0.461860723844 0.457741457449 1.0 [ 1.]
41 0.81173758157 0.668956447099 1.0 [ 1.]
17 0.616746606008 0.691849572172 1.0 [ 1.]
24 0.967621042854 0.346893379289 1.0 [ 1.]
55 0.0247822099079 0.791406591631 -1.0 [-1.]
77 0.496264957964 0.845378385805 -1.0 [-1.]
25 0.952509705642 0.0608161332277 1.0 [ 1.]
25 0.952509705642 0.0608161332277 1.0 [ 1.]
8 0.8265617107 0.852185434141 1.0 [ 1.]
38 0.571551605706 0.624716789126 1.0 [ 1.]
23 0.513922567866 0.152461236624 1.0 [ 1.]
79 0.597297307701 0.823493915838 -1.0 [-1.]
30 0.879833009026 0.8233558434 1.0 [ 1.]
78 0.776495471284 0.404143870584 1.0 [ 1.]
66 0.452572628372 0.622648553801 1.0 [-1.]
23 0.513922567866 0.152461236624 1.0 [ 1.]
51 0.163476432452 0.201643499134 1.0 [ 1.]
33 0.794230305109 0.591895838 1.0 [ 1.]
49 0.0107333010178 0.855353886455 -1.0 [-1.]
13 0.734323610514 0.59395724502 1.0 [ 1.]
3 0.489900278109 0.0361629135948 1.0 [ 1.]
60 0.683969845028 0.409171762867 1.0 [ 1.]
71 0.736917167 0.0831379698757 1.0 [ 1.]
77 0.496264957964 0.845378385805 -1.0 [-1.]
51 0.163476432452 0.201643499134 1.0 [ 1.]
53 0.548329869102 0.0624212455486 1.0 [ 1.]
76 0.768955390969 0.903366052993 -1.0 [-1.]
82 0.535520226808 0.534943007047 -1.0 [ 1.]
2 0.272429858097 0.0996488660833 1.0 [ 1.]
38 0.571551605706 0.624716789126 1.0 [ 1.]
22 0.999495269627 0.373928049906 1.0 [ 1.]
0 0.112686349571 0.595819718002 -1.0 [-1.]
17 0.616746606008 0.691849572172 1.0 [ 1.]
71 0.736917167 0.0831379698757 1.0 [ 1.]
56 0.353738600475 0.00756391261736 1.0 [ 1.]
22 0.999495269627 0.373928049906 1.0 [ 1.]
31 0.353749480424 0.222712938048 1.0 [ 1.]
18 0.232510784476 0.0380953109131 1.0 [ 1.]
21 0.840331401342 0.629225436542 1.0 [ 1.]
92 0.779426089377 0.893706883543 1.0 [-1.]
17 0.616746606008 0.691849572172 -1.0 [ 1.]
64 0.0697482249646 0.239849688301 1.0 [ 1.]
62 0.554190410985 0.732632567499 -1.0 [-1.]
59 0.694236822751 0.960496982481 -1.0 [-1.]
52 0.521511147846 0.474250194048 1.0 [ 1.]
40 0.390781201264 0.693559596939 -1.0 [-1.]
11 0.239075832595 0.37766247145 1.0 [ 1.]
79 0.597297307701 0.823493915838 -1.0 [-1.]
6 0.67151033146 0.7554830267 1.0 [-1.]
18 0.232510784476 0.0380953109131 1.0 [ 1.]
46 0.605808326697 0.870041821958 -1.0 [-1.]
12 0.987358760495 0.695492327125 1.0 [ 1.]
75 0.683534546985 0.0629488637525 1.0 [ 1.]
27 0.281856192374 0.30011411922 1.0 [ 1.]
99 0.746488035775 0.171535146624 1.0 [ 1.]
12 0.987358760495 0.695492327125 1.0 [ 1.]
76 0.768955390969 0.903366052993 -1.0 [-1.]
73 0.290734591188 0.424810284665 -1.0 [ 1.]
84 0.511960485424 0.492128947069 1.0 [ 1.]
32 0.469096656183 0.552402853665 1.0 [ 1.]
29 0.990887462299 0.940005389776 1.0 [ 1.]
59 0.694236822751 0.960496982481 -1.0 [-1.]
99 0.746488035775 0.171535146624 1.0 [ 1.]
15 0.56293522383 0.541372509145 1.0 [ 1.]
75 0.683534546985 0.0629488637525 1.0 [ 1.]
93 0.370136828682 0.918092060306 -1.0 [-1.]
67 0.8570472191 0.386760791026 1.0 [ 1.]
5 0.919137310532 0.405530139276 1.0 [ 1.]
59 0.694236822751 0.960496982481 -1.0 [-1.]
1 0.892296748113 0.177735339239 1.0 [ 1.]
95 0.00496862368636 0.979399577186 -1.0 [-1.]
81 0.381810773963 0.0490773641201 1.0 [ 1.]
77 0.496264957964 0.845378385805 -1.0 [-1.]
79 0.597297307701 0.823493915838 -1.0 [-1.]
8 0.8265617107 0.852185434141 1.0 [ 1.]
96 0.271118145609 0.402583553886 1.0 [ 1.]
59 0.694236822751 0.960496982481 -1.0 [-1.]
57 0.293664761481 0.729828763353 -1.0 [-1.]
12 0.987358760495 0.695492327125 1.0 [ 1.]
72 0.642924042961 0.274031947171 1.0 [ 1.]
35 0.511014542106 0.656675802281 -1.0 [-1.]
74 0.866475955985 0.6472912176 1.0 [ 1.]
12 0.987358760495 0.695492327125 1.0 [ 1.]
70 0.647294095653 0.79905420063 -1.0 [-1.]
94 0.461860723844 0.457741457449 1.0 [ 1.]
30 0.879833009026 0.8233558434 1.0 [ 1.]
46 0.605808326697 0.870041821958 -1.0 [-1.]
32 0.469096656183 0.552402853665 1.0 [ 1.]
1 0.892296748113 0.177735339239 1.0 [ 1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
57 0.293664761481 0.729828763353 -1.0 [-1.]
68 0.354333276088 0.930061890577 -1.0 [-1.]
15 0.56293522383 0.541372509145 1.0 [ 1.]
72 0.642924042961 0.274031947171 1.0 [ 1.]
81 0.381810773963 0.0490773641201 1.0 [ 1.]
68 0.354333276088 0.930061890577 -1.0 [-1.]
76 0.768955390969 0.903366052993 -1.0 [-1.]
72 0.642924042961 0.274031947171 1.0 [ 1.]
69 0.686860222636 0.567885278963 1.0 [ 1.]
28 0.0806789269102 0.0659012909087 1.0 [ 1.]
54 0.923614619505 0.907369666747 1.0 [ 1.]
51 0.163476432452 0.201643499134 1.0 [ 1.]
35 0.511014542106 0.656675802281 -1.0 [-1.]
57 0.293664761481 0.729828763353 -1.0 [-1.]
12 0.987358760495 0.695492327125 1.0 [ 1.]
37 0.115846816632 0.656502340474 -1.0 [-1.]
51 0.163476432452 0.201643499134 1.0 [ 1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
58 0.843473715833 0.142090447691 1.0 [ 1.]
2 0.272429858097 0.0996488660833 1.0 [ 1.]
84 0.511960485424 0.492128947069 1.0 [ 1.]
48 0.371056092459 0.163945871083 1.0 [ 1.]
93 0.370136828682 0.918092060306 -1.0 [-1.]
4 0.294849676429 0.477137560987 -1.0 [-1.]
52 0.521511147846 0.474250194048 1.0 [ 1.]
56 0.353738600475 0.00756391261736 1.0 [ 1.]
65 0.724016071598 0.868408371425 -1.0 [-1.]
7 0.0652883845189 0.186617724961 1.0 [ 1.]
96 0.271118145609 0.402583553886 1.0 [ 1.]
53 0.548329869102 0.0624212455486 1.0 [ 1.]
88 0.18145704597 0.453838269468 -1.0 [-1.]
17 0.616746606008 0.691849572172 1.0 [ 1.]
34 0.771153929748 0.790737142338 1.0 [ 1.]
2 0.272429858097 0.0996488660833 1.0 [ 1.]
43 0.930071803683 0.668773150417 1.0 [ 1.]
91 0.0890732533084 0.00244232163674 1.0 [ 1.]
99 0.746488035775 0.171535146624 1.0 [ 1.]
38 0.571551605706 0.624716789126 1.0 [ 1.]
46 0.605808326697 0.870041821958 -1.0 [-1.]
16 0.88702319713 0.0768101145928 1.0 [ 1.]
25 0.952509705642 0.0608161332277 1.0 [ 1.]
77 0.496264957964 0.845378385805 -1.0 [-1.]
91 0.0890732533084 0.00244232163674 1.0 [ 1.]
8 0.8265617107 0.852185434141 1.0 [ 1.]
77 0.496264957964 0.845378385805 -1.0 [-1.]
1 0.892296748113 0.177735339239 1.0 [ 1.]
28 0.0806789269102 0.0659012909087 1.0 [ 1.]
85 0.93641898673 0.192563260228 1.0 [ 1.]
77 0.496264957964 0.845378385805 -1.0 [-1.]
35 0.511014542106 0.656675802281 -1.0 [-1.]
15 0.56293522383 0.541372509145 1.0 [ 1.]
8 0.8265617107 0.852185434141 1.0 [ 1.]
89 0.576238788847 0.30669210164 1.0 [ 1.]
36 0.135587591581 0.611580384356 -1.0 [-1.]
68 0.354333276088 0.930061890577 -1.0 [-1.]
42 0.764365085549 0.571570338649 1.0 [ 1.]
76 0.768955390969 0.903366052993 -1.0 [-1.]
86 0.196119994596 0.0285439625255 1.0 [ 1.]
56 0.353738600475 0.00756391261736 1.0 [ 1.]
98 0.953279166254 0.0164775745715 1.0 [ 1.]
80 0.848160387085 0.581767032246 1.0 [ 1.]
78 0.776495471284 0.404143870584 1.0 [ 1.]
75 0.683534546985 0.0629488637525 1.0 [ 1.]
49 0.0107333010178 0.855353886455 -1.0 [-1.]
26 0.120163718153 0.761187461488 -1.0 [-1.]
9 0.173135674059 0.293647642084 1.0 [ 1.]
75 0.683534546985 0.0629488637525 1.0 [ 1.]
68 0.354333276088 0.930061890577 -1.0 [-1.]
28 0.0806789269102 0.0659012909087 1.0 [ 1.]
19 0.174161776555 0.597111911245 -1.0 [-1.]
51 0.163476432452 0.201643499134 1.0 [ 1.]
77 0.496264957964 0.845378385805 -1.0 [-1.]
49 0.0107333010178 0.855353886455 -1.0 [-1.]
19 0.174161776555 0.597111911245 -1.0 [-1.]
4 0.294849676429 0.477137560987 -1.0 [-1.]
56 0.353738600475 0.00756391261736 1.0 [ 1.]
28 0.0806789269102 0.0659012909087 1.0 [ 1.]
13 0.734323610514 0.59395724502 1.0 [ 1.]
31 0.353749480424 0.222712938048 1.0 [ 1.]
90 0.306303132227 0.508438341563 -1.0 [-1.]
20 0.329549870376 0.969372894276 -1.0 [-1.]
92 0.779426089377 0.893706883543 -1.0 [-1.]
65 0.724016071598 0.868408371425 -1.0 [-1.]
77 0.496264957964 0.845378385805 -1.0 [-1.]
35 0.511014542106 0.656675802281 -1.0 [-1.]
51 0.163476432452 0.201643499134 1.0 [ 1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
50 0.991234656298 0.976972742406 1.0 [ 1.]
75 0.683534546985 0.0629488637525 1.0 [ 1.]
84 0.511960485424 0.492128947069 1.0 [ 1.]
12 0.987358760495 0.695492327125 1.0 [ 1.]
34 0.771153929748 0.790737142338 1.0 [ 1.]
28 0.0806789269102 0.0659012909087 1.0 [ 1.]
65 0.724016071598 0.868408371425 -1.0 [-1.]
78 0.776495471284 0.404143870584 1.0 [ 1.]
66 0.452572628372 0.622648553801 -1.0 [-1.]
31 0.353749480424 0.222712938048 1.0 [ 1.]
21 0.840331401342 0.629225436542 1.0 [ 1.]
9 0.173135674059 0.293647642084 1.0 [ 1.]
4 0.294849676429 0.477137560987 -1.0 [-1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
25 0.952509705642 0.0608161332277 1.0 [ 1.]
53 0.548329869102 0.0624212455486 1.0 [ 1.]
45 0.513588266283 0.519593913864 1.0 [ 1.]
38 0.571551605706 0.624716789126 1.0 [ 1.]
17 0.616746606008 0.691849572172 1.0 [ 1.]
15 0.56293522383 0.541372509145 1.0 [ 1.]
98 0.953279166254 0.0164775745715 1.0 [ 1.]
55 0.0247822099079 0.791406591631 -1.0 [-1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
46 0.605808326697 0.870041821958 -1.0 [-1.]
49 0.0107333010178 0.855353886455 -1.0 [-1.]
43 0.930071803683 0.668773150417 1.0 [ 1.]
97 0.045875351594 0.423093427763 -1.0 [-1.]
0 0.112686349571 0.595819718002 -1.0 [-1.]
97 0.045875351594 0.423093427763 -1.0 [-1.]
8 0.8265617107 0.852185434141 1.0 [ 1.]
3 0.489900278109 0.0361629135948 1.0 [ 1.]
79 0.597297307701 0.823493915838 -1.0 [-1.]
46 0.605808326697 0.870041821958 -1.0 [-1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
37 0.115846816632 0.656502340474 -1.0 [-1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
82 0.535520226808 0.534943007047 1.0 [ 1.]
72 0.642924042961 0.274031947171 1.0 [ 1.]
8 0.8265617107 0.852185434141 1.0 [ 1.]
91 0.0890732533084 0.00244232163674 1.0 [ 1.]
42 0.764365085549 0.571570338649 1.0 [ 1.]
51 0.163476432452 0.201643499134 1.0 [ 1.]
67 0.8570472191 0.386760791026 1.0 [ 1.]
50 0.991234656298 0.976972742406 1.0 [ 1.]
24 0.967621042854 0.346893379289 1.0 [ 1.]
31 0.353749480424 0.222712938048 1.0 [ 1.]
90 0.306303132227 0.508438341563 -1.0 [-1.]
47 0.765856295185 0.973839204595 -1.0 [-1.]
71 0.736917167 0.0831379698757 1.0 [ 1.]
24 0.967621042854 0.346893379289 1.0 [ 1.]
94 0.461860723844 0.457741457449 1.0 [ 1.]
62 0.554190410985 0.732632567499 -1.0 [-1.]
85 0.93641898673 0.192563260228 1.0 [ 1.]
23 0.513922567866 0.152461236624 1.0 [ 1.]
7 0.0652883845189 0.186617724961 1.0 [ 1.]
73 0.290734591188 0.424810284665 1.0 [ 1.]
68 0.354333276088 0.930061890577 -1.0 [-1.]
47 0.765856295185 0.973839204595 -1.0 [-1.]
90 0.306303132227 0.508438341563 -1.0 [-1.]
59 0.694236822751 0.960496982481 -1.0 [-1.]
7 0.0652883845189 0.186617724961 1.0 [ 1.]
37 0.115846816632 0.656502340474 -1.0 [-1.]
3 0.489900278109 0.0361629135948 1.0 [ 1.]
78 0.776495471284 0.404143870584 1.0 [ 1.]
63 0.575638390701 0.934658331896 -1.0 [-1.]
5 0.919137310532 0.405530139276 1.0 [ 1.]
20 0.329549870376 0.969372894276 -1.0 [-1.]
20 0.329549870376 0.969372894276 -1.0 [-1.]
82 0.535520226808 0.534943007047 1.0 [ 1.]
26 0.120163718153 0.761187461488 -1.0 [-1.]
10 0.184646086615 0.258210259342 1.0 [ 1.]
93 0.370136828682 0.918092060306 -1.0 [-1.]
10 0.184646086615 0.258210259342 1.0 [ 1.]
37 0.115846816632 0.656502340474 -1.0 [-1.]
94 0.461860723844 0.457741457449 1.0 [ 1.]
88 0.18145704597 0.453838269468 -1.0 [-1.]
11 0.239075832595 0.37766247145 1.0 [ 1.]
9 0.173135674059 0.293647642084 1.0 [ 1.]
38 0.571551605706 0.624716789126 1.0 [ 1.]
89 0.576238788847 0.30669210164 1.0 [ 1.]
24 0.967621042854 0.346893379289 1.0 [ 1.]
95 0.00496862368636 0.979399577186 -1.0 [-1.]
20 0.329549870376 0.969372894276 -1.0 [-1.]
10 0.184646086615 0.258210259342 1.0 [ 1.]
97 0.045875351594 0.423093427763 -1.0 [-1.]
12 0.987358760495 0.695492327125 1.0 [ 1.]
74 0.866475955985 0.6472912176 1.0 [ 1.]
11 0.239075832595 0.37766247145 1.0 [ 1.]
12 0.987358760495 0.695492327125 1.0 [ 1.]
Out[15]:
<matplotlib.text.Text at 0x7f5d968e8090>

Not bad, right? The algorithm should have converged to a good approximation of the separating line. If it didn't, try running the last cell again. Remember that this implementation updates randomly picked points, so some runs will converge more slowly than others.

Also, note that the line separating the points is not unique for the dataset we have available. Would it still be non-unique if we had all of the possible information? My guess is that this depends on the data.

In any case, it can be proven that, as long as the data is linearly separable, this process converges in a finite number of steps, a fact that is quite powerful on its own. We may be good at finding patterns in $\mathbb{R}^2$, but what about $\mathbb{R}^d$? Is there a way to show that a collection of points can be separated by "inserting" planes between them? We take a look at that next.
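For reference, the update at the heart of the algorithm is tiny. Below is a minimal sketch, not the notebook's own `perceptron` function (which picks points at random), but a deterministic variant that sweeps the points in order so its behaviour is reproducible. The weights are $w = (w_0, w_1, w_2)$ and a point $(x_1, x_2)$ is predicted as $\text{sign}(w_0 + w_1 x_1 + w_2 x_2)$:

```python
import numpy as np

def perceptron_sketch(xn, yn, max_sweeps=100):
    # Hypothetical re-implementation for illustration; sweeps points in
    # order rather than sampling them randomly like the notebook's version.
    X = np.hstack([np.ones((xn.shape[0], 1)), xn])  # prepend a bias coordinate
    w = np.zeros(3)
    for _ in range(max_sweeps):
        mistakes = 0
        for x, y in zip(X, yn):
            if np.sign(x @ w) != y:   # misclassified (or on the boundary)
                w += y * x            # nudge w toward the correct side
                mistakes += 1
        if mistakes == 0:             # a full clean sweep: data is separated
            break
    return w
```

On linearly separable data the number of updates is bounded, so the loop terminates with every point classified correctly.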

What if the dataset is not linearly separable?

If the data is not separable by a line, then, in most cases, this process will not work perfectly. Some points will be classified correctly and some will not. Then, we can think about two more questions.

  1. How much will it cost us if we misclassify a point? Is the cost an extra spam e-mail in our inbox, or is it a patient not getting the correct medicine?
  2. If we don't want to take the risk with a line, which is the best curve to use instead?

We are not going to answer those here. Instead, I will just show you an example where the classification can fail when the points are not separable by a line. Then, if you download this notebook, you can try other curves and see what happens.

Remember that, in our case, given a point $x=(x_1,x_2)$, classification is done according to $\text{sign}(f(x_1)-x_2)$, which can either be -1 or 1.
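Spelled out for a single new point, the rule looks like this (a small sketch, using the line chosen at the start of the notebook and a hypothetical point):

```python
import numpy as np

f = lambda x: 0.8 * x + 0.2   # the separating line picked earlier

x1, x2 = 0.5, 0.3             # a hypothetical new point
label = np.sign(f(x1) - x2)   # +1: point is below the line, -1: above it
```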

In [16]:
# Change this function to select points with respect to a different curve.
f = lambda x: x**2

x = np.linspace(0,1)

# Generate some data points to play with.
N = 100
xn = nr.rand(N,2)

fig = pl.figure()
figa = pl.gca()

# Plot the separating curve
pl.plot(x,f(x),'r')

# Classify based on f(x): +1 below the curve, -1 above it
yn = np.sign(f(xn[:,0])-xn[:,1])

# Map the labels {-1, 1} to {0, 1} for colouring
colors = (yn+1)/2.0

pl.scatter(xn[:,0],xn[:,1],c=colors,s=30)
pl.title('Classification based on f(x)')

In this example, we can see that $x^2$ colours some points as black and others as white. Let us find a linear separator now.

In [19]:
# Try the perceptron with that data.
w = perceptron(xn, yn, max_iter=1000)

# Re-scale the weights to recover the line y = anew*x + bnew
bnew = -w[0]/w[2]
anew = -w[1]/w[2]
y = lambda x: anew * x + bnew

figa = pl.gca()
pl.scatter(xn[:,0],xn[:,1],c=colors,s=50)
pl.title('Classification based on f(x)')

pl.plot(x,f(x),'r',label='Separating curve')
pl.plot(x,y(x),'b--',label='Line from perceptron algorithm')

pl.legend()
Out[19]:
<matplotlib.legend.Legend at 0x7f6b106f0f10>

In this case, our classifier cannot get all the points right (white points should be above the blue line, black points below it). The situation will likely get worse as we add more and more points.
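One way to see the failure concretely is to count how often a straight line disagrees with the parabola's labels. A sketch, using a hypothetical line y = x - 0.25 as a stand-in for whatever the perceptron produced:

```python
import numpy as np

rng = np.random.default_rng(0)
xn = rng.random((200, 2))
yn = np.sign(xn[:, 0]**2 - xn[:, 1])          # true labels from the parabola

# Hypothetical line standing in for the perceptron's output
a, b = 1.0, -0.25
pred = np.sign(a * xn[:, 0] + b - xn[:, 1])   # line-based prediction

error_rate = np.mean(pred != yn)
```

The errors live in the sliver between the two curves; no choice of line can shrink that region to zero area.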