mlcourse.ai – Open Machine Learning Course

Author: Archit Rungta

Tutorial

Imputing missing data with fancyimpute

Hi folks!

Often in real world applications of data analysis, we run into the problem of missing data. This can happen due to a multitude of reasons such as:

  • The data was compiled from different sources/times
  • Corrupted during storage
  • Certain fields were optional
  • etc.

This notebook has the following sections:

  1. Introduction
  2. The Problem
  3. KNN Imputation
  4. Comparison And Application
  5. Summary
  6. Further Reading

In this tutorial, we look at the problem of missing data in data analytics. Then, we categorize the different types of missing data and briefly discuss the specific issue presented by each specific type. Finally, we look at various methods of handling data imputation and compare their accuracy on a real-world dataset with logistic regression. We also look at the validity of a commonly held assumption about imputation techniques.

Introduction

Broadly, missing data is classified into 3 categories.

  • Missing Completely At Random (MCAR)

    Values in a data set are missing completely at random (MCAR) if the events that lead to any particular data-item being missing are independent both of observable variables and of unobservable parameters of interest, and occur entirely at random

  • Missing At Random (MAR)

    Missing at random (MAR) occurs when the missingness is not random, but where missingness can be fully accounted for by variables where there is complete information

  • Missing Not At Random (MNAR)

    Missing not at random (MNAR) (also known as nonignorable nonresponse) is data that is neither MAR nor MCAR

Data compilation from different sources is an example of MAR while data corruption is an example of MCAR. MNAR is not a problem we can fix with imputation because this is non-ignorable non-response. The only thing we can do about MNAR is to gather more information from different sources or ignore it all-together. As such we are not going to talk about MNAR anymore in this tutorial.

All of the techniques that follow are applicable only for MCAR. However, in real world scenarios, MAR is more common. As such, we will treat MAR as MCAR only which gives a reasonably good approximation in practice.

The Problem

Let's start with a toy example,

\begin{align} \ y & = \sin(x) x\, \text{for $|x|<=6$} \end{align}

In [1]:
import numpy as np                               # vectors and matrices
import pandas as pd                              # tables and data manipulations
import matplotlib.pyplot as plt                  # plots
import seaborn as sns                            # more plots

%matplotlib inline
In [2]:
x = np.linspace(-6,6)
y = np.asarray([x1*np.sin(x1) for x1 in x])
plt.scatter(x,y)
Out[2]:
<matplotlib.collections.PathCollection at 0x1aff64f28d0>

Let's delete some points on random to get an MCAR dataset

In [3]:
missing_fraction = 0.3
indices = np.random.randint(1,len(x)-1, size=int((1-missing_fraction)*len(x)))
x_mcar = x[indices]
y_mcar = y[indices]
In [4]:
plt.scatter(x_mcar,y_mcar)
Out[4]:
<matplotlib.collections.PathCollection at 0x1aff62928d0>

Throughout this tutorial, we will use MSE as an indicator of how good an imputation technique is when we have the original dataset and accuracy on predictions when we don't

In [5]:
from sklearn.metrics import mean_squared_error as mse

Let's try the easiest methods first:

  • Mean
  • Median
In [6]:
y_pred_mean = np.array(y)
for ind in list(set(np.linspace(0,len(x)-1))-set(indices)):
    y_pred_mean[int(ind)] = np.mean(y_mcar)
plt.scatter(x,y_pred_mean)
mse(y_pred_mean,y)
Out[6]:
2.26262965947143
In [7]:
y_pred_median = np.array(y)
for ind in list(set(np.linspace(0,len(x)-1))-set(indices)):
    y_pred_median[int(ind)] = np.median(y_mcar)
plt.scatter(x,y_pred_median)
mse(y_pred_median,y)
Out[7]:
3.2995309240897033

Well, this seems like pretty awful. Let's see what fancyimpute has to offer. Note: You need TensorFlow

In [8]:
!pip install fancyimpute
Requirement already satisfied: fancyimpute in c:\programdata\anaconda3\lib\site-packages (0.4.0)
Requirement already satisfied: scikit-learn>=0.19.1 in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (0.19.1)
Requirement already satisfied: numpy>=1.10 in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (1.14.3)
Requirement already satisfied: tensorflow in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (1.10.0)
Requirement already satisfied: scipy in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (1.1.0)
Requirement already satisfied: six in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (1.11.0)
Requirement already satisfied: np-utils in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (0.5.5.2)
Requirement already satisfied: cvxpy>=1.0.6 in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (1.0.9)
Requirement already satisfied: knnimpute in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (0.1.0)
Requirement already satisfied: keras>=2.0.0 in c:\programdata\anaconda3\lib\site-packages (from fancyimpute) (2.2.2)
Requirement already satisfied: termcolor>=1.1.0 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (1.1.0)
Requirement already satisfied: protobuf>=3.6.0 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (3.6.0)
Requirement already satisfied: setuptools<=39.1.0 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (39.1.0)
Requirement already satisfied: astor>=0.6.0 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (0.7.1)
Requirement already satisfied: tensorboard<1.11.0,>=1.10.0 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (1.10.0)
Requirement already satisfied: grpcio>=1.8.6 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (1.12.1)
Requirement already satisfied: absl-py>=0.1.6 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (0.5.0)
Requirement already satisfied: gast>=0.2.0 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (0.2.0)
Requirement already satisfied: wheel>=0.26 in c:\programdata\anaconda3\lib\site-packages (from tensorflow->fancyimpute) (0.31.1)
Requirement already satisfied: future>=0.16 in c:\programdata\anaconda3\lib\site-packages (from np-utils->fancyimpute) (0.16.0)
Requirement already satisfied: scs>=1.1.3 in c:\programdata\anaconda3\lib\site-packages (from cvxpy>=1.0.6->fancyimpute) (2.0.2)
Requirement already satisfied: multiprocess in c:\programdata\anaconda3\lib\site-packages (from cvxpy>=1.0.6->fancyimpute) (0.70.6.1)
Requirement already satisfied: toolz in c:\programdata\anaconda3\lib\site-packages (from cvxpy>=1.0.6->fancyimpute) (0.9.0)
Requirement already satisfied: osqp in c:\programdata\anaconda3\lib\site-packages (from cvxpy>=1.0.6->fancyimpute) (0.4.1)
Requirement already satisfied: fastcache in c:\programdata\anaconda3\lib\site-packages (from cvxpy>=1.0.6->fancyimpute) (1.0.2)
Requirement already satisfied: ecos>=2 in c:\programdata\anaconda3\lib\site-packages (from cvxpy>=1.0.6->fancyimpute) (2.0.5)
Requirement already satisfied: pyyaml in c:\programdata\anaconda3\lib\site-packages (from keras>=2.0.0->fancyimpute) (3.12)
Requirement already satisfied: h5py in c:\programdata\anaconda3\lib\site-packages (from keras>=2.0.0->fancyimpute) (2.7.1)
Requirement already satisfied: keras_applications==1.0.4 in c:\programdata\anaconda3\lib\site-packages (from keras>=2.0.0->fancyimpute) (1.0.4)
Requirement already satisfied: keras_preprocessing==1.0.2 in c:\programdata\anaconda3\lib\site-packages (from keras>=2.0.0->fancyimpute) (1.0.2)
Requirement already satisfied: markdown>=2.6.8 in c:\programdata\anaconda3\lib\site-packages (from tensorboard<1.11.0,>=1.10.0->tensorflow->fancyimpute) (2.6.11)
Requirement already satisfied: werkzeug>=0.11.10 in c:\programdata\anaconda3\lib\site-packages (from tensorboard<1.11.0,>=1.10.0->tensorflow->fancyimpute) (0.14.1)
Requirement already satisfied: dill>=0.2.8.1 in c:\programdata\anaconda3\lib\site-packages (from multiprocess->cvxpy>=1.0.6->fancyimpute) (0.2.8.2)
Requirement already satisfied: pyreadline>=1.7.1 in c:\programdata\anaconda3\lib\site-packages (from dill>=0.2.8.1->multiprocess->cvxpy>=1.0.6->fancyimpute) (2.1)
You are using pip version 10.0.1, however version 18.1 is available.
You should consider upgrading via the 'python -m pip install --upgrade pip' command.
In [9]:
import fancyimpute
C:\ProgramData\Anaconda3\lib\site-packages\h5py\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
Using TensorFlow backend.
In [10]:
y_pred_knn = np.concatenate((np.array(y).reshape(-1,1), np.array(x).reshape(-1,1)), axis=1)
for ind in indices:
    y_pred_knn[int(ind)] = [float("NaN"), y_pred_knn[int(ind)][1]]
y_pred_knn = fancyimpute.KNN(k=3).fit_transform(y_pred_knn)
Imputing row 1/50 with 0 missing, elapsed time: 0.004
In [11]:
y_pred_knn_2 = [x[0] for x in y_pred_knn]
In [12]:
plt.scatter(x,y_pred_knn_2)
mse(y_pred_knn_2,y)
Out[12]:
0.11433938631068095

As we can see, fancyimpute has performed much better than mean or median methods on this toy dataset.

Next up, we get some in-depth understanding of how the KNN algorithm for fancyimpute works and apply it to some real datasets.

KNN Imputation

In pattern recognition, the k-nearest neighbors algorithm is a non-parametric method used for classification and regression

The assumption behind using KNN for missing values is that a point value can be approximated by the values of the points that are closest to it, based on other variables.

The fancyimpute KNN algorithm works by calculating the k nearest neighbors which have the missing features available and then weights them based on Euclidean distance from the target row. The missing value is then calculated as a weighted mean from these neighboring rows.

Below is an implementation for k = 2. Because we know our data is sorted we can code this much more efficiently. However, this isn't a general implementation. We also ignore the possibility that both of the closest neighbors can be on the same side to reduce the complexity of the code.

In [13]:
y_cust = np.array(y)
for ind in indices:
    low1 = ind-1
    while low1 in indices:
        low1 = low1 - 1
    high1 = ind + 1
    while high1 in indices:
        high1 = high1 + 1
    d1 = 1/(ind - low1)
    d2 = 1/(high1 - ind)
    y_cust[ind] = (d1*y_cust[low1]+d2*y_cust[high1])/(d1+d2)
In [14]:
plt.scatter(x,y_cust)
mse(y_cust,y)
Out[14]:
0.034650998762531186

Comparison and Application

We will use the Pima Indians Diabetes database for our example use case. This is an example of a MAR dataset but we will treat it as MCAR to make the best out of what we have. You can download the data from - https://www.kaggle.com/kumargh/pimaindiansdiabetescsv

In [15]:
df  = pd.read_csv('pima-indians-diabetes.csv',header=None)
In [16]:
df.head()
Out[16]:
0 1 2 3 4 5 6 7 8
0 6 148 72 35 0 33.6 0.627 50 1
1 1 85 66 29 0 26.6 0.351 31 0
2 8 183 64 0 0 23.3 0.672 32 1
3 1 89 66 23 94 28.1 0.167 21 0
4 0 137 40 35 168 43.1 2.288 33 1
  1. Number of times pregnant
  2. Plasma glucose concentration a 2 hours in an oral glucose tolerance test
  3. Diastolic blood pressure (mm Hg)
  4. Triceps skin fold thickness (mm)
  5. 2-Hour serum insulin (mu U/ml)
  6. Body mass index (weight in kg/(height in m)^2)
  7. Diabetes pedigree function
  8. Age (years)
  9. Class variable (0 or 1)

Clearly, a person cannot have triceps sking fold thickness as 0 mm. This is a missing value and we need to replace 0 with NaN to let our algorithms know that it's a missing value.

By reading the descriptions we can be sure that columns 1,2,3,4,5,6 and 7 cannot have zero values. As such, we will mark 0s as missing.

Also, imputing functions work better with scaled features so we will use MinMaxScaler to scale every feature between 0 to 1.

In [17]:
(df[[1,2,3,4,5,6,7]] == 0).sum()
Out[17]:
1      5
2     35
3    227
4    374
5     11
6      0
7      0
dtype: int64
In [18]:
from sklearn.preprocessing import MinMaxScaler

df = pd.DataFrame(data=MinMaxScaler().fit_transform(df.values), columns=df.columns, index=df.index)
df[[1,2,3,4,5,6,7]] = df[[1,2,3,4,5,6,7]].replace(0,float('NaN'))
In [19]:
df.head()
Out[19]:
0 1 2 3 4 5 6 7 8
0 0.352941 0.743719 0.590164 0.353535 NaN 0.500745 0.234415 0.483333 1.0
1 0.058824 0.427136 0.540984 0.292929 NaN 0.396423 0.116567 0.166667 0.0
2 0.470588 0.919598 0.524590 NaN NaN 0.347243 0.253629 0.183333 1.0
3 0.058824 0.447236 0.540984 0.232323 0.111111 0.418778 0.038002 NaN 0.0
4 0.000000 0.688442 0.327869 0.353535 0.198582 0.642325 0.943638 0.200000 1.0

fancyimpute offers many different forms of imputation methods, however, we are only comparing the four mentioned below. You can read about all of these at https://pypi.org/project/fancyimpute/

Now, we will compare Logistic Regression using four different imputation methods:

  • KNN
  • Mean
  • IterativeImputer
  • SoftImpute

We will first construct the dataframe for the bottom three because for KNN we need to find the optimum value of the hyperparameter.

In [20]:
df_mean=pd.DataFrame(data=fancyimpute.SimpleFill().fit_transform(df.values), columns=df.columns, index=df.index)
df_iterative=pd.DataFrame(data=fancyimpute.IterativeImputer().fit_transform(df.values), columns=df.columns, index=df.index)
df_soft=pd.DataFrame(data=fancyimpute.SoftImpute().fit_transform(df.values), columns=df.columns, index=df.index)
[SoftImpute] Max Singular Value of X_init = 31.698472
[SoftImpute] Iter 1: observed MAE=0.017449 rank=9
[SoftImpute] Iter 2: observed MAE=0.017573 rank=9
[SoftImpute] Iter 3: observed MAE=0.017683 rank=9
[SoftImpute] Iter 4: observed MAE=0.017781 rank=9
[SoftImpute] Iter 5: observed MAE=0.017869 rank=9
[SoftImpute] Iter 6: observed MAE=0.017944 rank=9
[SoftImpute] Iter 7: observed MAE=0.018009 rank=9
[SoftImpute] Iter 8: observed MAE=0.018064 rank=9
[SoftImpute] Iter 9: observed MAE=0.018111 rank=9
[SoftImpute] Iter 10: observed MAE=0.018151 rank=9
[SoftImpute] Iter 11: observed MAE=0.018186 rank=9
[SoftImpute] Iter 12: observed MAE=0.018216 rank=9
[SoftImpute] Iter 13: observed MAE=0.018242 rank=9
[SoftImpute] Iter 14: observed MAE=0.018265 rank=9
[SoftImpute] Iter 15: observed MAE=0.018285 rank=9
[SoftImpute] Iter 16: observed MAE=0.018301 rank=9
[SoftImpute] Iter 17: observed MAE=0.018315 rank=9
[SoftImpute] Iter 18: observed MAE=0.018326 rank=9
[SoftImpute] Iter 19: observed MAE=0.018337 rank=9
[SoftImpute] Iter 20: observed MAE=0.018346 rank=9
[SoftImpute] Iter 21: observed MAE=0.018353 rank=9
[SoftImpute] Iter 22: observed MAE=0.018360 rank=9
[SoftImpute] Iter 23: observed MAE=0.018366 rank=9
[SoftImpute] Iter 24: observed MAE=0.018371 rank=9
[SoftImpute] Iter 25: observed MAE=0.018375 rank=9
[SoftImpute] Iter 26: observed MAE=0.018379 rank=9
[SoftImpute] Iter 27: observed MAE=0.018382 rank=9
[SoftImpute] Iter 28: observed MAE=0.018385 rank=9
[SoftImpute] Iter 29: observed MAE=0.018387 rank=9
[SoftImpute] Iter 30: observed MAE=0.018389 rank=9
[SoftImpute] Iter 31: observed MAE=0.018391 rank=9
[SoftImpute] Iter 32: observed MAE=0.018392 rank=9
[SoftImpute] Iter 33: observed MAE=0.018393 rank=9
[SoftImpute] Iter 34: observed MAE=0.018394 rank=9
[SoftImpute] Iter 35: observed MAE=0.018395 rank=9
[SoftImpute] Iter 36: observed MAE=0.018396 rank=9
[SoftImpute] Iter 37: observed MAE=0.018397 rank=9
[SoftImpute] Iter 38: observed MAE=0.018398 rank=9
[SoftImpute] Iter 39: observed MAE=0.018398 rank=9
[SoftImpute] Iter 40: observed MAE=0.018399 rank=9
[SoftImpute] Stopped after iteration 40 for lambda=0.633969
In [21]:
from sklearn.linear_model import LogisticRegression
logisticRegr = LogisticRegression()
validation_split = 0.8
input_columns = [0,1,2,3,4,5,6,7]
In [22]:
logisticRegr.fit(df_mean[:int(len(df)*validation_split)][input_columns], df[:int(len(df)*validation_split)][8].values )
mean_score = logisticRegr.score(df_mean[int(len(df)*validation_split):][input_columns], df[int(len(df)*validation_split):][8].values )
mean_score
Out[22]:
0.7597402597402597
In [23]:
logisticRegr = LogisticRegression()

logisticRegr.fit(df_iterative[:int(len(df)*validation_split)][input_columns], df[:int(len(df)*validation_split)][8].values )
iter_score = logisticRegr.score(df_iterative[int(len(df)*validation_split):][input_columns], df[int(len(df)*validation_split):][8].values )
iter_score
Out[23]:
0.7727272727272727
In [24]:
logisticRegr = LogisticRegression()

logisticRegr.fit(df_soft[:int(len(df)*validation_split)][input_columns], df[:int(len(df)*validation_split)][8].values )
soft_score = logisticRegr.score(df_soft[int(len(df)*validation_split):][input_columns], df[int(len(df)*validation_split):][8].values )
soft_score
Out[24]:
0.7727272727272727
In [25]:
results_knn = []

for k in range(2,30):
    df_knn=pd.DataFrame(data=fancyimpute.KNN(k=k).fit_transform(df.values), columns=df.columns, index=df.index)
    logisticRegr.fit(df_knn[:int(len(df)*validation_split)][input_columns], df[:int(len(df)*validation_split)][8].values )
    results_knn.append(logisticRegr.score(df_knn[int(len(df)*validation_split):][input_columns], df[int(len(df)*validation_split):][8].values ))
Imputing row 1/768 with 1 missing, elapsed time: 0.265
Imputing row 101/768 with 2 missing, elapsed time: 0.272
Imputing row 201/768 with 2 missing, elapsed time: 0.275
Imputing row 301/768 with 3 missing, elapsed time: 0.278
Imputing row 401/768 with 2 missing, elapsed time: 0.282
Imputing row 501/768 with 1 missing, elapsed time: 0.287
Imputing row 601/768 with 1 missing, elapsed time: 0.292
Imputing row 701/768 with 0 missing, elapsed time: 0.297
Imputing row 1/768 with 1 missing, elapsed time: 0.281
Imputing row 101/768 with 2 missing, elapsed time: 0.288
Imputing row 201/768 with 2 missing, elapsed time: 0.293
Imputing row 301/768 with 3 missing, elapsed time: 0.298
Imputing row 401/768 with 2 missing, elapsed time: 0.305
Imputing row 501/768 with 1 missing, elapsed time: 0.309
Imputing row 601/768 with 1 missing, elapsed time: 0.315
Imputing row 701/768 with 0 missing, elapsed time: 0.321
Imputing row 1/768 with 1 missing, elapsed time: 0.256
Imputing row 101/768 with 2 missing, elapsed time: 0.261
Imputing row 201/768 with 2 missing, elapsed time: 0.266
Imputing row 301/768 with 3 missing, elapsed time: 0.270
Imputing row 401/768 with 2 missing, elapsed time: 0.274
Imputing row 501/768 with 1 missing, elapsed time: 0.280
Imputing row 601/768 with 1 missing, elapsed time: 0.284
Imputing row 701/768 with 0 missing, elapsed time: 0.288
Imputing row 1/768 with 1 missing, elapsed time: 0.264
Imputing row 101/768 with 2 missing, elapsed time: 0.269
Imputing row 201/768 with 2 missing, elapsed time: 0.274
Imputing row 301/768 with 3 missing, elapsed time: 0.279
Imputing row 401/768 with 2 missing, elapsed time: 0.283
Imputing row 501/768 with 1 missing, elapsed time: 0.287
Imputing row 601/768 with 1 missing, elapsed time: 0.291
Imputing row 701/768 with 0 missing, elapsed time: 0.295
Imputing row 1/768 with 1 missing, elapsed time: 0.261
Imputing row 101/768 with 2 missing, elapsed time: 0.266
Imputing row 201/768 with 2 missing, elapsed time: 0.271
Imputing row 301/768 with 3 missing, elapsed time: 0.275
Imputing row 401/768 with 2 missing, elapsed time: 0.279
Imputing row 501/768 with 1 missing, elapsed time: 0.283
Imputing row 601/768 with 1 missing, elapsed time: 0.290
Imputing row 701/768 with 0 missing, elapsed time: 0.295
Imputing row 1/768 with 1 missing, elapsed time: 0.266
Imputing row 101/768 with 2 missing, elapsed time: 0.272
Imputing row 201/768 with 2 missing, elapsed time: 0.276
Imputing row 301/768 with 3 missing, elapsed time: 0.280
Imputing row 401/768 with 2 missing, elapsed time: 0.284
Imputing row 501/768 with 1 missing, elapsed time: 0.289
Imputing row 601/768 with 1 missing, elapsed time: 0.293
Imputing row 701/768 with 0 missing, elapsed time: 0.298
Imputing row 1/768 with 1 missing, elapsed time: 0.280
Imputing row 101/768 with 2 missing, elapsed time: 0.285
Imputing row 201/768 with 2 missing, elapsed time: 0.289
Imputing row 301/768 with 3 missing, elapsed time: 0.293
Imputing row 401/768 with 2 missing, elapsed time: 0.296
Imputing row 501/768 with 1 missing, elapsed time: 0.300
Imputing row 601/768 with 1 missing, elapsed time: 0.305
Imputing row 701/768 with 0 missing, elapsed time: 0.309
Imputing row 1/768 with 1 missing, elapsed time: 0.269
Imputing row 101/768 with 2 missing, elapsed time: 0.274
Imputing row 201/768 with 2 missing, elapsed time: 0.279
Imputing row 301/768 with 3 missing, elapsed time: 0.283
Imputing row 401/768 with 2 missing, elapsed time: 0.286
Imputing row 501/768 with 1 missing, elapsed time: 0.291
Imputing row 601/768 with 1 missing, elapsed time: 0.297
Imputing row 701/768 with 0 missing, elapsed time: 0.302
Imputing row 1/768 with 1 missing, elapsed time: 0.263
Imputing row 101/768 with 2 missing, elapsed time: 0.269
Imputing row 201/768 with 2 missing, elapsed time: 0.274
Imputing row 301/768 with 3 missing, elapsed time: 0.278
Imputing row 401/768 with 2 missing, elapsed time: 0.282
Imputing row 501/768 with 1 missing, elapsed time: 0.286
Imputing row 601/768 with 1 missing, elapsed time: 0.290
Imputing row 701/768 with 0 missing, elapsed time: 0.294
Imputing row 1/768 with 1 missing, elapsed time: 0.272
Imputing row 101/768 with 2 missing, elapsed time: 0.278
Imputing row 201/768 with 2 missing, elapsed time: 0.283
Imputing row 301/768 with 3 missing, elapsed time: 0.287
Imputing row 401/768 with 2 missing, elapsed time: 0.290
Imputing row 501/768 with 1 missing, elapsed time: 0.297
Imputing row 601/768 with 1 missing, elapsed time: 0.302
Imputing row 701/768 with 0 missing, elapsed time: 0.311
Imputing row 1/768 with 1 missing, elapsed time: 0.269
Imputing row 101/768 with 2 missing, elapsed time: 0.273
Imputing row 201/768 with 2 missing, elapsed time: 0.277
Imputing row 301/768 with 3 missing, elapsed time: 0.282
Imputing row 401/768 with 2 missing, elapsed time: 0.286
Imputing row 501/768 with 1 missing, elapsed time: 0.290
Imputing row 601/768 with 1 missing, elapsed time: 0.294
Imputing row 701/768 with 0 missing, elapsed time: 0.301
Imputing row 1/768 with 1 missing, elapsed time: 0.259
Imputing row 101/768 with 2 missing, elapsed time: 0.264
Imputing row 201/768 with 2 missing, elapsed time: 0.269
Imputing row 301/768 with 3 missing, elapsed time: 0.272
Imputing row 401/768 with 2 missing, elapsed time: 0.275
Imputing row 501/768 with 1 missing, elapsed time: 0.278
Imputing row 601/768 with 1 missing, elapsed time: 0.283
Imputing row 701/768 with 0 missing, elapsed time: 0.289
Imputing row 1/768 with 1 missing, elapsed time: 0.259
Imputing row 101/768 with 2 missing, elapsed time: 0.265
Imputing row 201/768 with 2 missing, elapsed time: 0.269
Imputing row 301/768 with 3 missing, elapsed time: 0.273
Imputing row 401/768 with 2 missing, elapsed time: 0.276
Imputing row 501/768 with 1 missing, elapsed time: 0.280
Imputing row 601/768 with 1 missing, elapsed time: 0.284
Imputing row 701/768 with 0 missing, elapsed time: 0.290
Imputing row 1/768 with 1 missing, elapsed time: 0.260
Imputing row 101/768 with 2 missing, elapsed time: 0.266
Imputing row 201/768 with 2 missing, elapsed time: 0.270
Imputing row 301/768 with 3 missing, elapsed time: 0.274
Imputing row 401/768 with 2 missing, elapsed time: 0.279
Imputing row 501/768 with 1 missing, elapsed time: 0.284
Imputing row 601/768 with 1 missing, elapsed time: 0.289
Imputing row 701/768 with 0 missing, elapsed time: 0.293
Imputing row 1/768 with 1 missing, elapsed time: 0.261
Imputing row 101/768 with 2 missing, elapsed time: 0.269
Imputing row 201/768 with 2 missing, elapsed time: 0.272
Imputing row 301/768 with 3 missing, elapsed time: 0.277
Imputing row 401/768 with 2 missing, elapsed time: 0.282
Imputing row 501/768 with 1 missing, elapsed time: 0.287
Imputing row 601/768 with 1 missing, elapsed time: 0.298
Imputing row 701/768 with 0 missing, elapsed time: 0.305
Imputing row 1/768 with 1 missing, elapsed time: 0.261
Imputing row 101/768 with 2 missing, elapsed time: 0.266
Imputing row 201/768 with 2 missing, elapsed time: 0.272
Imputing row 301/768 with 3 missing, elapsed time: 0.276
Imputing row 401/768 with 2 missing, elapsed time: 0.280
Imputing row 501/768 with 1 missing, elapsed time: 0.284
Imputing row 601/768 with 1 missing, elapsed time: 0.290
Imputing row 701/768 with 0 missing, elapsed time: 0.295
Imputing row 1/768 with 1 missing, elapsed time: 0.254
Imputing row 101/768 with 2 missing, elapsed time: 0.258
Imputing row 201/768 with 2 missing, elapsed time: 0.264
Imputing row 301/768 with 3 missing, elapsed time: 0.269
Imputing row 401/768 with 2 missing, elapsed time: 0.273
Imputing row 501/768 with 1 missing, elapsed time: 0.277
Imputing row 601/768 with 1 missing, elapsed time: 0.282
Imputing row 701/768 with 0 missing, elapsed time: 0.286
Imputing row 1/768 with 1 missing, elapsed time: 0.253
Imputing row 101/768 with 2 missing, elapsed time: 0.258
Imputing row 201/768 with 2 missing, elapsed time: 0.265
Imputing row 301/768 with 3 missing, elapsed time: 0.268
Imputing row 401/768 with 2 missing, elapsed time: 0.272
Imputing row 501/768 with 1 missing, elapsed time: 0.277
Imputing row 601/768 with 1 missing, elapsed time: 0.282
Imputing row 701/768 with 0 missing, elapsed time: 0.287
Imputing row 1/768 with 1 missing, elapsed time: 0.249
Imputing row 101/768 with 2 missing, elapsed time: 0.254
Imputing row 201/768 with 2 missing, elapsed time: 0.259
Imputing row 301/768 with 3 missing, elapsed time: 0.264
Imputing row 401/768 with 2 missing, elapsed time: 0.268
Imputing row 501/768 with 1 missing, elapsed time: 0.273
Imputing row 601/768 with 1 missing, elapsed time: 0.277
Imputing row 701/768 with 0 missing, elapsed time: 0.283
Imputing row 1/768 with 1 missing, elapsed time: 0.265
Imputing row 101/768 with 2 missing, elapsed time: 0.270
Imputing row 201/768 with 2 missing, elapsed time: 0.274
Imputing row 301/768 with 3 missing, elapsed time: 0.278
Imputing row 401/768 with 2 missing, elapsed time: 0.282
Imputing row 501/768 with 1 missing, elapsed time: 0.286
Imputing row 601/768 with 1 missing, elapsed time: 0.290
Imputing row 701/768 with 0 missing, elapsed time: 0.294
Imputing row 1/768 with 1 missing, elapsed time: 0.267
Imputing row 101/768 with 2 missing, elapsed time: 0.272
Imputing row 201/768 with 2 missing, elapsed time: 0.278
Imputing row 301/768 with 3 missing, elapsed time: 0.284
Imputing row 401/768 with 2 missing, elapsed time: 0.288
Imputing row 501/768 with 1 missing, elapsed time: 0.292
Imputing row 601/768 with 1 missing, elapsed time: 0.299
Imputing row 701/768 with 0 missing, elapsed time: 0.304
Imputing row 1/768 with 1 missing, elapsed time: 0.268
Imputing row 101/768 with 2 missing, elapsed time: 0.273
Imputing row 201/768 with 2 missing, elapsed time: 0.277
Imputing row 301/768 with 3 missing, elapsed time: 0.282
Imputing row 401/768 with 2 missing, elapsed time: 0.286
Imputing row 501/768 with 1 missing, elapsed time: 0.290
Imputing row 601/768 with 1 missing, elapsed time: 0.296
Imputing row 701/768 with 0 missing, elapsed time: 0.301
Imputing row 1/768 with 1 missing, elapsed time: 0.257
Imputing row 101/768 with 2 missing, elapsed time: 0.264
Imputing row 201/768 with 2 missing, elapsed time: 0.269
Imputing row 301/768 with 3 missing, elapsed time: 0.273
Imputing row 401/768 with 2 missing, elapsed time: 0.278
Imputing row 501/768 with 1 missing, elapsed time: 0.283
Imputing row 601/768 with 1 missing, elapsed time: 0.289
Imputing row 701/768 with 0 missing, elapsed time: 0.295
Imputing row 1/768 with 1 missing, elapsed time: 0.262
Imputing row 101/768 with 2 missing, elapsed time: 0.267
Imputing row 201/768 with 2 missing, elapsed time: 0.272
Imputing row 301/768 with 3 missing, elapsed time: 0.276
Imputing row 401/768 with 2 missing, elapsed time: 0.279
Imputing row 501/768 with 1 missing, elapsed time: 0.284
Imputing row 601/768 with 1 missing, elapsed time: 0.288
Imputing row 701/768 with 0 missing, elapsed time: 0.292
Imputing row 1/768 with 1 missing, elapsed time: 0.264
Imputing row 101/768 with 2 missing, elapsed time: 0.269
Imputing row 201/768 with 2 missing, elapsed time: 0.273
Imputing row 301/768 with 3 missing, elapsed time: 0.278
Imputing row 401/768 with 2 missing, elapsed time: 0.281
Imputing row 501/768 with 1 missing, elapsed time: 0.286
Imputing row 601/768 with 1 missing, elapsed time: 0.290
Imputing row 701/768 with 0 missing, elapsed time: 0.295
Imputing row 1/768 with 1 missing, elapsed time: 0.257
Imputing row 101/768 with 2 missing, elapsed time: 0.264
Imputing row 201/768 with 2 missing, elapsed time: 0.269
Imputing row 301/768 with 3 missing, elapsed time: 0.274
Imputing row 401/768 with 2 missing, elapsed time: 0.281
Imputing row 501/768 with 1 missing, elapsed time: 0.285
Imputing row 601/768 with 1 missing, elapsed time: 0.289
Imputing row 701/768 with 0 missing, elapsed time: 0.294
Imputing row 1/768 with 1 missing, elapsed time: 0.246
Imputing row 101/768 with 2 missing, elapsed time: 0.252
Imputing row 201/768 with 2 missing, elapsed time: 0.256
Imputing row 301/768 with 3 missing, elapsed time: 0.261
Imputing row 401/768 with 2 missing, elapsed time: 0.264
Imputing row 501/768 with 1 missing, elapsed time: 0.270
Imputing row 601/768 with 1 missing, elapsed time: 0.275
Imputing row 701/768 with 0 missing, elapsed time: 0.280
Imputing row 1/768 with 1 missing, elapsed time: 0.256
Imputing row 101/768 with 2 missing, elapsed time: 0.260
Imputing row 201/768 with 2 missing, elapsed time: 0.265
Imputing row 301/768 with 3 missing, elapsed time: 0.271
Imputing row 401/768 with 2 missing, elapsed time: 0.275
Imputing row 501/768 with 1 missing, elapsed time: 0.280
Imputing row 601/768 with 1 missing, elapsed time: 0.284
Imputing row 701/768 with 0 missing, elapsed time: 0.288
In [26]:
plt.plot(results_knn)
Out[26]:
[<matplotlib.lines.Line2D at 0x1affc3ce5f8>]

Summarising the results:

  • Mean Imputation - 75.97%
  • Iterative Imputer - 77.27%
  • Soft Imputer - 77.27%
  • KNN Imputation - 80.52%

It is often claimed that mean imputation is just as good as the fancier methods such as KNN when used in conjunction with more complicated models. To test it, we build a simple neural network and train it with mean imputed data and compare results with KNN imputed data.

In [27]:
!pip install keras
Requirement already satisfied: keras in c:\programdata\anaconda3\lib\site-packages (2.2.2)
Requirement already satisfied: numpy>=1.9.1 in c:\programdata\anaconda3\lib\site-packages (from keras) (1.14.3)
Requirement already satisfied: scipy>=0.14 in c:\programdata\anaconda3\lib\site-packages (from keras) (1.1.0)
Requirement already satisfied: six>=1.9.0 in c:\programdata\anaconda3\lib\site-packages (from keras) (1.11.0)
Requirement already satisfied: pyyaml in c:\programdata\anaconda3\lib\site-packages (from keras) (3.12)
Requirement already satisfied: h5py in c:\programdata\anaconda3\lib\site-packages (from keras) (2.7.1)
Requirement already satisfied: keras_applications==1.0.4 in c:\programdata\anaconda3\lib\site-packages (from keras) (1.0.4)
Requirement already satisfied: keras_preprocessing==1.0.2 in c:\programdata\anaconda3\lib\site-packages (from keras) (1.0.2)
You are using pip version 10.0.1, however version 18.1 is available.
You should consider upgrading via the 'python -m pip install --upgrade pip' command.
In [28]:
from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(10,activation='relu', input_dim=8))

model.add(Dense(10,activation='relu'))

model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
model.fit(df_mean[input_columns], df[8], batch_size=32, epochs=400, validation_split=0.2)
Train on 614 samples, validate on 154 samples
Epoch 1/400
614/614 [==============================] - 1s 900us/step - loss: 0.7004 - acc: 0.3941 - val_loss: 0.6877 - val_acc: 0.6039
Epoch 2/400
614/614 [==============================] - 0s 95us/step - loss: 0.6769 - acc: 0.7101 - val_loss: 0.6661 - val_acc: 0.6494
Epoch 3/400
614/614 [==============================] - 0s 101us/step - loss: 0.6599 - acc: 0.6612 - val_loss: 0.6540 - val_acc: 0.6429
Epoch 4/400
614/614 [==============================] - 0s 62us/step - loss: 0.6512 - acc: 0.6531 - val_loss: 0.6494 - val_acc: 0.6429
Epoch 5/400
614/614 [==============================] - 0s 83us/step - loss: 0.6459 - acc: 0.6531 - val_loss: 0.6456 - val_acc: 0.6429
Epoch 6/400
614/614 [==============================] - 0s 79us/step - loss: 0.6415 - acc: 0.6531 - val_loss: 0.6418 - val_acc: 0.6429
Epoch 7/400
614/614 [==============================] - 0s 57us/step - loss: 0.6368 - acc: 0.6547 - val_loss: 0.6370 - val_acc: 0.6429
Epoch 8/400
614/614 [==============================] - 0s 122us/step - loss: 0.6322 - acc: 0.6564 - val_loss: 0.6321 - val_acc: 0.6429
Epoch 9/400
614/614 [==============================] - 0s 101us/step - loss: 0.6276 - acc: 0.6564 - val_loss: 0.6271 - val_acc: 0.6429
Epoch 10/400
614/614 [==============================] - 0s 64us/step - loss: 0.6234 - acc: 0.6564 - val_loss: 0.6211 - val_acc: 0.6429
Epoch 11/400
614/614 [==============================] - 0s 104us/step - loss: 0.6189 - acc: 0.6661 - val_loss: 0.6168 - val_acc: 0.6364
Epoch 12/400
614/614 [==============================] - 0s 54us/step - loss: 0.6145 - acc: 0.6743 - val_loss: 0.6117 - val_acc: 0.6364
Epoch 13/400
614/614 [==============================] - 0s 95us/step - loss: 0.6100 - acc: 0.6824 - val_loss: 0.6082 - val_acc: 0.6364
Epoch 14/400
614/614 [==============================] - 0s 64us/step - loss: 0.6065 - acc: 0.6775 - val_loss: 0.6041 - val_acc: 0.6623
Epoch 15/400
614/614 [==============================] - 0s 99us/step - loss: 0.6024 - acc: 0.6808 - val_loss: 0.5998 - val_acc: 0.6688
Epoch 16/400
614/614 [==============================] - 0s 65us/step - loss: 0.5985 - acc: 0.7052 - val_loss: 0.5964 - val_acc: 0.6623
Epoch 17/400
614/614 [==============================] - 0s 109us/step - loss: 0.5947 - acc: 0.6954 - val_loss: 0.5924 - val_acc: 0.6688
Epoch 18/400
614/614 [==============================] - 0s 73us/step - loss: 0.5914 - acc: 0.7068 - val_loss: 0.5893 - val_acc: 0.6688
Epoch 19/400
614/614 [==============================] - 0s 86us/step - loss: 0.5867 - acc: 0.7052 - val_loss: 0.5867 - val_acc: 0.6623
Epoch 20/400
614/614 [==============================] - 0s 52us/step - loss: 0.5824 - acc: 0.6987 - val_loss: 0.5799 - val_acc: 0.7013
Epoch 21/400
614/614 [==============================] - 0s 73us/step - loss: 0.5794 - acc: 0.7101 - val_loss: 0.5757 - val_acc: 0.7013
Epoch 22/400
614/614 [==============================] - 0s 55us/step - loss: 0.5756 - acc: 0.7020 - val_loss: 0.5717 - val_acc: 0.7143
Epoch 23/400
614/614 [==============================] - 0s 73us/step - loss: 0.5717 - acc: 0.7085 - val_loss: 0.5682 - val_acc: 0.6948
Epoch 24/400
614/614 [==============================] - 0s 81us/step - loss: 0.5690 - acc: 0.7134 - val_loss: 0.5639 - val_acc: 0.7338
Epoch 25/400
614/614 [==============================] - 0s 104us/step - loss: 0.5654 - acc: 0.7134 - val_loss: 0.5612 - val_acc: 0.7468
Epoch 26/400
614/614 [==============================] - 0s 141us/step - loss: 0.5634 - acc: 0.7215 - val_loss: 0.5567 - val_acc: 0.7338
Epoch 27/400
614/614 [==============================] - 0s 133us/step - loss: 0.5593 - acc: 0.7215 - val_loss: 0.5553 - val_acc: 0.7078
Epoch 28/400
614/614 [==============================] - 0s 119us/step - loss: 0.5554 - acc: 0.7231 - val_loss: 0.5530 - val_acc: 0.7078
Epoch 29/400
614/614 [==============================] - 0s 57us/step - loss: 0.5532 - acc: 0.7248 - val_loss: 0.5458 - val_acc: 0.7468
Epoch 30/400
614/614 [==============================] - 0s 125us/step - loss: 0.5496 - acc: 0.7296 - val_loss: 0.5441 - val_acc: 0.7338
Epoch 31/400
614/614 [==============================] - 0s 65us/step - loss: 0.5472 - acc: 0.7296 - val_loss: 0.5458 - val_acc: 0.7078
Epoch 32/400
614/614 [==============================] - 0s 127us/step - loss: 0.5446 - acc: 0.7313 - val_loss: 0.5372 - val_acc: 0.7532
Epoch 33/400
614/614 [==============================] - 0s 57us/step - loss: 0.5411 - acc: 0.7296 - val_loss: 0.5397 - val_acc: 0.7143
Epoch 34/400
614/614 [==============================] - 0s 78us/step - loss: 0.5390 - acc: 0.7313 - val_loss: 0.5316 - val_acc: 0.7468
Epoch 35/400
614/614 [==============================] - 0s 97us/step - loss: 0.5352 - acc: 0.7394 - val_loss: 0.5296 - val_acc: 0.7727
Epoch 36/400
614/614 [==============================] - 0s 73us/step - loss: 0.5346 - acc: 0.7394 - val_loss: 0.5299 - val_acc: 0.7338
Epoch 37/400
614/614 [==============================] - 0s 132us/step - loss: 0.5313 - acc: 0.7427 - val_loss: 0.5311 - val_acc: 0.7338
Epoch 38/400
614/614 [==============================] - 0s 143us/step - loss: 0.5307 - acc: 0.7476 - val_loss: 0.5254 - val_acc: 0.7403
Epoch 39/400
614/614 [==============================] - 0s 109us/step - loss: 0.5275 - acc: 0.7394 - val_loss: 0.5234 - val_acc: 0.7403
Epoch 40/400
614/614 [==============================] - 0s 81us/step - loss: 0.5255 - acc: 0.7410 - val_loss: 0.5183 - val_acc: 0.7727
Epoch 41/400
614/614 [==============================] - 0s 112us/step - loss: 0.5240 - acc: 0.7410 - val_loss: 0.5213 - val_acc: 0.7403
Epoch 42/400
614/614 [==============================] - 0s 68us/step - loss: 0.5211 - acc: 0.7476 - val_loss: 0.5145 - val_acc: 0.7662
Epoch 43/400
614/614 [==============================] - 0s 70us/step - loss: 0.5187 - acc: 0.7492 - val_loss: 0.5127 - val_acc: 0.7662
Epoch 44/400
614/614 [==============================] - 0s 99us/step - loss: 0.5179 - acc: 0.7459 - val_loss: 0.5111 - val_acc: 0.7597
Epoch 45/400
614/614 [==============================] - 0s 58us/step - loss: 0.5157 - acc: 0.7508 - val_loss: 0.5089 - val_acc: 0.7662
Epoch 46/400
614/614 [==============================] - 0s 55us/step - loss: 0.5148 - acc: 0.7524 - val_loss: 0.5100 - val_acc: 0.7532
Epoch 47/400
614/614 [==============================] - 0s 148us/step - loss: 0.5117 - acc: 0.7508 - val_loss: 0.5055 - val_acc: 0.7727
Epoch 48/400
614/614 [==============================] - 0s 145us/step - loss: 0.5103 - acc: 0.7492 - val_loss: 0.5045 - val_acc: 0.7597
Epoch 49/400
614/614 [==============================] - 0s 81us/step - loss: 0.5091 - acc: 0.7541 - val_loss: 0.5022 - val_acc: 0.7792
Epoch 50/400
614/614 [==============================] - 0s 62us/step - loss: 0.5078 - acc: 0.7524 - val_loss: 0.5008 - val_acc: 0.7792
Epoch 51/400
614/614 [==============================] - 0s 119us/step - loss: 0.5061 - acc: 0.7443 - val_loss: 0.5034 - val_acc: 0.7468
Epoch 52/400
614/614 [==============================] - 0s 91us/step - loss: 0.5051 - acc: 0.7541 - val_loss: 0.5053 - val_acc: 0.7338
Epoch 53/400
614/614 [==============================] - 0s 84us/step - loss: 0.5053 - acc: 0.7476 - val_loss: 0.4972 - val_acc: 0.7857
Epoch 54/400
614/614 [==============================] - 0s 89us/step - loss: 0.5015 - acc: 0.7492 - val_loss: 0.4971 - val_acc: 0.7857
Epoch 55/400
614/614 [==============================] - 0s 89us/step - loss: 0.5006 - acc: 0.7476 - val_loss: 0.5020 - val_acc: 0.7403
Epoch 56/400
614/614 [==============================] - 0s 125us/step - loss: 0.5008 - acc: 0.7524 - val_loss: 0.4970 - val_acc: 0.7403
Epoch 57/400
614/614 [==============================] - 0s 116us/step - loss: 0.4990 - acc: 0.7459 - val_loss: 0.4922 - val_acc: 0.7792
Epoch 58/400
614/614 [==============================] - 0s 141us/step - loss: 0.4990 - acc: 0.7541 - val_loss: 0.4911 - val_acc: 0.7792
Epoch 59/400
614/614 [==============================] - 0s 80us/step - loss: 0.4957 - acc: 0.7590 - val_loss: 0.4913 - val_acc: 0.7792
Epoch 60/400
614/614 [==============================] - 0s 75us/step - loss: 0.4965 - acc: 0.7557 - val_loss: 0.4898 - val_acc: 0.7662
Epoch 61/400
614/614 [==============================] - 0s 62us/step - loss: 0.4950 - acc: 0.7541 - val_loss: 0.4917 - val_acc: 0.7922
Epoch 62/400
614/614 [==============================] - 0s 71us/step - loss: 0.4950 - acc: 0.7492 - val_loss: 0.4886 - val_acc: 0.7792
Epoch 63/400
614/614 [==============================] - 0s 83us/step - loss: 0.4934 - acc: 0.7508 - val_loss: 0.4894 - val_acc: 0.7532
Epoch 64/400
614/614 [==============================] - 0s 113us/step - loss: 0.4932 - acc: 0.7622 - val_loss: 0.4880 - val_acc: 0.7532
Epoch 65/400
614/614 [==============================] - 0s 102us/step - loss: 0.4909 - acc: 0.7557 - val_loss: 0.4861 - val_acc: 0.7597
Epoch 66/400
614/614 [==============================] - 0s 102us/step - loss: 0.4908 - acc: 0.7541 - val_loss: 0.4855 - val_acc: 0.7597
Epoch 67/400
614/614 [==============================] - 0s 117us/step - loss: 0.4918 - acc: 0.7541 - val_loss: 0.4886 - val_acc: 0.7532
Epoch 68/400
614/614 [==============================] - 0s 122us/step - loss: 0.4888 - acc: 0.7638 - val_loss: 0.4830 - val_acc: 0.7792
Epoch 69/400
614/614 [==============================] - 0s 86us/step - loss: 0.4888 - acc: 0.7524 - val_loss: 0.4903 - val_acc: 0.7532
Epoch 70/400
614/614 [==============================] - 0s 88us/step - loss: 0.4887 - acc: 0.7573 - val_loss: 0.4848 - val_acc: 0.7532
Epoch 71/400
614/614 [==============================] - 0s 76us/step - loss: 0.4875 - acc: 0.7541 - val_loss: 0.4986 - val_acc: 0.7597
Epoch 72/400
614/614 [==============================] - 0s 123us/step - loss: 0.4895 - acc: 0.7606 - val_loss: 0.4882 - val_acc: 0.7532
Epoch 73/400
614/614 [==============================] - 0s 112us/step - loss: 0.4844 - acc: 0.7606 - val_loss: 0.4822 - val_acc: 0.7792
Epoch 74/400
614/614 [==============================] - 0s 89us/step - loss: 0.4844 - acc: 0.7524 - val_loss: 0.4828 - val_acc: 0.7597
Epoch 75/400
614/614 [==============================] - 0s 83us/step - loss: 0.4853 - acc: 0.7524 - val_loss: 0.4792 - val_acc: 0.7662
Epoch 76/400
614/614 [==============================] - 0s 110us/step - loss: 0.4845 - acc: 0.7655 - val_loss: 0.4792 - val_acc: 0.7597
Epoch 77/400
614/614 [==============================] - 0s 84us/step - loss: 0.4847 - acc: 0.7557 - val_loss: 0.4788 - val_acc: 0.7597
Epoch 78/400
614/614 [==============================] - 0s 84us/step - loss: 0.4831 - acc: 0.7622 - val_loss: 0.4780 - val_acc: 0.7662
Epoch 79/400
614/614 [==============================] - 0s 89us/step - loss: 0.4835 - acc: 0.7541 - val_loss: 0.4790 - val_acc: 0.7597
Epoch 80/400
614/614 [==============================] - 0s 123us/step - loss: 0.4801 - acc: 0.7687 - val_loss: 0.4817 - val_acc: 0.7532
Epoch 81/400
614/614 [==============================] - 0s 88us/step - loss: 0.4823 - acc: 0.7606 - val_loss: 0.4859 - val_acc: 0.7532
Epoch 82/400
614/614 [==============================] - 0s 86us/step - loss: 0.4812 - acc: 0.7590 - val_loss: 0.4780 - val_acc: 0.7597
Epoch 83/400
614/614 [==============================] - 0s 97us/step - loss: 0.4798 - acc: 0.7606 - val_loss: 0.4774 - val_acc: 0.7597
Epoch 84/400
614/614 [==============================] - 0s 120us/step - loss: 0.4806 - acc: 0.7622 - val_loss: 0.4804 - val_acc: 0.7597
Epoch 85/400
614/614 [==============================] - 0s 81us/step - loss: 0.4804 - acc: 0.7573 - val_loss: 0.4785 - val_acc: 0.7597
Epoch 86/400
614/614 [==============================] - 0s 81us/step - loss: 0.4803 - acc: 0.7655 - val_loss: 0.4840 - val_acc: 0.7468
Epoch 87/400
614/614 [==============================] - 0s 92us/step - loss: 0.4776 - acc: 0.7671 - val_loss: 0.4787 - val_acc: 0.7727
Epoch 88/400
614/614 [==============================] - 0s 88us/step - loss: 0.4806 - acc: 0.7590 - val_loss: 0.4760 - val_acc: 0.7597
Epoch 89/400
614/614 [==============================] - 0s 83us/step - loss: 0.4790 - acc: 0.7557 - val_loss: 0.4806 - val_acc: 0.7532
Epoch 90/400
614/614 [==============================] - 0s 83us/step - loss: 0.4783 - acc: 0.7606 - val_loss: 0.4765 - val_acc: 0.7662
Epoch 91/400
614/614 [==============================] - 0s 88us/step - loss: 0.4774 - acc: 0.7655 - val_loss: 0.4801 - val_acc: 0.7597
Epoch 92/400
614/614 [==============================] - 0s 70us/step - loss: 0.4768 - acc: 0.7638 - val_loss: 0.4800 - val_acc: 0.7597
Epoch 93/400
614/614 [==============================] - 0s 63us/step - loss: 0.4774 - acc: 0.7704 - val_loss: 0.4757 - val_acc: 0.7662
Epoch 94/400
614/614 [==============================] - 0s 84us/step - loss: 0.4766 - acc: 0.7655 - val_loss: 0.4765 - val_acc: 0.7662
Epoch 95/400
614/614 [==============================] - 0s 106us/step - loss: 0.4774 - acc: 0.7671 - val_loss: 0.4759 - val_acc: 0.7662
Epoch 96/400
614/614 [==============================] - 0s 60us/step - loss: 0.4765 - acc: 0.7557 - val_loss: 0.4778 - val_acc: 0.7597
Epoch 97/400
614/614 [==============================] - 0s 94us/step - loss: 0.4768 - acc: 0.7655 - val_loss: 0.4758 - val_acc: 0.7662
Epoch 98/400
614/614 [==============================] - 0s 80us/step - loss: 0.4752 - acc: 0.7638 - val_loss: 0.4729 - val_acc: 0.7597
Epoch 99/400
614/614 [==============================] - 0s 68us/step - loss: 0.4749 - acc: 0.7671 - val_loss: 0.4730 - val_acc: 0.7597
Epoch 100/400
614/614 [==============================] - 0s 74us/step - loss: 0.4747 - acc: 0.7622 - val_loss: 0.4734 - val_acc: 0.7662
Epoch 101/400
614/614 [==============================] - 0s 73us/step - loss: 0.4750 - acc: 0.7557 - val_loss: 0.4810 - val_acc: 0.7532
Epoch 102/400
614/614 [==============================] - 0s 70us/step - loss: 0.4750 - acc: 0.7671 - val_loss: 0.4741 - val_acc: 0.7662
Epoch 103/400
614/614 [==============================] - 0s 78us/step - loss: 0.4746 - acc: 0.7769 - val_loss: 0.4779 - val_acc: 0.7662
Epoch 104/400
614/614 [==============================] - 0s 88us/step - loss: 0.4748 - acc: 0.7606 - val_loss: 0.4720 - val_acc: 0.7662
Epoch 105/400
614/614 [==============================] - 0s 67us/step - loss: 0.4726 - acc: 0.7720 - val_loss: 0.4731 - val_acc: 0.7597
Epoch 106/400
614/614 [==============================] - 0s 73us/step - loss: 0.4739 - acc: 0.7671 - val_loss: 0.4731 - val_acc: 0.7662
Epoch 107/400
614/614 [==============================] - 0s 70us/step - loss: 0.4735 - acc: 0.7622 - val_loss: 0.4778 - val_acc: 0.7662
Epoch 108/400
614/614 [==============================] - 0s 85us/step - loss: 0.4736 - acc: 0.7655 - val_loss: 0.4724 - val_acc: 0.7662
Epoch 109/400
614/614 [==============================] - 0s 93us/step - loss: 0.4742 - acc: 0.7736 - val_loss: 0.4737 - val_acc: 0.7662
Epoch 110/400
614/614 [==============================] - 0s 81us/step - loss: 0.4710 - acc: 0.7655 - val_loss: 0.4789 - val_acc: 0.7662
Epoch 111/400
614/614 [==============================] - 0s 75us/step - loss: 0.4708 - acc: 0.7687 - val_loss: 0.4711 - val_acc: 0.7597
Epoch 112/400
614/614 [==============================] - 0s 70us/step - loss: 0.4727 - acc: 0.7638 - val_loss: 0.4708 - val_acc: 0.7597
Epoch 113/400
614/614 [==============================] - 0s 75us/step - loss: 0.4690 - acc: 0.7736 - val_loss: 0.4770 - val_acc: 0.7727
Epoch 114/400
614/614 [==============================] - 0s 66us/step - loss: 0.4710 - acc: 0.7655 - val_loss: 0.4713 - val_acc: 0.7597
Epoch 115/400
614/614 [==============================] - 0s 74us/step - loss: 0.4714 - acc: 0.7687 - val_loss: 0.4713 - val_acc: 0.7597
Epoch 116/400
614/614 [==============================] - 0s 68us/step - loss: 0.4711 - acc: 0.7671 - val_loss: 0.4755 - val_acc: 0.7662
Epoch 117/400
614/614 [==============================] - 0s 81us/step - loss: 0.4695 - acc: 0.7622 - val_loss: 0.4783 - val_acc: 0.7727
Epoch 118/400
614/614 [==============================] - 0s 67us/step - loss: 0.4693 - acc: 0.7671 - val_loss: 0.4708 - val_acc: 0.7597
Epoch 119/400
614/614 [==============================] - 0s 68us/step - loss: 0.4705 - acc: 0.7655 - val_loss: 0.4689 - val_acc: 0.7662
Epoch 120/400
614/614 [==============================] - 0s 78us/step - loss: 0.4690 - acc: 0.7638 - val_loss: 0.4702 - val_acc: 0.7532
Epoch 121/400
614/614 [==============================] - 0s 67us/step - loss: 0.4695 - acc: 0.7720 - val_loss: 0.4707 - val_acc: 0.7662
Epoch 122/400
614/614 [==============================] - 0s 101us/step - loss: 0.4711 - acc: 0.7638 - val_loss: 0.4696 - val_acc: 0.7597
Epoch 123/400
614/614 [==============================] - 0s 88us/step - loss: 0.4695 - acc: 0.7638 - val_loss: 0.4737 - val_acc: 0.7727
Epoch 124/400
614/614 [==============================] - 0s 125us/step - loss: 0.4693 - acc: 0.7638 - val_loss: 0.4874 - val_acc: 0.7468
Epoch 125/400
614/614 [==============================] - 0s 119us/step - loss: 0.4692 - acc: 0.7834 - val_loss: 0.4707 - val_acc: 0.7597
Epoch 126/400
614/614 [==============================] - 0s 218us/step - loss: 0.4674 - acc: 0.7704 - val_loss: 0.4863 - val_acc: 0.7532
Epoch 127/400
614/614 [==============================] - 0s 102us/step - loss: 0.4697 - acc: 0.7769 - val_loss: 0.4693 - val_acc: 0.7597
Epoch 128/400
614/614 [==============================] - 0s 89us/step - loss: 0.4672 - acc: 0.7655 - val_loss: 0.4696 - val_acc: 0.7597
Epoch 129/400
614/614 [==============================] - 0s 120us/step - loss: 0.4670 - acc: 0.7736 - val_loss: 0.4895 - val_acc: 0.7532
Epoch 130/400
614/614 [==============================] - 0s 159us/step - loss: 0.4679 - acc: 0.7785 - val_loss: 0.4708 - val_acc: 0.7532
Epoch 131/400
614/614 [==============================] - 0s 132us/step - loss: 0.4689 - acc: 0.7720 - val_loss: 0.4743 - val_acc: 0.7727
Epoch 132/400
614/614 [==============================] - 0s 117us/step - loss: 0.4674 - acc: 0.7671 - val_loss: 0.4746 - val_acc: 0.7727
Epoch 133/400
614/614 [==============================] - 0s 136us/step - loss: 0.4685 - acc: 0.7687 - val_loss: 0.4727 - val_acc: 0.7727
Epoch 134/400
614/614 [==============================] - 0s 140us/step - loss: 0.4671 - acc: 0.7687 - val_loss: 0.4715 - val_acc: 0.7727
Epoch 135/400
614/614 [==============================] - 0s 106us/step - loss: 0.4675 - acc: 0.7622 - val_loss: 0.4690 - val_acc: 0.7597
Epoch 136/400
614/614 [==============================] - 0s 146us/step - loss: 0.4681 - acc: 0.7720 - val_loss: 0.4691 - val_acc: 0.7597
Epoch 137/400
614/614 [==============================] - 0s 127us/step - loss: 0.4661 - acc: 0.7801 - val_loss: 0.4752 - val_acc: 0.7727
Epoch 138/400
614/614 [==============================] - 0s 99us/step - loss: 0.4657 - acc: 0.7720 - val_loss: 0.4686 - val_acc: 0.7597
Epoch 139/400
614/614 [==============================] - 0s 91us/step - loss: 0.4658 - acc: 0.7655 - val_loss: 0.4766 - val_acc: 0.7662
Epoch 140/400
614/614 [==============================] - 0s 84us/step - loss: 0.4675 - acc: 0.7671 - val_loss: 0.4718 - val_acc: 0.7727
Epoch 141/400
614/614 [==============================] - 0s 75us/step - loss: 0.4661 - acc: 0.7687 - val_loss: 0.4725 - val_acc: 0.7727
Epoch 142/400
614/614 [==============================] - 0s 97us/step - loss: 0.4657 - acc: 0.7671 - val_loss: 0.4692 - val_acc: 0.7662
Epoch 143/400
614/614 [==============================] - 0s 73us/step - loss: 0.4654 - acc: 0.7720 - val_loss: 0.4820 - val_acc: 0.7597
Epoch 144/400
614/614 [==============================] - 0s 89us/step - loss: 0.4667 - acc: 0.7720 - val_loss: 0.4708 - val_acc: 0.7532
Epoch 145/400
614/614 [==============================] - 0s 83us/step - loss: 0.4650 - acc: 0.7655 - val_loss: 0.4694 - val_acc: 0.7662
Epoch 146/400
614/614 [==============================] - 0s 128us/step - loss: 0.4642 - acc: 0.7720 - val_loss: 0.4713 - val_acc: 0.7727
Epoch 147/400
614/614 [==============================] - 0s 78us/step - loss: 0.4668 - acc: 0.7655 - val_loss: 0.4705 - val_acc: 0.7662
Epoch 148/400
614/614 [==============================] - 0s 123us/step - loss: 0.4651 - acc: 0.7704 - val_loss: 0.4715 - val_acc: 0.7727
Epoch 149/400
614/614 [==============================] - 0s 161us/step - loss: 0.4657 - acc: 0.7736 - val_loss: 0.4788 - val_acc: 0.7597
Epoch 150/400
614/614 [==============================] - 0s 97us/step - loss: 0.4649 - acc: 0.7671 - val_loss: 0.4794 - val_acc: 0.7597
Epoch 151/400
614/614 [==============================] - 0s 164us/step - loss: 0.4627 - acc: 0.7785 - val_loss: 0.4705 - val_acc: 0.7597
Epoch 152/400
614/614 [==============================] - 0s 136us/step - loss: 0.4665 - acc: 0.7736 - val_loss: 0.4691 - val_acc: 0.7532
Epoch 153/400
614/614 [==============================] - 0s 174us/step - loss: 0.4656 - acc: 0.7785 - val_loss: 0.4701 - val_acc: 0.7532
Epoch 154/400
614/614 [==============================] - 0s 97us/step - loss: 0.4639 - acc: 0.7736 - val_loss: 0.4705 - val_acc: 0.7532
Epoch 155/400
614/614 [==============================] - 0s 65us/step - loss: 0.4643 - acc: 0.7736 - val_loss: 0.4697 - val_acc: 0.7468
Epoch 156/400
614/614 [==============================] - 0s 81us/step - loss: 0.4647 - acc: 0.7736 - val_loss: 0.4699 - val_acc: 0.7468
Epoch 157/400
614/614 [==============================] - 0s 86us/step - loss: 0.4637 - acc: 0.7769 - val_loss: 0.4695 - val_acc: 0.7468
Epoch 158/400
614/614 [==============================] - 0s 49us/step - loss: 0.4620 - acc: 0.7736 - val_loss: 0.4700 - val_acc: 0.7662
Epoch 159/400
614/614 [==============================] - 0s 71us/step - loss: 0.4627 - acc: 0.7720 - val_loss: 0.4686 - val_acc: 0.7597
Epoch 160/400
614/614 [==============================] - 0s 60us/step - loss: 0.4637 - acc: 0.7704 - val_loss: 0.4720 - val_acc: 0.7662
Epoch 161/400
614/614 [==============================] - 0s 104us/step - loss: 0.4641 - acc: 0.7704 - val_loss: 0.4724 - val_acc: 0.7662
Epoch 162/400
614/614 [==============================] - 0s 110us/step - loss: 0.4605 - acc: 0.7736 - val_loss: 0.4761 - val_acc: 0.7727
Epoch 163/400
614/614 [==============================] - 0s 63us/step - loss: 0.4648 - acc: 0.7687 - val_loss: 0.4723 - val_acc: 0.7532
Epoch 164/400
614/614 [==============================] - 0s 89us/step - loss: 0.4621 - acc: 0.7704 - val_loss: 0.4831 - val_acc: 0.7597
Epoch 165/400
614/614 [==============================] - 0s 58us/step - loss: 0.4648 - acc: 0.7801 - val_loss: 0.4705 - val_acc: 0.7662
Epoch 166/400
614/614 [==============================] - 0s 70us/step - loss: 0.4625 - acc: 0.7704 - val_loss: 0.4713 - val_acc: 0.7597
Epoch 167/400
614/614 [==============================] - ETA: 0s - loss: 0.4148 - acc: 0.812 - 0s 83us/step - loss: 0.4639 - acc: 0.7671 - val_loss: 0.4727 - val_acc: 0.7597
Epoch 168/400
614/614 [==============================] - 0s 94us/step - loss: 0.4620 - acc: 0.7736 - val_loss: 0.4822 - val_acc: 0.7662
Epoch 169/400
614/614 [==============================] - 0s 80us/step - loss: 0.4628 - acc: 0.7752 - val_loss: 0.4688 - val_acc: 0.7597
Epoch 170/400
614/614 [==============================] - 0s 71us/step - loss: 0.4612 - acc: 0.7752 - val_loss: 0.4768 - val_acc: 0.7987
Epoch 171/400
614/614 [==============================] - 0s 76us/step - loss: 0.4640 - acc: 0.7752 - val_loss: 0.4708 - val_acc: 0.7662
Epoch 172/400
614/614 [==============================] - 0s 91us/step - loss: 0.4651 - acc: 0.7687 - val_loss: 0.4698 - val_acc: 0.7532
Epoch 173/400
614/614 [==============================] - 0s 80us/step - loss: 0.4627 - acc: 0.7704 - val_loss: 0.4697 - val_acc: 0.7597
Epoch 174/400
614/614 [==============================] - 0s 63us/step - loss: 0.4622 - acc: 0.7720 - val_loss: 0.4749 - val_acc: 0.7662
Epoch 175/400
614/614 [==============================] - 0s 63us/step - loss: 0.4618 - acc: 0.7704 - val_loss: 0.4714 - val_acc: 0.7662
Epoch 176/400
614/614 [==============================] - 0s 65us/step - loss: 0.4618 - acc: 0.7720 - val_loss: 0.4728 - val_acc: 0.7597
Epoch 177/400
614/614 [==============================] - 0s 67us/step - loss: 0.4614 - acc: 0.7704 - val_loss: 0.4698 - val_acc: 0.7597
Epoch 178/400
614/614 [==============================] - 0s 76us/step - loss: 0.4602 - acc: 0.7818 - val_loss: 0.4869 - val_acc: 0.7662
Epoch 179/400
614/614 [==============================] - 0s 73us/step - loss: 0.4626 - acc: 0.7736 - val_loss: 0.4708 - val_acc: 0.7597
Epoch 180/400
614/614 [==============================] - 0s 63us/step - loss: 0.4619 - acc: 0.7704 - val_loss: 0.4693 - val_acc: 0.7597
Epoch 181/400
614/614 [==============================] - 0s 68us/step - loss: 0.4598 - acc: 0.7736 - val_loss: 0.4730 - val_acc: 0.7597
Epoch 182/400
614/614 [==============================] - 0s 88us/step - loss: 0.4612 - acc: 0.7752 - val_loss: 0.4695 - val_acc: 0.7532
Epoch 183/400
614/614 [==============================] - 0s 54us/step - loss: 0.4602 - acc: 0.7785 - val_loss: 0.4709 - val_acc: 0.7597
Epoch 184/400
614/614 [==============================] - 0s 115us/step - loss: 0.4622 - acc: 0.7720 - val_loss: 0.4699 - val_acc: 0.7597
Epoch 185/400
614/614 [==============================] - 0s 93us/step - loss: 0.4613 - acc: 0.7736 - val_loss: 0.4723 - val_acc: 0.7597
Epoch 186/400
614/614 [==============================] - 0s 63us/step - loss: 0.4606 - acc: 0.7801 - val_loss: 0.4722 - val_acc: 0.7597
Epoch 187/400
614/614 [==============================] - 0s 73us/step - loss: 0.4596 - acc: 0.7769 - val_loss: 0.4731 - val_acc: 0.7662
Epoch 188/400
614/614 [==============================] - 0s 68us/step - loss: 0.4613 - acc: 0.7785 - val_loss: 0.4685 - val_acc: 0.7662
Epoch 189/400
614/614 [==============================] - 0s 96us/step - loss: 0.4591 - acc: 0.7769 - val_loss: 0.4695 - val_acc: 0.7597
Epoch 190/400
614/614 [==============================] - 0s 67us/step - loss: 0.4604 - acc: 0.7801 - val_loss: 0.4722 - val_acc: 0.7597
Epoch 191/400
614/614 [==============================] - 0s 65us/step - loss: 0.4582 - acc: 0.7850 - val_loss: 0.4694 - val_acc: 0.7597
Epoch 192/400
614/614 [==============================] - 0s 60us/step - loss: 0.4620 - acc: 0.7655 - val_loss: 0.4703 - val_acc: 0.7532
Epoch 193/400
614/614 [==============================] - 0s 65us/step - loss: 0.4545 - acc: 0.7818 - val_loss: 0.4907 - val_acc: 0.7597
Epoch 194/400
614/614 [==============================] - 0s 89us/step - loss: 0.4624 - acc: 0.7769 - val_loss: 0.4745 - val_acc: 0.7662
Epoch 195/400
614/614 [==============================] - 0s 68us/step - loss: 0.4598 - acc: 0.7671 - val_loss: 0.4686 - val_acc: 0.7662
Epoch 196/400
614/614 [==============================] - 0s 70us/step - loss: 0.4592 - acc: 0.7769 - val_loss: 0.4736 - val_acc: 0.7662
Epoch 197/400
614/614 [==============================] - 0s 70us/step - loss: 0.4605 - acc: 0.7704 - val_loss: 0.4826 - val_acc: 0.7597
Epoch 198/400
614/614 [==============================] - 0s 81us/step - loss: 0.4619 - acc: 0.7704 - val_loss: 0.4691 - val_acc: 0.7597
Epoch 199/400
614/614 [==============================] - 0s 83us/step - loss: 0.4615 - acc: 0.7736 - val_loss: 0.4717 - val_acc: 0.7532
Epoch 200/400
614/614 [==============================] - 0s 73us/step - loss: 0.4606 - acc: 0.7752 - val_loss: 0.4742 - val_acc: 0.7597
Epoch 201/400
614/614 [==============================] - 0s 78us/step - loss: 0.4605 - acc: 0.7720 - val_loss: 0.4753 - val_acc: 0.7662
Epoch 202/400
614/614 [==============================] - 0s 89us/step - loss: 0.4589 - acc: 0.7736 - val_loss: 0.4711 - val_acc: 0.7532
Epoch 203/400
614/614 [==============================] - 0s 89us/step - loss: 0.4593 - acc: 0.7834 - val_loss: 0.4681 - val_acc: 0.7662
Epoch 204/400
614/614 [==============================] - 0s 84us/step - loss: 0.4595 - acc: 0.7752 - val_loss: 0.4757 - val_acc: 0.7987
Epoch 205/400
614/614 [==============================] - 0s 78us/step - loss: 0.4609 - acc: 0.7720 - val_loss: 0.4694 - val_acc: 0.7662
Epoch 206/400
614/614 [==============================] - 0s 78us/step - loss: 0.4595 - acc: 0.7736 - val_loss: 0.4710 - val_acc: 0.7532
Epoch 207/400
614/614 [==============================] - 0s 76us/step - loss: 0.4589 - acc: 0.7769 - val_loss: 0.4703 - val_acc: 0.7532
Epoch 208/400
614/614 [==============================] - 0s 73us/step - loss: 0.4576 - acc: 0.7736 - val_loss: 0.4784 - val_acc: 0.7662
Epoch 209/400
614/614 [==============================] - 0s 71us/step - loss: 0.4604 - acc: 0.7720 - val_loss: 0.4699 - val_acc: 0.7532
Epoch 210/400
614/614 [==============================] - 0s 91us/step - loss: 0.4602 - acc: 0.7769 - val_loss: 0.4701 - val_acc: 0.7597
Epoch 211/400
614/614 [==============================] - 0s 78us/step - loss: 0.4589 - acc: 0.7752 - val_loss: 0.4713 - val_acc: 0.7532
Epoch 212/400
614/614 [==============================] - 0s 76us/step - loss: 0.4591 - acc: 0.7769 - val_loss: 0.4686 - val_acc: 0.7662
Epoch 213/400
614/614 [==============================] - 0s 99us/step - loss: 0.4591 - acc: 0.7687 - val_loss: 0.4709 - val_acc: 0.7662
Epoch 214/400
614/614 [==============================] - 0s 91us/step - loss: 0.4592 - acc: 0.7720 - val_loss: 0.4686 - val_acc: 0.7532
Epoch 215/400
614/614 [==============================] - 0s 68us/step - loss: 0.4600 - acc: 0.7736 - val_loss: 0.4691 - val_acc: 0.7532
Epoch 216/400
614/614 [==============================] - 0s 83us/step - loss: 0.4588 - acc: 0.7720 - val_loss: 0.4692 - val_acc: 0.7727
Epoch 217/400
614/614 [==============================] - 0s 75us/step - loss: 0.4593 - acc: 0.7769 - val_loss: 0.4783 - val_acc: 0.7597
Epoch 218/400
614/614 [==============================] - 0s 88us/step - loss: 0.4605 - acc: 0.7752 - val_loss: 0.4691 - val_acc: 0.7532
Epoch 219/400
614/614 [==============================] - 0s 110us/step - loss: 0.4576 - acc: 0.7736 - val_loss: 0.4682 - val_acc: 0.7662
Epoch 220/400
614/614 [==============================] - 0s 78us/step - loss: 0.4596 - acc: 0.7752 - val_loss: 0.4723 - val_acc: 0.7532
Epoch 221/400
614/614 [==============================] - 0s 81us/step - loss: 0.4562 - acc: 0.7883 - val_loss: 0.4710 - val_acc: 0.7662
Epoch 222/400
614/614 [==============================] - 0s 76us/step - loss: 0.4595 - acc: 0.7785 - val_loss: 0.4697 - val_acc: 0.7532
Epoch 223/400
614/614 [==============================] - 0s 67us/step - loss: 0.4579 - acc: 0.7752 - val_loss: 0.4709 - val_acc: 0.7532
Epoch 224/400
614/614 [==============================] - 0s 80us/step - loss: 0.4592 - acc: 0.7720 - val_loss: 0.4698 - val_acc: 0.7597
Epoch 225/400
614/614 [==============================] - 0s 86us/step - loss: 0.4582 - acc: 0.7785 - val_loss: 0.4715 - val_acc: 0.7532
Epoch 226/400
614/614 [==============================] - 0s 78us/step - loss: 0.4588 - acc: 0.7801 - val_loss: 0.4714 - val_acc: 0.7532
Epoch 227/400
614/614 [==============================] - 0s 89us/step - loss: 0.4572 - acc: 0.7752 - val_loss: 0.4776 - val_acc: 0.7987
Epoch 228/400
614/614 [==============================] - 0s 68us/step - loss: 0.4592 - acc: 0.7785 - val_loss: 0.4760 - val_acc: 0.7662
Epoch 229/400
614/614 [==============================] - 0s 88us/step - loss: 0.4584 - acc: 0.7736 - val_loss: 0.4677 - val_acc: 0.7662
Epoch 230/400
614/614 [==============================] - 0s 76us/step - loss: 0.4572 - acc: 0.7704 - val_loss: 0.4685 - val_acc: 0.7532
Epoch 231/400
614/614 [==============================] - 0s 67us/step - loss: 0.4575 - acc: 0.7736 - val_loss: 0.4686 - val_acc: 0.7532
Epoch 232/400
614/614 [==============================] - 0s 68us/step - loss: 0.4585 - acc: 0.7769 - val_loss: 0.4718 - val_acc: 0.7532
Epoch 233/400
614/614 [==============================] - 0s 75us/step - loss: 0.4575 - acc: 0.7752 - val_loss: 0.4751 - val_acc: 0.7597
Epoch 234/400
614/614 [==============================] - 0s 76us/step - loss: 0.4578 - acc: 0.7720 - val_loss: 0.4799 - val_acc: 0.7597
Epoch 235/400
614/614 [==============================] - 0s 58us/step - loss: 0.4560 - acc: 0.7704 - val_loss: 0.4887 - val_acc: 0.7662
Epoch 236/400
614/614 [==============================] - 0s 67us/step - loss: 0.4600 - acc: 0.7769 - val_loss: 0.4682 - val_acc: 0.7597
Epoch 237/400
614/614 [==============================] - 0s 68us/step - loss: 0.4564 - acc: 0.7655 - val_loss: 0.4698 - val_acc: 0.7532
Epoch 238/400
614/614 [==============================] - 0s 70us/step - loss: 0.4566 - acc: 0.7752 - val_loss: 0.4749 - val_acc: 0.7597
Epoch 239/400
614/614 [==============================] - 0s 70us/step - loss: 0.4574 - acc: 0.7769 - val_loss: 0.4672 - val_acc: 0.7662
Epoch 240/400
614/614 [==============================] - 0s 62us/step - loss: 0.4561 - acc: 0.7785 - val_loss: 0.4693 - val_acc: 0.7727
Epoch 241/400
614/614 [==============================] - 0s 58us/step - loss: 0.4582 - acc: 0.7655 - val_loss: 0.4675 - val_acc: 0.7597
Epoch 242/400
614/614 [==============================] - 0s 70us/step - loss: 0.4572 - acc: 0.7720 - val_loss: 0.4746 - val_acc: 0.7532
Epoch 243/400
614/614 [==============================] - 0s 63us/step - loss: 0.4566 - acc: 0.7769 - val_loss: 0.4701 - val_acc: 0.7662
Epoch 244/400
614/614 [==============================] - 0s 86us/step - loss: 0.4549 - acc: 0.7785 - val_loss: 0.4792 - val_acc: 0.7597
Epoch 245/400
614/614 [==============================] - 0s 89us/step - loss: 0.4595 - acc: 0.7720 - val_loss: 0.4702 - val_acc: 0.7727
Epoch 246/400
614/614 [==============================] - 0s 76us/step - loss: 0.4576 - acc: 0.7769 - val_loss: 0.4716 - val_acc: 0.7792
Epoch 247/400
614/614 [==============================] - 0s 89us/step - loss: 0.4571 - acc: 0.7769 - val_loss: 0.4673 - val_acc: 0.7597
Epoch 248/400
614/614 [==============================] - 0s 81us/step - loss: 0.4569 - acc: 0.7752 - val_loss: 0.4686 - val_acc: 0.7532
Epoch 249/400
614/614 [==============================] - 0s 62us/step - loss: 0.4586 - acc: 0.7785 - val_loss: 0.4683 - val_acc: 0.7532
Epoch 250/400
614/614 [==============================] - 0s 96us/step - loss: 0.4559 - acc: 0.7769 - val_loss: 0.4681 - val_acc: 0.7532
Epoch 251/400
614/614 [==============================] - 0s 112us/step - loss: 0.4563 - acc: 0.7769 - val_loss: 0.4669 - val_acc: 0.7727
Epoch 252/400
614/614 [==============================] - 0s 71us/step - loss: 0.4563 - acc: 0.7752 - val_loss: 0.4671 - val_acc: 0.7662
Epoch 253/400
614/614 [==============================] - 0s 70us/step - loss: 0.4547 - acc: 0.7785 - val_loss: 0.4757 - val_acc: 0.7922
Epoch 254/400
614/614 [==============================] - 0s 67us/step - loss: 0.4576 - acc: 0.7752 - val_loss: 0.4712 - val_acc: 0.7792
Epoch 255/400
614/614 [==============================] - 0s 81us/step - loss: 0.4566 - acc: 0.7736 - val_loss: 0.4692 - val_acc: 0.7532
Epoch 256/400
614/614 [==============================] - 0s 80us/step - loss: 0.4560 - acc: 0.7818 - val_loss: 0.4702 - val_acc: 0.7468
Epoch 257/400
614/614 [==============================] - 0s 88us/step - loss: 0.4554 - acc: 0.7801 - val_loss: 0.4687 - val_acc: 0.7597
Epoch 258/400
614/614 [==============================] - 0s 80us/step - loss: 0.4561 - acc: 0.7801 - val_loss: 0.4679 - val_acc: 0.7662
Epoch 259/400
614/614 [==============================] - 0s 94us/step - loss: 0.4547 - acc: 0.7752 - val_loss: 0.4711 - val_acc: 0.7727
Epoch 260/400
614/614 [==============================] - 0s 75us/step - loss: 0.4586 - acc: 0.7671 - val_loss: 0.4715 - val_acc: 0.7727
Epoch 261/400
614/614 [==============================] - 0s 76us/step - loss: 0.4550 - acc: 0.7769 - val_loss: 0.4701 - val_acc: 0.7468
Epoch 262/400
614/614 [==============================] - 0s 76us/step - loss: 0.4547 - acc: 0.7736 - val_loss: 0.4671 - val_acc: 0.7597
Epoch 263/400
614/614 [==============================] - 0s 57us/step - loss: 0.4558 - acc: 0.7769 - val_loss: 0.4674 - val_acc: 0.7597
Epoch 264/400
614/614 [==============================] - 0s 70us/step - loss: 0.4556 - acc: 0.7785 - val_loss: 0.4759 - val_acc: 0.7532
Epoch 265/400
614/614 [==============================] - 0s 60us/step - loss: 0.4540 - acc: 0.7818 - val_loss: 0.4697 - val_acc: 0.7662
Epoch 266/400
614/614 [==============================] - 0s 67us/step - loss: 0.4546 - acc: 0.7883 - val_loss: 0.4697 - val_acc: 0.7532
Epoch 267/400
614/614 [==============================] - 0s 67us/step - loss: 0.4551 - acc: 0.7834 - val_loss: 0.4692 - val_acc: 0.7662
Epoch 268/400
614/614 [==============================] - 0s 65us/step - loss: 0.4556 - acc: 0.7801 - val_loss: 0.4695 - val_acc: 0.7532
Epoch 269/400
614/614 [==============================] - 0s 75us/step - loss: 0.4553 - acc: 0.7736 - val_loss: 0.4690 - val_acc: 0.7532
Epoch 270/400
614/614 [==============================] - 0s 70us/step - loss: 0.4545 - acc: 0.7818 - val_loss: 0.4723 - val_acc: 0.7532
Epoch 271/400
614/614 [==============================] - 0s 86us/step - loss: 0.4580 - acc: 0.7818 - val_loss: 0.4726 - val_acc: 0.7532
Epoch 272/400
614/614 [==============================] - 0s 68us/step - loss: 0.4534 - acc: 0.7785 - val_loss: 0.4742 - val_acc: 0.7532
Epoch 273/400
614/614 [==============================] - 0s 63us/step - loss: 0.4550 - acc: 0.7801 - val_loss: 0.4744 - val_acc: 0.7922
Epoch 274/400
614/614 [==============================] - 0s 67us/step - loss: 0.4577 - acc: 0.7720 - val_loss: 0.4698 - val_acc: 0.7532
Epoch 275/400
614/614 [==============================] - 0s 73us/step - loss: 0.4570 - acc: 0.7818 - val_loss: 0.4699 - val_acc: 0.7662
Epoch 276/400
614/614 [==============================] - 0s 76us/step - loss: 0.4554 - acc: 0.7769 - val_loss: 0.4703 - val_acc: 0.7532
Epoch 277/400
614/614 [==============================] - 0s 67us/step - loss: 0.4569 - acc: 0.7769 - val_loss: 0.4723 - val_acc: 0.7532
Epoch 278/400
614/614 [==============================] - 0s 62us/step - loss: 0.4550 - acc: 0.7801 - val_loss: 0.4689 - val_acc: 0.7532
Epoch 279/400
614/614 [==============================] - 0s 50us/step - loss: 0.4528 - acc: 0.7785 - val_loss: 0.4681 - val_acc: 0.7403
Epoch 280/400
614/614 [==============================] - 0s 63us/step - loss: 0.4559 - acc: 0.7769 - val_loss: 0.4705 - val_acc: 0.7792
Epoch 281/400
614/614 [==============================] - 0s 58us/step - loss: 0.4560 - acc: 0.7785 - val_loss: 0.4687 - val_acc: 0.7532
Epoch 282/400
614/614 [==============================] - 0s 52us/step - loss: 0.4548 - acc: 0.7850 - val_loss: 0.4685 - val_acc: 0.7468
Epoch 283/400
614/614 [==============================] - 0s 55us/step - loss: 0.4556 - acc: 0.7769 - val_loss: 0.4686 - val_acc: 0.7468
Epoch 284/400
614/614 [==============================] - 0s 60us/step - loss: 0.4537 - acc: 0.7818 - val_loss: 0.4682 - val_acc: 0.7468
Epoch 285/400
614/614 [==============================] - 0s 55us/step - loss: 0.4521 - acc: 0.7818 - val_loss: 0.4796 - val_acc: 0.7468
Epoch 286/400
614/614 [==============================] - 0s 65us/step - loss: 0.4543 - acc: 0.7752 - val_loss: 0.4682 - val_acc: 0.7532
Epoch 287/400
614/614 [==============================] - 0s 58us/step - loss: 0.4533 - acc: 0.7801 - val_loss: 0.4861 - val_acc: 0.7662
Epoch 288/400
614/614 [==============================] - 0s 50us/step - loss: 0.4585 - acc: 0.7850 - val_loss: 0.4709 - val_acc: 0.7532
Epoch 289/400
614/614 [==============================] - 0s 55us/step - loss: 0.4542 - acc: 0.7801 - val_loss: 0.4747 - val_acc: 0.7922
Epoch 290/400
614/614 [==============================] - 0s 52us/step - loss: 0.4559 - acc: 0.7834 - val_loss: 0.4712 - val_acc: 0.7532
Epoch 291/400
614/614 [==============================] - 0s 45us/step - loss: 0.4555 - acc: 0.7769 - val_loss: 0.4722 - val_acc: 0.7532
Epoch 292/400
614/614 [==============================] - 0s 54us/step - loss: 0.4529 - acc: 0.7769 - val_loss: 0.4789 - val_acc: 0.7468
Epoch 293/400
614/614 [==============================] - 0s 68us/step - loss: 0.4524 - acc: 0.7818 - val_loss: 0.4778 - val_acc: 0.7532
Epoch 294/400
614/614 [==============================] - 0s 60us/step - loss: 0.4555 - acc: 0.7818 - val_loss: 0.4711 - val_acc: 0.7532
Epoch 295/400
614/614 [==============================] - 0s 65us/step - loss: 0.4543 - acc: 0.7850 - val_loss: 0.4719 - val_acc: 0.7792
Epoch 296/400
614/614 [==============================] - 0s 65us/step - loss: 0.4530 - acc: 0.7769 - val_loss: 0.4721 - val_acc: 0.7532
Epoch 297/400
614/614 [==============================] - 0s 63us/step - loss: 0.4549 - acc: 0.7834 - val_loss: 0.4745 - val_acc: 0.7532
Epoch 298/400
614/614 [==============================] - 0s 55us/step - loss: 0.4518 - acc: 0.7785 - val_loss: 0.4731 - val_acc: 0.7597
Epoch 299/400
614/614 [==============================] - 0s 52us/step - loss: 0.4559 - acc: 0.7834 - val_loss: 0.4717 - val_acc: 0.7532
Epoch 300/400
614/614 [==============================] - 0s 55us/step - loss: 0.4525 - acc: 0.7866 - val_loss: 0.4724 - val_acc: 0.7532
Epoch 301/400
614/614 [==============================] - 0s 57us/step - loss: 0.4534 - acc: 0.7834 - val_loss: 0.4781 - val_acc: 0.7468
Epoch 302/400
614/614 [==============================] - 0s 75us/step - loss: 0.4550 - acc: 0.7834 - val_loss: 0.4759 - val_acc: 0.7532
Epoch 303/400
614/614 [==============================] - 0s 55us/step - loss: 0.4548 - acc: 0.7818 - val_loss: 0.4698 - val_acc: 0.7468
Epoch 304/400
614/614 [==============================] - 0s 84us/step - loss: 0.4546 - acc: 0.7801 - val_loss: 0.4701 - val_acc: 0.7468
Epoch 305/400
614/614 [==============================] - 0s 65us/step - loss: 0.4524 - acc: 0.7769 - val_loss: 0.4749 - val_acc: 0.7857
Epoch 306/400
614/614 [==============================] - 0s 70us/step - loss: 0.4535 - acc: 0.7752 - val_loss: 0.4692 - val_acc: 0.7468
Epoch 307/400
614/614 [==============================] - 0s 80us/step - loss: 0.4543 - acc: 0.7834 - val_loss: 0.4681 - val_acc: 0.7468
Epoch 308/400
614/614 [==============================] - 0s 84us/step - loss: 0.4541 - acc: 0.7834 - val_loss: 0.4679 - val_acc: 0.7532
Epoch 309/400
614/614 [==============================] - 0s 55us/step - loss: 0.4546 - acc: 0.7866 - val_loss: 0.4676 - val_acc: 0.7468
Epoch 310/400
614/614 [==============================] - 0s 68us/step - loss: 0.4518 - acc: 0.7818 - val_loss: 0.4688 - val_acc: 0.7468
Epoch 311/400
614/614 [==============================] - 0s 76us/step - loss: 0.4526 - acc: 0.7850 - val_loss: 0.4731 - val_acc: 0.7597
Epoch 312/400
614/614 [==============================] - 0s 73us/step - loss: 0.4510 - acc: 0.7850 - val_loss: 0.4805 - val_acc: 0.7727
Epoch 313/400
614/614 [==============================] - 0s 57us/step - loss: 0.4552 - acc: 0.7834 - val_loss: 0.4708 - val_acc: 0.7403
Epoch 314/400
614/614 [==============================] - 0s 68us/step - loss: 0.4538 - acc: 0.7850 - val_loss: 0.4692 - val_acc: 0.7468
Epoch 315/400
614/614 [==============================] - 0s 60us/step - loss: 0.4508 - acc: 0.7818 - val_loss: 0.4843 - val_acc: 0.7597
Epoch 316/400
614/614 [==============================] - 0s 60us/step - loss: 0.4537 - acc: 0.7866 - val_loss: 0.4712 - val_acc: 0.7532
Epoch 317/400
614/614 [==============================] - 0s 84us/step - loss: 0.4519 - acc: 0.7769 - val_loss: 0.4705 - val_acc: 0.7597
Epoch 318/400
614/614 [==============================] - 0s 73us/step - loss: 0.4553 - acc: 0.7818 - val_loss: 0.4738 - val_acc: 0.7597
Epoch 319/400
614/614 [==============================] - 0s 80us/step - loss: 0.4533 - acc: 0.7866 - val_loss: 0.4720 - val_acc: 0.7532
Epoch 320/400
614/614 [==============================] - 0s 93us/step - loss: 0.4540 - acc: 0.7834 - val_loss: 0.4714 - val_acc: 0.7468
Epoch 321/400
614/614 [==============================] - 0s 67us/step - loss: 0.4539 - acc: 0.7818 - val_loss: 0.4691 - val_acc: 0.7468
Epoch 322/400
614/614 [==============================] - 0s 62us/step - loss: 0.4532 - acc: 0.7818 - val_loss: 0.4791 - val_acc: 0.7468
Epoch 323/400
614/614 [==============================] - 0s 57us/step - loss: 0.4530 - acc: 0.7834 - val_loss: 0.4707 - val_acc: 0.7792
Epoch 324/400
614/614 [==============================] - 0s 52us/step - loss: 0.4500 - acc: 0.7915 - val_loss: 0.4760 - val_acc: 0.7987
Epoch 325/400
614/614 [==============================] - 0s 62us/step - loss: 0.4520 - acc: 0.7834 - val_loss: 0.4767 - val_acc: 0.7597
Epoch 326/400
614/614 [==============================] - 0s 57us/step - loss: 0.4528 - acc: 0.7850 - val_loss: 0.4725 - val_acc: 0.7468
Epoch 327/400
614/614 [==============================] - 0s 47us/step - loss: 0.4484 - acc: 0.7850 - val_loss: 0.4887 - val_acc: 0.7662
Epoch 328/400
614/614 [==============================] - 0s 54us/step - loss: 0.4539 - acc: 0.7866 - val_loss: 0.4699 - val_acc: 0.7468
Epoch 329/400
614/614 [==============================] - 0s 71us/step - loss: 0.4534 - acc: 0.7915 - val_loss: 0.4729 - val_acc: 0.7532
Epoch 330/400
614/614 [==============================] - 0s 58us/step - loss: 0.4527 - acc: 0.7801 - val_loss: 0.4766 - val_acc: 0.7857
Epoch 331/400
614/614 [==============================] - 0s 65us/step - loss: 0.4520 - acc: 0.7834 - val_loss: 0.4794 - val_acc: 0.7468
Epoch 332/400
614/614 [==============================] - 0s 60us/step - loss: 0.4543 - acc: 0.7752 - val_loss: 0.4733 - val_acc: 0.7662
Epoch 333/400
614/614 [==============================] - 0s 62us/step - loss: 0.4497 - acc: 0.7834 - val_loss: 0.4794 - val_acc: 0.7532
Epoch 334/400
614/614 [==============================] - 0s 65us/step - loss: 0.4512 - acc: 0.7818 - val_loss: 0.4696 - val_acc: 0.7468
Epoch 335/400
614/614 [==============================] - 0s 52us/step - loss: 0.4547 - acc: 0.7834 - val_loss: 0.4693 - val_acc: 0.7468
Epoch 336/400
614/614 [==============================] - 0s 42us/step - loss: 0.4516 - acc: 0.7785 - val_loss: 0.4763 - val_acc: 0.7532
Epoch 337/400
614/614 [==============================] - 0s 62us/step - loss: 0.4532 - acc: 0.7818 - val_loss: 0.4878 - val_acc: 0.7662
Epoch 338/400
614/614 [==============================] - 0s 60us/step - loss: 0.4555 - acc: 0.7899 - val_loss: 0.4730 - val_acc: 0.7468
Epoch 339/400
614/614 [==============================] - 0s 52us/step - loss: 0.4540 - acc: 0.7769 - val_loss: 0.4718 - val_acc: 0.7468
Epoch 340/400
614/614 [==============================] - 0s 57us/step - loss: 0.4528 - acc: 0.7866 - val_loss: 0.4720 - val_acc: 0.7532
Epoch 341/400
614/614 [==============================] - 0s 60us/step - loss: 0.4494 - acc: 0.7769 - val_loss: 0.4749 - val_acc: 0.7597
Epoch 342/400
614/614 [==============================] - 0s 65us/step - loss: 0.4502 - acc: 0.7818 - val_loss: 0.4741 - val_acc: 0.7857
Epoch 343/400
614/614 [==============================] - 0s 54us/step - loss: 0.4497 - acc: 0.7866 - val_loss: 0.4964 - val_acc: 0.7727
Epoch 344/400
614/614 [==============================] - 0s 55us/step - loss: 0.4553 - acc: 0.7883 - val_loss: 0.4690 - val_acc: 0.7468
Epoch 345/400
614/614 [==============================] - 0s 47us/step - loss: 0.4503 - acc: 0.7883 - val_loss: 0.4684 - val_acc: 0.7468
Epoch 346/400
614/614 [==============================] - 0s 54us/step - loss: 0.4476 - acc: 0.7899 - val_loss: 0.4751 - val_acc: 0.7597
Epoch 347/400
614/614 [==============================] - 0s 58us/step - loss: 0.4502 - acc: 0.7866 - val_loss: 0.4692 - val_acc: 0.7468
Epoch 348/400
614/614 [==============================] - 0s 52us/step - loss: 0.4503 - acc: 0.7883 - val_loss: 0.4687 - val_acc: 0.7468
Epoch 349/400
614/614 [==============================] - 0s 52us/step - loss: 0.4505 - acc: 0.7834 - val_loss: 0.4785 - val_acc: 0.7532
Epoch 350/400
614/614 [==============================] - 0s 50us/step - loss: 0.4503 - acc: 0.7834 - val_loss: 0.4750 - val_acc: 0.7662
Epoch 351/400
614/614 [==============================] - 0s 44us/step - loss: 0.4522 - acc: 0.7785 - val_loss: 0.4727 - val_acc: 0.7403
Epoch 352/400
614/614 [==============================] - 0s 44us/step - loss: 0.4533 - acc: 0.7883 - val_loss: 0.4752 - val_acc: 0.7532
Epoch 353/400
614/614 [==============================] - 0s 60us/step - loss: 0.4492 - acc: 0.7899 - val_loss: 0.4744 - val_acc: 0.7597
Epoch 354/400
614/614 [==============================] - 0s 62us/step - loss: 0.4461 - acc: 0.7834 - val_loss: 0.4729 - val_acc: 0.7468
Epoch 355/400
614/614 [==============================] - 0s 45us/step - loss: 0.4507 - acc: 0.7834 - val_loss: 0.4771 - val_acc: 0.7597
Epoch 356/400
614/614 [==============================] - 0s 39us/step - loss: 0.4516 - acc: 0.7769 - val_loss: 0.4947 - val_acc: 0.7727
Epoch 357/400
614/614 [==============================] - 0s 52us/step - loss: 0.4522 - acc: 0.7850 - val_loss: 0.4714 - val_acc: 0.7468
Epoch 358/400
614/614 [==============================] - 0s 54us/step - loss: 0.4506 - acc: 0.7883 - val_loss: 0.4806 - val_acc: 0.7597
Epoch 359/400
614/614 [==============================] - 0s 58us/step - loss: 0.4511 - acc: 0.7818 - val_loss: 0.4750 - val_acc: 0.7597
Epoch 360/400
614/614 [==============================] - 0s 57us/step - loss: 0.4488 - acc: 0.7850 - val_loss: 0.4758 - val_acc: 0.7597
Epoch 361/400
614/614 [==============================] - 0s 49us/step - loss: 0.4521 - acc: 0.7752 - val_loss: 0.4712 - val_acc: 0.7468
Epoch 362/400
614/614 [==============================] - 0s 45us/step - loss: 0.4488 - acc: 0.7801 - val_loss: 0.4692 - val_acc: 0.7532
Epoch 363/400
614/614 [==============================] - 0s 52us/step - loss: 0.4502 - acc: 0.7834 - val_loss: 0.4714 - val_acc: 0.7468
Epoch 364/400
614/614 [==============================] - 0s 52us/step - loss: 0.4507 - acc: 0.7866 - val_loss: 0.4716 - val_acc: 0.7597
Epoch 365/400
614/614 [==============================] - 0s 63us/step - loss: 0.4507 - acc: 0.7834 - val_loss: 0.4774 - val_acc: 0.7597
Epoch 366/400
614/614 [==============================] - 0s 63us/step - loss: 0.4493 - acc: 0.7834 - val_loss: 0.4731 - val_acc: 0.7468
Epoch 367/400
614/614 [==============================] - 0s 54us/step - loss: 0.4472 - acc: 0.7883 - val_loss: 0.4734 - val_acc: 0.7468
Epoch 368/400
614/614 [==============================] - 0s 63us/step - loss: 0.4498 - acc: 0.7899 - val_loss: 0.4769 - val_acc: 0.7922
Epoch 369/400
614/614 [==============================] - 0s 71us/step - loss: 0.4495 - acc: 0.7850 - val_loss: 0.4713 - val_acc: 0.7468
Epoch 370/400
614/614 [==============================] - 0s 62us/step - loss: 0.4492 - acc: 0.7834 - val_loss: 0.4819 - val_acc: 0.7597
Epoch 371/400
614/614 [==============================] - 0s 52us/step - loss: 0.4475 - acc: 0.7964 - val_loss: 0.4706 - val_acc: 0.7468
Epoch 372/400
614/614 [==============================] - 0s 52us/step - loss: 0.4502 - acc: 0.7785 - val_loss: 0.4707 - val_acc: 0.7468
Epoch 373/400
614/614 [==============================] - 0s 67us/step - loss: 0.4502 - acc: 0.7801 - val_loss: 0.4694 - val_acc: 0.7468
Epoch 374/400
614/614 [==============================] - 0s 55us/step - loss: 0.4503 - acc: 0.7801 - val_loss: 0.4769 - val_acc: 0.7532
Epoch 375/400
614/614 [==============================] - 0s 84us/step - loss: 0.4486 - acc: 0.7915 - val_loss: 0.4722 - val_acc: 0.7468
Epoch 376/400
614/614 [==============================] - 0s 63us/step - loss: 0.4499 - acc: 0.7850 - val_loss: 0.4748 - val_acc: 0.7597
Epoch 377/400
614/614 [==============================] - 0s 47us/step - loss: 0.4498 - acc: 0.7883 - val_loss: 0.4762 - val_acc: 0.7857
Epoch 378/400
614/614 [==============================] - 0s 52us/step - loss: 0.4495 - acc: 0.7801 - val_loss: 0.4713 - val_acc: 0.7468
Epoch 379/400
614/614 [==============================] - 0s 55us/step - loss: 0.4477 - acc: 0.7850 - val_loss: 0.4723 - val_acc: 0.7468
Epoch 380/400
614/614 [==============================] - 0s 62us/step - loss: 0.4483 - acc: 0.7883 - val_loss: 0.4812 - val_acc: 0.7597
Epoch 381/400
614/614 [==============================] - 0s 67us/step - loss: 0.4511 - acc: 0.7964 - val_loss: 0.4754 - val_acc: 0.7597
Epoch 382/400
614/614 [==============================] - 0s 54us/step - loss: 0.4491 - acc: 0.7834 - val_loss: 0.4747 - val_acc: 0.7468
Epoch 383/400
614/614 [==============================] - 0s 45us/step - loss: 0.4497 - acc: 0.7866 - val_loss: 0.4747 - val_acc: 0.7403
Epoch 384/400
614/614 [==============================] - 0s 62us/step - loss: 0.4492 - acc: 0.7866 - val_loss: 0.4737 - val_acc: 0.7662
Epoch 385/400
614/614 [==============================] - 0s 57us/step - loss: 0.4477 - acc: 0.7801 - val_loss: 0.4731 - val_acc: 0.7468
Epoch 386/400
614/614 [==============================] - 0s 73us/step - loss: 0.4474 - acc: 0.7866 - val_loss: 0.4731 - val_acc: 0.7532
Epoch 387/400
614/614 [==============================] - 0s 67us/step - loss: 0.4505 - acc: 0.7883 - val_loss: 0.4725 - val_acc: 0.7468
Epoch 388/400
614/614 [==============================] - 0s 63us/step - loss: 0.4480 - acc: 0.7850 - val_loss: 0.4723 - val_acc: 0.7468
Epoch 389/400
614/614 [==============================] - 0s 47us/step - loss: 0.4505 - acc: 0.7818 - val_loss: 0.4710 - val_acc: 0.7468
Epoch 390/400
614/614 [==============================] - 0s 49us/step - loss: 0.4496 - acc: 0.7932 - val_loss: 0.4723 - val_acc: 0.7662
Epoch 391/400
614/614 [==============================] - 0s 58us/step - loss: 0.4470 - acc: 0.7883 - val_loss: 0.4721 - val_acc: 0.7532
Epoch 392/400
614/614 [==============================] - 0s 54us/step - loss: 0.4489 - acc: 0.7850 - val_loss: 0.4793 - val_acc: 0.7662
Epoch 393/400
614/614 [==============================] - 0s 45us/step - loss: 0.4490 - acc: 0.7948 - val_loss: 0.4751 - val_acc: 0.7468
Epoch 394/400
614/614 [==============================] - 0s 47us/step - loss: 0.4484 - acc: 0.7899 - val_loss: 0.4733 - val_acc: 0.7468
Epoch 395/400
614/614 [==============================] - 0s 63us/step - loss: 0.4484 - acc: 0.7964 - val_loss: 0.4740 - val_acc: 0.7532
Epoch 396/400
614/614 [==============================] - 0s 52us/step - loss: 0.4471 - acc: 0.7899 - val_loss: 0.4717 - val_acc: 0.7532
Epoch 397/400
614/614 [==============================] - 0s 60us/step - loss: 0.4488 - acc: 0.7883 - val_loss: 0.4746 - val_acc: 0.7468
Epoch 398/400
614/614 [==============================] - 0s 52us/step - loss: 0.4460 - acc: 0.7850 - val_loss: 0.4725 - val_acc: 0.7468
Epoch 399/400
614/614 [==============================] - 0s 45us/step - loss: 0.4459 - acc: 0.7866 - val_loss: 0.4745 - val_acc: 0.7597
Epoch 400/400
614/614 [==============================] - 0s 49us/step - loss: 0.4473 - acc: 0.7818 - val_loss: 0.4706 - val_acc: 0.7468
Out[28]:
<keras.callbacks.History at 0x1affc3dacc0>
In [29]:
df_knn=pd.DataFrame(data=fancyimpute.KNN(k=8).fit_transform(df.values), columns=df.columns, index=df.index)
model = Sequential()
model.add(Dense(10,activation='relu', input_dim=8))

model.add(Dense(10,activation='relu'))

model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
model.fit(df_knn[input_columns], df[8], batch_size=32, epochs=400, validation_split=0.2)
Imputing row 1/768 with 1 missing, elapsed time: 0.252
Imputing row 101/768 with 2 missing, elapsed time: 0.256
Imputing row 201/768 with 2 missing, elapsed time: 0.259
Imputing row 301/768 with 3 missing, elapsed time: 0.261
Imputing row 401/768 with 2 missing, elapsed time: 0.264
Imputing row 501/768 with 1 missing, elapsed time: 0.267
Imputing row 601/768 with 1 missing, elapsed time: 0.275
Imputing row 701/768 with 0 missing, elapsed time: 0.283
Train on 614 samples, validate on 154 samples
Epoch 1/400
614/614 [==============================] - 0s 650us/step - loss: 0.7430 - acc: 0.3469 - val_loss: 0.7083 - val_acc: 0.3636
Epoch 2/400
614/614 [==============================] - 0s 45us/step - loss: 0.6932 - acc: 0.4739 - val_loss: 0.6788 - val_acc: 0.6623
Epoch 3/400
614/614 [==============================] - 0s 73us/step - loss: 0.6685 - acc: 0.6710 - val_loss: 0.6619 - val_acc: 0.6688
Epoch 4/400
614/614 [==============================] - 0s 52us/step - loss: 0.6541 - acc: 0.6596 - val_loss: 0.6524 - val_acc: 0.6494
Epoch 5/400
614/614 [==============================] - 0s 60us/step - loss: 0.6453 - acc: 0.6580 - val_loss: 0.6458 - val_acc: 0.6429
Epoch 6/400
614/614 [==============================] - 0s 57us/step - loss: 0.6390 - acc: 0.6580 - val_loss: 0.6399 - val_acc: 0.6429
Epoch 7/400
614/614 [==============================] - 0s 65us/step - loss: 0.6332 - acc: 0.6564 - val_loss: 0.6343 - val_acc: 0.6429
Epoch 8/400
614/614 [==============================] - 0s 52us/step - loss: 0.6284 - acc: 0.6580 - val_loss: 0.6295 - val_acc: 0.6494
Epoch 9/400
614/614 [==============================] - 0s 67us/step - loss: 0.6247 - acc: 0.6547 - val_loss: 0.6252 - val_acc: 0.6494
Epoch 10/400
614/614 [==============================] - 0s 63us/step - loss: 0.6204 - acc: 0.6564 - val_loss: 0.6198 - val_acc: 0.6494
Epoch 11/400
614/614 [==============================] - 0s 67us/step - loss: 0.6164 - acc: 0.6612 - val_loss: 0.6158 - val_acc: 0.6494
Epoch 12/400
614/614 [==============================] - 0s 50us/step - loss: 0.6118 - acc: 0.6596 - val_loss: 0.6100 - val_acc: 0.6623
Epoch 13/400
614/614 [==============================] - 0s 70us/step - loss: 0.6073 - acc: 0.6645 - val_loss: 0.6044 - val_acc: 0.6623
Epoch 14/400
614/614 [==============================] - 0s 60us/step - loss: 0.6028 - acc: 0.6694 - val_loss: 0.6006 - val_acc: 0.6688
Epoch 15/400
614/614 [==============================] - 0s 50us/step - loss: 0.5985 - acc: 0.6661 - val_loss: 0.5963 - val_acc: 0.6688
Epoch 16/400
614/614 [==============================] - 0s 62us/step - loss: 0.5945 - acc: 0.6629 - val_loss: 0.5893 - val_acc: 0.6753
Epoch 17/400
614/614 [==============================] - 0s 67us/step - loss: 0.5887 - acc: 0.6710 - val_loss: 0.5832 - val_acc: 0.6948
Epoch 18/400
614/614 [==============================] - 0s 47us/step - loss: 0.5837 - acc: 0.6759 - val_loss: 0.5776 - val_acc: 0.7013
Epoch 19/400
614/614 [==============================] - 0s 65us/step - loss: 0.5789 - acc: 0.6840 - val_loss: 0.5729 - val_acc: 0.7013
Epoch 20/400
614/614 [==============================] - 0s 60us/step - loss: 0.5745 - acc: 0.6824 - val_loss: 0.5676 - val_acc: 0.7143
Epoch 21/400
614/614 [==============================] - 0s 47us/step - loss: 0.5696 - acc: 0.6857 - val_loss: 0.5619 - val_acc: 0.7338
Epoch 22/400
614/614 [==============================] - 0s 67us/step - loss: 0.5648 - acc: 0.6922 - val_loss: 0.5574 - val_acc: 0.7338
Epoch 23/400
614/614 [==============================] - 0s 65us/step - loss: 0.5605 - acc: 0.7068 - val_loss: 0.5525 - val_acc: 0.7403
Epoch 24/400
614/614 [==============================] - 0s 57us/step - loss: 0.5558 - acc: 0.7166 - val_loss: 0.5488 - val_acc: 0.7273
Epoch 25/400
614/614 [==============================] - 0s 60us/step - loss: 0.5517 - acc: 0.7101 - val_loss: 0.5438 - val_acc: 0.7403
Epoch 26/400
614/614 [==============================] - 0s 50us/step - loss: 0.5481 - acc: 0.7329 - val_loss: 0.5402 - val_acc: 0.7273
Epoch 27/400
614/614 [==============================] - 0s 54us/step - loss: 0.5455 - acc: 0.7231 - val_loss: 0.5356 - val_acc: 0.7468
Epoch 28/400
614/614 [==============================] - 0s 60us/step - loss: 0.5416 - acc: 0.7345 - val_loss: 0.5326 - val_acc: 0.7532
Epoch 29/400
614/614 [==============================] - 0s 54us/step - loss: 0.5384 - acc: 0.7378 - val_loss: 0.5291 - val_acc: 0.7468
Epoch 30/400
614/614 [==============================] - 0s 71us/step - loss: 0.5349 - acc: 0.7427 - val_loss: 0.5254 - val_acc: 0.7532
Epoch 31/400
614/614 [==============================] - 0s 45us/step - loss: 0.5316 - acc: 0.7394 - val_loss: 0.5222 - val_acc: 0.7468
Epoch 32/400
614/614 [==============================] - 0s 60us/step - loss: 0.5288 - acc: 0.7476 - val_loss: 0.5188 - val_acc: 0.7468
Epoch 33/400
614/614 [==============================] - 0s 44us/step - loss: 0.5248 - acc: 0.7459 - val_loss: 0.5162 - val_acc: 0.7403
Epoch 34/400
614/614 [==============================] - 0s 60us/step - loss: 0.5220 - acc: 0.7427 - val_loss: 0.5126 - val_acc: 0.7597
Epoch 35/400
614/614 [==============================] - 0s 60us/step - loss: 0.5177 - acc: 0.7459 - val_loss: 0.5099 - val_acc: 0.7532
Epoch 36/400
614/614 [==============================] - 0s 44us/step - loss: 0.5156 - acc: 0.7476 - val_loss: 0.5061 - val_acc: 0.7792
Epoch 37/400
614/614 [==============================] - 0s 62us/step - loss: 0.5119 - acc: 0.7508 - val_loss: 0.5042 - val_acc: 0.7857
Epoch 38/400
614/614 [==============================] - 0s 49us/step - loss: 0.5098 - acc: 0.7541 - val_loss: 0.5026 - val_acc: 0.7922
Epoch 39/400
614/614 [==============================] - 0s 57us/step - loss: 0.5072 - acc: 0.7573 - val_loss: 0.4995 - val_acc: 0.7857
Epoch 40/400
614/614 [==============================] - 0s 50us/step - loss: 0.5043 - acc: 0.7524 - val_loss: 0.4968 - val_acc: 0.7857
Epoch 41/400
614/614 [==============================] - 0s 62us/step - loss: 0.5012 - acc: 0.7687 - val_loss: 0.4948 - val_acc: 0.7922
Epoch 42/400
614/614 [==============================] - 0s 49us/step - loss: 0.4993 - acc: 0.7573 - val_loss: 0.4928 - val_acc: 0.7857
Epoch 43/400
614/614 [==============================] - 0s 63us/step - loss: 0.4957 - acc: 0.7671 - val_loss: 0.4925 - val_acc: 0.7468
Epoch 44/400
614/614 [==============================] - 0s 49us/step - loss: 0.4948 - acc: 0.7557 - val_loss: 0.4878 - val_acc: 0.7987
Epoch 45/400
614/614 [==============================] - 0s 63us/step - loss: 0.4921 - acc: 0.7573 - val_loss: 0.4861 - val_acc: 0.7857
Epoch 46/400
614/614 [==============================] - 0s 47us/step - loss: 0.4909 - acc: 0.7606 - val_loss: 0.4847 - val_acc: 0.8052
Epoch 47/400
614/614 [==============================] - 0s 62us/step - loss: 0.4887 - acc: 0.7655 - val_loss: 0.4840 - val_acc: 0.7857
Epoch 48/400
614/614 [==============================] - 0s 47us/step - loss: 0.4868 - acc: 0.7573 - val_loss: 0.4821 - val_acc: 0.7857
Epoch 49/400
614/614 [==============================] - 0s 57us/step - loss: 0.4855 - acc: 0.7638 - val_loss: 0.4794 - val_acc: 0.7792
Epoch 50/400
614/614 [==============================] - 0s 42us/step - loss: 0.4834 - acc: 0.7638 - val_loss: 0.4787 - val_acc: 0.7857
Epoch 51/400
614/614 [==============================] - 0s 62us/step - loss: 0.4818 - acc: 0.7638 - val_loss: 0.4760 - val_acc: 0.7857
Epoch 52/400
614/614 [==============================] - 0s 47us/step - loss: 0.4800 - acc: 0.7769 - val_loss: 0.4749 - val_acc: 0.7987
Epoch 53/400
614/614 [==============================] - 0s 67us/step - loss: 0.4782 - acc: 0.7704 - val_loss: 0.4742 - val_acc: 0.7987
Epoch 54/400
614/614 [==============================] - 0s 42us/step - loss: 0.4768 - acc: 0.7720 - val_loss: 0.4722 - val_acc: 0.7792
Epoch 55/400
614/614 [==============================] - 0s 50us/step - loss: 0.4758 - acc: 0.7671 - val_loss: 0.4706 - val_acc: 0.7857
Epoch 56/400
614/614 [==============================] - 0s 57us/step - loss: 0.4747 - acc: 0.7687 - val_loss: 0.4701 - val_acc: 0.7857
Epoch 57/400
614/614 [==============================] - 0s 58us/step - loss: 0.4736 - acc: 0.7736 - val_loss: 0.4688 - val_acc: 0.7922
Epoch 58/400
614/614 [==============================] - 0s 72us/step - loss: 0.4727 - acc: 0.7704 - val_loss: 0.4723 - val_acc: 0.7792
Epoch 59/400
614/614 [==============================] - 0s 58us/step - loss: 0.4705 - acc: 0.7769 - val_loss: 0.4683 - val_acc: 0.7857
Epoch 60/400
614/614 [==============================] - 0s 54us/step - loss: 0.4691 - acc: 0.7752 - val_loss: 0.4730 - val_acc: 0.7662
Epoch 61/400
614/614 [==============================] - 0s 70us/step - loss: 0.4696 - acc: 0.7785 - val_loss: 0.4665 - val_acc: 0.7857
Epoch 62/400
614/614 [==============================] - 0s 62us/step - loss: 0.4691 - acc: 0.7720 - val_loss: 0.4647 - val_acc: 0.7922
Epoch 63/400
614/614 [==============================] - 0s 78us/step - loss: 0.4669 - acc: 0.7752 - val_loss: 0.4671 - val_acc: 0.7792
Epoch 64/400
614/614 [==============================] - 0s 60us/step - loss: 0.4667 - acc: 0.7769 - val_loss: 0.4632 - val_acc: 0.7987
Epoch 65/400
614/614 [==============================] - 0s 65us/step - loss: 0.4657 - acc: 0.7720 - val_loss: 0.4626 - val_acc: 0.8052
Epoch 66/400
614/614 [==============================] - 0s 76us/step - loss: 0.4644 - acc: 0.7752 - val_loss: 0.4618 - val_acc: 0.7987
Epoch 67/400
614/614 [==============================] - 0s 93us/step - loss: 0.4632 - acc: 0.7769 - val_loss: 0.4607 - val_acc: 0.7987
Epoch 68/400
614/614 [==============================] - 0s 78us/step - loss: 0.4615 - acc: 0.7785 - val_loss: 0.4689 - val_acc: 0.7727
Epoch 69/400
614/614 [==============================] - 0s 65us/step - loss: 0.4641 - acc: 0.7801 - val_loss: 0.4622 - val_acc: 0.7857
Epoch 70/400
614/614 [==============================] - 0s 62us/step - loss: 0.4613 - acc: 0.7834 - val_loss: 0.4583 - val_acc: 0.7922
Epoch 71/400
614/614 [==============================] - 0s 67us/step - loss: 0.4617 - acc: 0.7785 - val_loss: 0.4580 - val_acc: 0.7987
Epoch 72/400
614/614 [==============================] - 0s 71us/step - loss: 0.4606 - acc: 0.7736 - val_loss: 0.4575 - val_acc: 0.8052
Epoch 73/400
614/614 [==============================] - 0s 65us/step - loss: 0.4607 - acc: 0.7752 - val_loss: 0.4582 - val_acc: 0.7987
Epoch 74/400
614/614 [==============================] - 0s 54us/step - loss: 0.4590 - acc: 0.7801 - val_loss: 0.4563 - val_acc: 0.7987
Epoch 75/400
614/614 [==============================] - 0s 54us/step - loss: 0.4585 - acc: 0.7834 - val_loss: 0.4567 - val_acc: 0.7987
Epoch 76/400
614/614 [==============================] - 0s 70us/step - loss: 0.4573 - acc: 0.7752 - val_loss: 0.4556 - val_acc: 0.7987
Epoch 77/400
614/614 [==============================] - 0s 71us/step - loss: 0.4568 - acc: 0.7801 - val_loss: 0.4541 - val_acc: 0.7922
Epoch 78/400
614/614 [==============================] - 0s 60us/step - loss: 0.4569 - acc: 0.7785 - val_loss: 0.4539 - val_acc: 0.7987
Epoch 79/400
614/614 [==============================] - 0s 60us/step - loss: 0.4553 - acc: 0.7818 - val_loss: 0.4532 - val_acc: 0.7987
Epoch 80/400
614/614 [==============================] - 0s 54us/step - loss: 0.4559 - acc: 0.7818 - val_loss: 0.4521 - val_acc: 0.7987
Epoch 81/400
614/614 [==============================] - 0s 55us/step - loss: 0.4542 - acc: 0.7801 - val_loss: 0.4558 - val_acc: 0.8052
Epoch 82/400
614/614 [==============================] - 0s 65us/step - loss: 0.4539 - acc: 0.7818 - val_loss: 0.4517 - val_acc: 0.7922
Epoch 83/400
614/614 [==============================] - 0s 55us/step - loss: 0.4536 - acc: 0.7801 - val_loss: 0.4508 - val_acc: 0.7987
Epoch 84/400
614/614 [==============================] - 0s 52us/step - loss: 0.4527 - acc: 0.7801 - val_loss: 0.4505 - val_acc: 0.8052
Epoch 85/400
614/614 [==============================] - 0s 50us/step - loss: 0.4509 - acc: 0.7883 - val_loss: 0.4493 - val_acc: 0.7922
Epoch 86/400
614/614 [==============================] - 0s 55us/step - loss: 0.4517 - acc: 0.7801 - val_loss: 0.4504 - val_acc: 0.8052
Epoch 87/400
614/614 [==============================] - 0s 70us/step - loss: 0.4500 - acc: 0.7801 - val_loss: 0.4512 - val_acc: 0.8052
Epoch 88/400
614/614 [==============================] - 0s 65us/step - loss: 0.4508 - acc: 0.7785 - val_loss: 0.4487 - val_acc: 0.8052
Epoch 89/400
614/614 [==============================] - 0s 58us/step - loss: 0.4496 - acc: 0.7801 - val_loss: 0.4552 - val_acc: 0.7922
Epoch 90/400
614/614 [==============================] - 0s 58us/step - loss: 0.4489 - acc: 0.7850 - val_loss: 0.4489 - val_acc: 0.8117
Epoch 91/400
614/614 [==============================] - 0s 50us/step - loss: 0.4491 - acc: 0.7834 - val_loss: 0.4456 - val_acc: 0.8052
Epoch 92/400
614/614 [==============================] - 0s 60us/step - loss: 0.4492 - acc: 0.7818 - val_loss: 0.4453 - val_acc: 0.8052
Epoch 93/400
614/614 [==============================] - 0s 63us/step - loss: 0.4480 - acc: 0.7850 - val_loss: 0.4523 - val_acc: 0.7987
Epoch 94/400
614/614 [==============================] - 0s 54us/step - loss: 0.4486 - acc: 0.7818 - val_loss: 0.4491 - val_acc: 0.8052
Epoch 95/400
614/614 [==============================] - 0s 54us/step - loss: 0.4479 - acc: 0.7834 - val_loss: 0.4450 - val_acc: 0.8052
Epoch 96/400
614/614 [==============================] - 0s 54us/step - loss: 0.4467 - acc: 0.7818 - val_loss: 0.4448 - val_acc: 0.8052
Epoch 97/400
614/614 [==============================] - 0s 54us/step - loss: 0.4467 - acc: 0.7883 - val_loss: 0.4480 - val_acc: 0.8052
Epoch 98/400
614/614 [==============================] - 0s 49us/step - loss: 0.4453 - acc: 0.7834 - val_loss: 0.4441 - val_acc: 0.7922
Epoch 99/400
614/614 [==============================] - 0s 58us/step - loss: 0.4457 - acc: 0.7899 - val_loss: 0.4426 - val_acc: 0.8052
Epoch 100/400
614/614 [==============================] - 0s 62us/step - loss: 0.4451 - acc: 0.7801 - val_loss: 0.4453 - val_acc: 0.8052
Epoch 101/400
614/614 [==============================] - 0s 49us/step - loss: 0.4454 - acc: 0.7899 - val_loss: 0.4452 - val_acc: 0.8052
Epoch 102/400
614/614 [==============================] - 0s 55us/step - loss: 0.4447 - acc: 0.7801 - val_loss: 0.4420 - val_acc: 0.8052
Epoch 103/400
614/614 [==============================] - 0s 62us/step - loss: 0.4440 - acc: 0.7818 - val_loss: 0.4431 - val_acc: 0.8052
Epoch 104/400
614/614 [==============================] - 0s 67us/step - loss: 0.4431 - acc: 0.7866 - val_loss: 0.4407 - val_acc: 0.8052
Epoch 105/400
614/614 [==============================] - 0s 54us/step - loss: 0.4430 - acc: 0.7818 - val_loss: 0.4414 - val_acc: 0.8052
Epoch 106/400
614/614 [==============================] - 0s 55us/step - loss: 0.4405 - acc: 0.7932 - val_loss: 0.4428 - val_acc: 0.8052
Epoch 107/400
614/614 [==============================] - 0s 62us/step - loss: 0.4433 - acc: 0.7801 - val_loss: 0.4423 - val_acc: 0.8052
Epoch 108/400
614/614 [==============================] - 0s 50us/step - loss: 0.4414 - acc: 0.7866 - val_loss: 0.4406 - val_acc: 0.8052
Epoch 109/400
614/614 [==============================] - 0s 58us/step - loss: 0.4411 - acc: 0.7785 - val_loss: 0.4503 - val_acc: 0.7987
Epoch 110/400
614/614 [==============================] - 0s 55us/step - loss: 0.4424 - acc: 0.7915 - val_loss: 0.4445 - val_acc: 0.8052
Epoch 111/400
614/614 [==============================] - 0s 50us/step - loss: 0.4410 - acc: 0.7834 - val_loss: 0.4403 - val_acc: 0.8052
Epoch 112/400
614/614 [==============================] - 0s 52us/step - loss: 0.4413 - acc: 0.7866 - val_loss: 0.4382 - val_acc: 0.8052
Epoch 113/400
614/614 [==============================] - 0s 50us/step - loss: 0.4406 - acc: 0.7883 - val_loss: 0.4394 - val_acc: 0.8052
Epoch 114/400
614/614 [==============================] - 0s 50us/step - loss: 0.4404 - acc: 0.7850 - val_loss: 0.4383 - val_acc: 0.7987
Epoch 115/400
614/614 [==============================] - 0s 54us/step - loss: 0.4410 - acc: 0.7834 - val_loss: 0.4382 - val_acc: 0.7987
Epoch 116/400
614/614 [==============================] - 0s 60us/step - loss: 0.4393 - acc: 0.7834 - val_loss: 0.4360 - val_acc: 0.8052
Epoch 117/400
614/614 [==============================] - 0s 52us/step - loss: 0.4394 - acc: 0.7932 - val_loss: 0.4354 - val_acc: 0.8052
Epoch 118/400
614/614 [==============================] - 0s 47us/step - loss: 0.4391 - acc: 0.7915 - val_loss: 0.4346 - val_acc: 0.8052
Epoch 119/400
614/614 [==============================] - 0s 55us/step - loss: 0.4380 - acc: 0.7850 - val_loss: 0.4367 - val_acc: 0.7987
Epoch 120/400
614/614 [==============================] - 0s 58us/step - loss: 0.4391 - acc: 0.7834 - val_loss: 0.4329 - val_acc: 0.8052
Epoch 121/400
614/614 [==============================] - 0s 101us/step - loss: 0.4387 - acc: 0.7801 - val_loss: 0.4331 - val_acc: 0.8052
Epoch 122/400
614/614 [==============================] - 0s 70us/step - loss: 0.4372 - acc: 0.7883 - val_loss: 0.4329 - val_acc: 0.7987
Epoch 123/400
614/614 [==============================] - 0s 78us/step - loss: 0.4378 - acc: 0.7850 - val_loss: 0.4351 - val_acc: 0.8052
Epoch 124/400
614/614 [==============================] - 0s 86us/step - loss: 0.4382 - acc: 0.7834 - val_loss: 0.4329 - val_acc: 0.7987
Epoch 125/400
614/614 [==============================] - 0s 80us/step - loss: 0.4369 - acc: 0.7866 - val_loss: 0.4349 - val_acc: 0.8052
Epoch 126/400
614/614 [==============================] - 0s 71us/step - loss: 0.4361 - acc: 0.7818 - val_loss: 0.4325 - val_acc: 0.7987
Epoch 127/400
614/614 [==============================] - 0s 62us/step - loss: 0.4376 - acc: 0.7899 - val_loss: 0.4328 - val_acc: 0.8052
Epoch 128/400
614/614 [==============================] - 0s 57us/step - loss: 0.4355 - acc: 0.7850 - val_loss: 0.4339 - val_acc: 0.7987
Epoch 129/400
614/614 [==============================] - 0s 80us/step - loss: 0.4358 - acc: 0.7899 - val_loss: 0.4318 - val_acc: 0.8052
Epoch 130/400
614/614 [==============================] - 0s 71us/step - loss: 0.4360 - acc: 0.7834 - val_loss: 0.4312 - val_acc: 0.8052
Epoch 131/400
614/614 [==============================] - 0s 49us/step - loss: 0.4343 - acc: 0.7932 - val_loss: 0.4328 - val_acc: 0.8052
Epoch 132/400
614/614 [==============================] - 0s 63us/step - loss: 0.4360 - acc: 0.7899 - val_loss: 0.4307 - val_acc: 0.7987
Epoch 133/400
614/614 [==============================] - 0s 70us/step - loss: 0.4349 - acc: 0.7915 - val_loss: 0.4288 - val_acc: 0.8052
Epoch 134/400
614/614 [==============================] - 0s 63us/step - loss: 0.4343 - acc: 0.7883 - val_loss: 0.4297 - val_acc: 0.7987
Epoch 135/400
614/614 [==============================] - 0s 67us/step - loss: 0.4332 - acc: 0.7899 - val_loss: 0.4279 - val_acc: 0.7987
Epoch 136/400
614/614 [==============================] - 0s 54us/step - loss: 0.4336 - acc: 0.7883 - val_loss: 0.4279 - val_acc: 0.8052
Epoch 137/400
614/614 [==============================] - 0s 62us/step - loss: 0.4328 - acc: 0.7883 - val_loss: 0.4329 - val_acc: 0.8052
Epoch 138/400
614/614 [==============================] - 0s 52us/step - loss: 0.4339 - acc: 0.7915 - val_loss: 0.4284 - val_acc: 0.7987
Epoch 139/400
614/614 [==============================] - ETA: 0s - loss: 0.4803 - acc: 0.750 - 0s 57us/step - loss: 0.4317 - acc: 0.7948 - val_loss: 0.4289 - val_acc: 0.7987
Epoch 140/400
614/614 [==============================] - 0s 58us/step - loss: 0.4332 - acc: 0.7818 - val_loss: 0.4287 - val_acc: 0.7987
Epoch 141/400
614/614 [==============================] - 0s 62us/step - loss: 0.4333 - acc: 0.7866 - val_loss: 0.4286 - val_acc: 0.7987
Epoch 142/400
614/614 [==============================] - 0s 50us/step - loss: 0.4319 - acc: 0.7915 - val_loss: 0.4320 - val_acc: 0.8052
Epoch 143/400
614/614 [==============================] - 0s 54us/step - loss: 0.4313 - acc: 0.7915 - val_loss: 0.4281 - val_acc: 0.7987
Epoch 144/400
614/614 [==============================] - 0s 67us/step - loss: 0.4312 - acc: 0.7932 - val_loss: 0.4288 - val_acc: 0.7987
Epoch 145/400
614/614 [==============================] - 0s 58us/step - loss: 0.4302 - acc: 0.7948 - val_loss: 0.4291 - val_acc: 0.7987
Epoch 146/400
614/614 [==============================] - 0s 62us/step - loss: 0.4309 - acc: 0.7964 - val_loss: 0.4279 - val_acc: 0.7987
Epoch 147/400
614/614 [==============================] - 0s 55us/step - loss: 0.4306 - acc: 0.7866 - val_loss: 0.4260 - val_acc: 0.7987
Epoch 148/400
614/614 [==============================] - 0s 52us/step - loss: 0.4289 - acc: 0.7899 - val_loss: 0.4279 - val_acc: 0.7987
Epoch 149/400
614/614 [==============================] - 0s 49us/step - loss: 0.4296 - acc: 0.7948 - val_loss: 0.4309 - val_acc: 0.8052
Epoch 150/400
614/614 [==============================] - 0s 60us/step - loss: 0.4305 - acc: 0.7932 - val_loss: 0.4291 - val_acc: 0.8117
Epoch 151/400
614/614 [==============================] - 0s 55us/step - loss: 0.4273 - acc: 0.7980 - val_loss: 0.4228 - val_acc: 0.7922
Epoch 152/400
614/614 [==============================] - 0s 50us/step - loss: 0.4289 - acc: 0.7948 - val_loss: 0.4260 - val_acc: 0.8052
Epoch 153/400
614/614 [==============================] - 0s 55us/step - loss: 0.4298 - acc: 0.7899 - val_loss: 0.4236 - val_acc: 0.7922
Epoch 154/400
614/614 [==============================] - 0s 58us/step - loss: 0.4295 - acc: 0.7915 - val_loss: 0.4272 - val_acc: 0.8117
Epoch 155/400
614/614 [==============================] - 0s 68us/step - loss: 0.4277 - acc: 0.7997 - val_loss: 0.4242 - val_acc: 0.7987
Epoch 156/400
614/614 [==============================] - 0s 62us/step - loss: 0.4278 - acc: 0.7899 - val_loss: 0.4239 - val_acc: 0.7922
Epoch 157/400
614/614 [==============================] - 0s 65us/step - loss: 0.4273 - acc: 0.7997 - val_loss: 0.4240 - val_acc: 0.7987
Epoch 158/400
614/614 [==============================] - 0s 55us/step - loss: 0.4260 - acc: 0.7932 - val_loss: 0.4341 - val_acc: 0.8052
Epoch 159/400
614/614 [==============================] - 0s 47us/step - loss: 0.4253 - acc: 0.7964 - val_loss: 0.4214 - val_acc: 0.7922
Epoch 160/400
614/614 [==============================] - 0s 58us/step - loss: 0.4258 - acc: 0.7980 - val_loss: 0.4335 - val_acc: 0.8052
Epoch 161/400
614/614 [==============================] - 0s 67us/step - loss: 0.4260 - acc: 0.8029 - val_loss: 0.4225 - val_acc: 0.7987
Epoch 162/400
614/614 [==============================] - 0s 55us/step - loss: 0.4262 - acc: 0.7932 - val_loss: 0.4318 - val_acc: 0.8052
Epoch 163/400
614/614 [==============================] - 0s 50us/step - loss: 0.4266 - acc: 0.7997 - val_loss: 0.4221 - val_acc: 0.7922
Epoch 164/400
614/614 [==============================] - 0s 52us/step - loss: 0.4257 - acc: 0.7948 - val_loss: 0.4228 - val_acc: 0.7922
Epoch 165/400
614/614 [==============================] - 0s 57us/step - loss: 0.4247 - acc: 0.7915 - val_loss: 0.4264 - val_acc: 0.8052
Epoch 166/400
614/614 [==============================] - 0s 57us/step - loss: 0.4251 - acc: 0.8029 - val_loss: 0.4211 - val_acc: 0.7987
Epoch 167/400
614/614 [==============================] - 0s 65us/step - loss: 0.4264 - acc: 0.7948 - val_loss: 0.4257 - val_acc: 0.8052
Epoch 168/400
614/614 [==============================] - 0s 67us/step - loss: 0.4230 - acc: 0.7980 - val_loss: 0.4204 - val_acc: 0.7922
Epoch 169/400
614/614 [==============================] - 0s 52us/step - loss: 0.4252 - acc: 0.7932 - val_loss: 0.4234 - val_acc: 0.7987
Epoch 170/400
614/614 [==============================] - 0s 50us/step - loss: 0.4252 - acc: 0.7948 - val_loss: 0.4210 - val_acc: 0.7987
Epoch 171/400
614/614 [==============================] - 0s 54us/step - loss: 0.4247 - acc: 0.7932 - val_loss: 0.4204 - val_acc: 0.7987
Epoch 172/400
614/614 [==============================] - 0s 60us/step - loss: 0.4243 - acc: 0.7932 - val_loss: 0.4197 - val_acc: 0.7922
Epoch 173/400
614/614 [==============================] - 0s 55us/step - loss: 0.4253 - acc: 0.7964 - val_loss: 0.4209 - val_acc: 0.7922
Epoch 174/400
614/614 [==============================] - 0s 55us/step - loss: 0.4244 - acc: 0.7883 - val_loss: 0.4290 - val_acc: 0.8052
Epoch 175/400
614/614 [==============================] - 0s 52us/step - loss: 0.4240 - acc: 0.7964 - val_loss: 0.4237 - val_acc: 0.7987
Epoch 176/400
614/614 [==============================] - 0s 50us/step - loss: 0.4233 - acc: 0.7964 - val_loss: 0.4217 - val_acc: 0.8052
Epoch 177/400
614/614 [==============================] - 0s 55us/step - loss: 0.4204 - acc: 0.8029 - val_loss: 0.4176 - val_acc: 0.7922
Epoch 178/400
614/614 [==============================] - 0s 62us/step - loss: 0.4232 - acc: 0.7883 - val_loss: 0.4192 - val_acc: 0.7922
Epoch 179/400
614/614 [==============================] - 0s 55us/step - loss: 0.4224 - acc: 0.8013 - val_loss: 0.4229 - val_acc: 0.7987
Epoch 180/400
614/614 [==============================] - 0s 50us/step - loss: 0.4220 - acc: 0.7899 - val_loss: 0.4228 - val_acc: 0.8052
Epoch 181/400
614/614 [==============================] - 0s 52us/step - loss: 0.4225 - acc: 0.8029 - val_loss: 0.4206 - val_acc: 0.8052
Epoch 182/400
614/614 [==============================] - 0s 67us/step - loss: 0.4214 - acc: 0.7964 - val_loss: 0.4192 - val_acc: 0.7922
Epoch 183/400
614/614 [==============================] - 0s 78us/step - loss: 0.4220 - acc: 0.7964 - val_loss: 0.4172 - val_acc: 0.7922
Epoch 184/400
614/614 [==============================] - 0s 73us/step - loss: 0.4211 - acc: 0.8013 - val_loss: 0.4173 - val_acc: 0.8117
Epoch 185/400
614/614 [==============================] - 0s 70us/step - loss: 0.4212 - acc: 0.7964 - val_loss: 0.4220 - val_acc: 0.8052
Epoch 186/400
614/614 [==============================] - 0s 50us/step - loss: 0.4201 - acc: 0.8029 - val_loss: 0.4155 - val_acc: 0.7922
Epoch 187/400
614/614 [==============================] - 0s 71us/step - loss: 0.4207 - acc: 0.8013 - val_loss: 0.4169 - val_acc: 0.7922
Epoch 188/400
614/614 [==============================] - 0s 91us/step - loss: 0.4194 - acc: 0.7948 - val_loss: 0.4184 - val_acc: 0.7987
Epoch 189/400
614/614 [==============================] - 0s 55us/step - loss: 0.4205 - acc: 0.7899 - val_loss: 0.4158 - val_acc: 0.7922
Epoch 190/400
614/614 [==============================] - 0s 60us/step - loss: 0.4200 - acc: 0.7980 - val_loss: 0.4161 - val_acc: 0.7922
Epoch 191/400
614/614 [==============================] - 0s 55us/step - loss: 0.4187 - acc: 0.8013 - val_loss: 0.4196 - val_acc: 0.8052
Epoch 192/400
614/614 [==============================] - 0s 57us/step - loss: 0.4189 - acc: 0.7964 - val_loss: 0.4172 - val_acc: 0.7922
Epoch 193/400
614/614 [==============================] - 0s 60us/step - loss: 0.4179 - acc: 0.8013 - val_loss: 0.4180 - val_acc: 0.7987
Epoch 194/400
614/614 [==============================] - 0s 54us/step - loss: 0.4191 - acc: 0.7964 - val_loss: 0.4196 - val_acc: 0.8052
Epoch 195/400
614/614 [==============================] - 0s 45us/step - loss: 0.4182 - acc: 0.8013 - val_loss: 0.4145 - val_acc: 0.7922
Epoch 196/400
614/614 [==============================] - 0s 47us/step - loss: 0.4176 - acc: 0.7915 - val_loss: 0.4208 - val_acc: 0.8052
Epoch 197/400
614/614 [==============================] - 0s 62us/step - loss: 0.4180 - acc: 0.8013 - val_loss: 0.4141 - val_acc: 0.7922
Epoch 198/400
614/614 [==============================] - 0s 63us/step - loss: 0.4178 - acc: 0.7964 - val_loss: 0.4160 - val_acc: 0.8052
Epoch 199/400
614/614 [==============================] - 0s 68us/step - loss: 0.4172 - acc: 0.7980 - val_loss: 0.4149 - val_acc: 0.7922
Epoch 200/400
614/614 [==============================] - 0s 67us/step - loss: 0.4181 - acc: 0.8013 - val_loss: 0.4142 - val_acc: 0.7922
Epoch 201/400
614/614 [==============================] - 0s 70us/step - loss: 0.4170 - acc: 0.7948 - val_loss: 0.4159 - val_acc: 0.7987
Epoch 202/400
614/614 [==============================] - 0s 65us/step - loss: 0.4159 - acc: 0.7980 - val_loss: 0.4156 - val_acc: 0.8117
Epoch 203/400
614/614 [==============================] - 0s 84us/step - loss: 0.4147 - acc: 0.7948 - val_loss: 0.4271 - val_acc: 0.8052
Epoch 204/400
614/614 [==============================] - 0s 60us/step - loss: 0.4177 - acc: 0.8062 - val_loss: 0.4159 - val_acc: 0.8052
Epoch 205/400
614/614 [==============================] - 0s 58us/step - loss: 0.4161 - acc: 0.7964 - val_loss: 0.4129 - val_acc: 0.7987
Epoch 206/400
614/614 [==============================] - 0s 55us/step - loss: 0.4159 - acc: 0.7997 - val_loss: 0.4116 - val_acc: 0.7922
Epoch 207/400
614/614 [==============================] - 0s 45us/step - loss: 0.4160 - acc: 0.8029 - val_loss: 0.4118 - val_acc: 0.7922
Epoch 208/400
614/614 [==============================] - 0s 73us/step - loss: 0.4152 - acc: 0.7997 - val_loss: 0.4120 - val_acc: 0.7987
Epoch 209/400
614/614 [==============================] - 0s 63us/step - loss: 0.4151 - acc: 0.7964 - val_loss: 0.4134 - val_acc: 0.7922
Epoch 210/400
614/614 [==============================] - 0s 49us/step - loss: 0.4158 - acc: 0.7997 - val_loss: 0.4099 - val_acc: 0.7987
Epoch 211/400
614/614 [==============================] - 0s 58us/step - loss: 0.4151 - acc: 0.7980 - val_loss: 0.4141 - val_acc: 0.7987
Epoch 212/400
614/614 [==============================] - 0s 49us/step - loss: 0.4158 - acc: 0.8046 - val_loss: 0.4134 - val_acc: 0.7922
Epoch 213/400
614/614 [==============================] - 0s 54us/step - loss: 0.4149 - acc: 0.7997 - val_loss: 0.4126 - val_acc: 0.7922
Epoch 214/400
614/614 [==============================] - 0s 75us/step - loss: 0.4142 - acc: 0.8062 - val_loss: 0.4111 - val_acc: 0.7987
Epoch 215/400
614/614 [==============================] - 0s 71us/step - loss: 0.4133 - acc: 0.7980 - val_loss: 0.4116 - val_acc: 0.7987
Epoch 216/400
614/614 [==============================] - 0s 60us/step - loss: 0.4125 - acc: 0.7980 - val_loss: 0.4096 - val_acc: 0.7987
Epoch 217/400
614/614 [==============================] - 0s 52us/step - loss: 0.4143 - acc: 0.7964 - val_loss: 0.4107 - val_acc: 0.7987
Epoch 218/400
614/614 [==============================] - 0s 52us/step - loss: 0.4137 - acc: 0.8013 - val_loss: 0.4103 - val_acc: 0.8052
Epoch 219/400
614/614 [==============================] - 0s 67us/step - loss: 0.4128 - acc: 0.8029 - val_loss: 0.4158 - val_acc: 0.8117
Epoch 220/400
614/614 [==============================] - 0s 55us/step - loss: 0.4133 - acc: 0.8046 - val_loss: 0.4108 - val_acc: 0.8052
Epoch 221/400
614/614 [==============================] - 0s 57us/step - loss: 0.4129 - acc: 0.8029 - val_loss: 0.4094 - val_acc: 0.7987
Epoch 222/400
614/614 [==============================] - 0s 62us/step - loss: 0.4127 - acc: 0.7997 - val_loss: 0.4097 - val_acc: 0.7987
Epoch 223/400
614/614 [==============================] - 0s 57us/step - loss: 0.4129 - acc: 0.7980 - val_loss: 0.4094 - val_acc: 0.7987
Epoch 224/400
614/614 [==============================] - 0s 58us/step - loss: 0.4113 - acc: 0.8029 - val_loss: 0.4081 - val_acc: 0.8182
Epoch 225/400
614/614 [==============================] - 0s 63us/step - loss: 0.4117 - acc: 0.7980 - val_loss: 0.4081 - val_acc: 0.7987
Epoch 226/400
614/614 [==============================] - 0s 54us/step - loss: 0.4114 - acc: 0.8046 - val_loss: 0.4079 - val_acc: 0.7987
Epoch 227/400
614/614 [==============================] - 0s 65us/step - loss: 0.4109 - acc: 0.8029 - val_loss: 0.4064 - val_acc: 0.8052
Epoch 228/400
614/614 [==============================] - 0s 57us/step - loss: 0.4108 - acc: 0.8029 - val_loss: 0.4072 - val_acc: 0.7922
Epoch 229/400
614/614 [==============================] - 0s 62us/step - loss: 0.4106 - acc: 0.7964 - val_loss: 0.4085 - val_acc: 0.7922
Epoch 230/400
614/614 [==============================] - 0s 58us/step - loss: 0.4099 - acc: 0.8046 - val_loss: 0.4101 - val_acc: 0.8052
Epoch 231/400
614/614 [==============================] - 0s 52us/step - loss: 0.4093 - acc: 0.8094 - val_loss: 0.4050 - val_acc: 0.8117
Epoch 232/400
614/614 [==============================] - 0s 60us/step - loss: 0.4104 - acc: 0.7964 - val_loss: 0.4082 - val_acc: 0.8052
Epoch 233/400
614/614 [==============================] - 0s 58us/step - loss: 0.4111 - acc: 0.8013 - val_loss: 0.4035 - val_acc: 0.8117
Epoch 234/400
614/614 [==============================] - 0s 65us/step - loss: 0.4084 - acc: 0.8029 - val_loss: 0.4068 - val_acc: 0.8052
Epoch 235/400
614/614 [==============================] - 0s 62us/step - loss: 0.4095 - acc: 0.7980 - val_loss: 0.4034 - val_acc: 0.8117
Epoch 236/400
614/614 [==============================] - 0s 54us/step - loss: 0.4089 - acc: 0.7997 - val_loss: 0.4074 - val_acc: 0.8052
Epoch 237/400
614/614 [==============================] - 0s 55us/step - loss: 0.4081 - acc: 0.7964 - val_loss: 0.4098 - val_acc: 0.8117
Epoch 238/400
614/614 [==============================] - 0s 49us/step - loss: 0.4086 - acc: 0.7980 - val_loss: 0.4043 - val_acc: 0.8052
Epoch 239/400
614/614 [==============================] - 0s 50us/step - loss: 0.4094 - acc: 0.7932 - val_loss: 0.4053 - val_acc: 0.8052
Epoch 240/400
614/614 [==============================] - 0s 54us/step - loss: 0.4088 - acc: 0.8013 - val_loss: 0.4037 - val_acc: 0.7987
Epoch 241/400
614/614 [==============================] - 0s 55us/step - loss: 0.4096 - acc: 0.8013 - val_loss: 0.4076 - val_acc: 0.8052
Epoch 242/400
614/614 [==============================] - 0s 54us/step - loss: 0.4080 - acc: 0.8013 - val_loss: 0.4044 - val_acc: 0.8052
Epoch 243/400
614/614 [==============================] - 0s 49us/step - loss: 0.4070 - acc: 0.7948 - val_loss: 0.4145 - val_acc: 0.8182
Epoch 244/400
614/614 [==============================] - 0s 45us/step - loss: 0.4091 - acc: 0.8094 - val_loss: 0.4073 - val_acc: 0.8052
Epoch 245/400
614/614 [==============================] - 0s 49us/step - loss: 0.4079 - acc: 0.8046 - val_loss: 0.4017 - val_acc: 0.8052
Epoch 246/400
614/614 [==============================] - 0s 55us/step - loss: 0.4071 - acc: 0.8062 - val_loss: 0.3995 - val_acc: 0.8182
Epoch 247/400
614/614 [==============================] - 0s 65us/step - loss: 0.4077 - acc: 0.7997 - val_loss: 0.4011 - val_acc: 0.8117
Epoch 248/400
614/614 [==============================] - 0s 49us/step - loss: 0.4061 - acc: 0.8029 - val_loss: 0.4093 - val_acc: 0.8052
Epoch 249/400
614/614 [==============================] - 0s 58us/step - loss: 0.4069 - acc: 0.7964 - val_loss: 0.4040 - val_acc: 0.8052
Epoch 250/400
614/614 [==============================] - 0s 50us/step - loss: 0.4062 - acc: 0.8013 - val_loss: 0.3987 - val_acc: 0.8182
Epoch 251/400
614/614 [==============================] - 0s 55us/step - loss: 0.4057 - acc: 0.8013 - val_loss: 0.4011 - val_acc: 0.8247
Epoch 252/400
614/614 [==============================] - 0s 50us/step - loss: 0.4056 - acc: 0.8046 - val_loss: 0.4007 - val_acc: 0.8247
Epoch 253/400
614/614 [==============================] - 0s 55us/step - loss: 0.4072 - acc: 0.7964 - val_loss: 0.3983 - val_acc: 0.8117
Epoch 254/400
614/614 [==============================] - 0s 42us/step - loss: 0.4062 - acc: 0.8111 - val_loss: 0.4016 - val_acc: 0.8117
Epoch 255/400
614/614 [==============================] - 0s 52us/step - loss: 0.4047 - acc: 0.7997 - val_loss: 0.4020 - val_acc: 0.8117
Epoch 256/400
614/614 [==============================] - 0s 49us/step - loss: 0.4048 - acc: 0.8046 - val_loss: 0.4000 - val_acc: 0.8247
Epoch 257/400
614/614 [==============================] - 0s 45us/step - loss: 0.4059 - acc: 0.8029 - val_loss: 0.3991 - val_acc: 0.8182
Epoch 258/400
614/614 [==============================] - 0s 47us/step - loss: 0.4058 - acc: 0.8013 - val_loss: 0.4000 - val_acc: 0.8247
Epoch 259/400
614/614 [==============================] - 0s 57us/step - loss: 0.4036 - acc: 0.8013 - val_loss: 0.3973 - val_acc: 0.8117
Epoch 260/400
614/614 [==============================] - 0s 52us/step - loss: 0.4038 - acc: 0.8062 - val_loss: 0.3999 - val_acc: 0.8182
Epoch 261/400
614/614 [==============================] - 0s 62us/step - loss: 0.4037 - acc: 0.8078 - val_loss: 0.3969 - val_acc: 0.8377
Epoch 262/400
614/614 [==============================] - 0s 50us/step - loss: 0.4032 - acc: 0.8029 - val_loss: 0.3988 - val_acc: 0.8117
Epoch 263/400
614/614 [==============================] - 0s 49us/step - loss: 0.4035 - acc: 0.8046 - val_loss: 0.4001 - val_acc: 0.8247
Epoch 264/400
614/614 [==============================] - 0s 54us/step - loss: 0.4037 - acc: 0.8078 - val_loss: 0.4002 - val_acc: 0.8117
Epoch 265/400
614/614 [==============================] - 0s 58us/step - loss: 0.4035 - acc: 0.8029 - val_loss: 0.4001 - val_acc: 0.8117
Epoch 266/400
614/614 [==============================] - 0s 49us/step - loss: 0.4034 - acc: 0.8029 - val_loss: 0.4002 - val_acc: 0.8182
Epoch 267/400
614/614 [==============================] - 0s 42us/step - loss: 0.4042 - acc: 0.8013 - val_loss: 0.4018 - val_acc: 0.8117
Epoch 268/400
614/614 [==============================] - 0s 52us/step - loss: 0.4024 - acc: 0.8078 - val_loss: 0.3958 - val_acc: 0.8312
Epoch 269/400
614/614 [==============================] - 0s 47us/step - loss: 0.4034 - acc: 0.8062 - val_loss: 0.3970 - val_acc: 0.8312
Epoch 270/400
614/614 [==============================] - 0s 50us/step - loss: 0.4019 - acc: 0.8062 - val_loss: 0.4065 - val_acc: 0.8052
Epoch 271/400
614/614 [==============================] - 0s 50us/step - loss: 0.4022 - acc: 0.8078 - val_loss: 0.3951 - val_acc: 0.8312
Epoch 272/400
614/614 [==============================] - 0s 52us/step - loss: 0.4019 - acc: 0.7997 - val_loss: 0.4001 - val_acc: 0.8117
Epoch 273/400
614/614 [==============================] - 0s 50us/step - loss: 0.4019 - acc: 0.8111 - val_loss: 0.4096 - val_acc: 0.8052
Epoch 274/400
614/614 [==============================] - 0s 47us/step - loss: 0.4048 - acc: 0.8111 - val_loss: 0.3952 - val_acc: 0.8312
Epoch 275/400
614/614 [==============================] - 0s 45us/step - loss: 0.4011 - acc: 0.8078 - val_loss: 0.3983 - val_acc: 0.8117
Epoch 276/400
614/614 [==============================] - 0s 49us/step - loss: 0.4010 - acc: 0.8078 - val_loss: 0.3980 - val_acc: 0.8182
Epoch 277/400
614/614 [==============================] - 0s 57us/step - loss: 0.4019 - acc: 0.8029 - val_loss: 0.3953 - val_acc: 0.8182
Epoch 278/400
614/614 [==============================] - 0s 55us/step - loss: 0.4011 - acc: 0.8078 - val_loss: 0.3949 - val_acc: 0.8312
Epoch 279/400
614/614 [==============================] - 0s 54us/step - loss: 0.4024 - acc: 0.8078 - val_loss: 0.3953 - val_acc: 0.8312
Epoch 280/400
614/614 [==============================] - 0s 50us/step - loss: 0.4023 - acc: 0.8094 - val_loss: 0.3952 - val_acc: 0.8312
Epoch 281/400
614/614 [==============================] - 0s 47us/step - loss: 0.4005 - acc: 0.8078 - val_loss: 0.3950 - val_acc: 0.8377
Epoch 282/400
614/614 [==============================] - 0s 50us/step - loss: 0.4009 - acc: 0.8029 - val_loss: 0.3954 - val_acc: 0.8312
Epoch 283/400
614/614 [==============================] - 0s 55us/step - loss: 0.3978 - acc: 0.8094 - val_loss: 0.3931 - val_acc: 0.8312
Epoch 284/400
614/614 [==============================] - 0s 57us/step - loss: 0.3990 - acc: 0.8078 - val_loss: 0.3986 - val_acc: 0.8117
Epoch 285/400
614/614 [==============================] - 0s 57us/step - loss: 0.3991 - acc: 0.8143 - val_loss: 0.3961 - val_acc: 0.8182
Epoch 286/400
614/614 [==============================] - 0s 47us/step - loss: 0.4002 - acc: 0.8111 - val_loss: 0.3911 - val_acc: 0.8377
Epoch 287/400
614/614 [==============================] - 0s 45us/step - loss: 0.3984 - acc: 0.8062 - val_loss: 0.3932 - val_acc: 0.8312
Epoch 288/400
614/614 [==============================] - 0s 47us/step - loss: 0.3995 - acc: 0.8094 - val_loss: 0.3916 - val_acc: 0.8247
Epoch 289/400
614/614 [==============================] - 0s 50us/step - loss: 0.3993 - acc: 0.8078 - val_loss: 0.3893 - val_acc: 0.8377
Epoch 290/400
614/614 [==============================] - 0s 55us/step - loss: 0.3978 - acc: 0.8111 - val_loss: 0.3965 - val_acc: 0.8182
Epoch 291/400
614/614 [==============================] - 0s 50us/step - loss: 0.3981 - acc: 0.8111 - val_loss: 0.3953 - val_acc: 0.8117
Epoch 292/400
614/614 [==============================] - 0s 50us/step - loss: 0.3990 - acc: 0.8062 - val_loss: 0.3920 - val_acc: 0.8247
Epoch 293/400
614/614 [==============================] - 0s 55us/step - loss: 0.3977 - acc: 0.8046 - val_loss: 0.3895 - val_acc: 0.8377
Epoch 294/400
614/614 [==============================] - 0s 49us/step - loss: 0.3972 - acc: 0.8029 - val_loss: 0.3908 - val_acc: 0.8377
Epoch 295/400
614/614 [==============================] - 0s 62us/step - loss: 0.3976 - acc: 0.8062 - val_loss: 0.3920 - val_acc: 0.8247
Epoch 296/400
614/614 [==============================] - 0s 55us/step - loss: 0.3960 - acc: 0.8029 - val_loss: 0.3937 - val_acc: 0.8247
Epoch 297/400
614/614 [==============================] - 0s 50us/step - loss: 0.3968 - acc: 0.8046 - val_loss: 0.3899 - val_acc: 0.8377
Epoch 298/400
614/614 [==============================] - 0s 42us/step - loss: 0.3949 - acc: 0.8062 - val_loss: 0.3871 - val_acc: 0.8377
Epoch 299/400
614/614 [==============================] - 0s 42us/step - loss: 0.3965 - acc: 0.8094 - val_loss: 0.3869 - val_acc: 0.8442
Epoch 300/400
614/614 [==============================] - 0s 52us/step - loss: 0.3963 - acc: 0.8029 - val_loss: 0.3882 - val_acc: 0.8442
Epoch 301/400
614/614 [==============================] - 0s 57us/step - loss: 0.3961 - acc: 0.8111 - val_loss: 0.3884 - val_acc: 0.8442
Epoch 302/400
614/614 [==============================] - 0s 60us/step - loss: 0.3956 - acc: 0.8046 - val_loss: 0.3885 - val_acc: 0.8442
Epoch 303/400
614/614 [==============================] - 0s 57us/step - loss: 0.3951 - acc: 0.8078 - val_loss: 0.3923 - val_acc: 0.8247
Epoch 304/400
614/614 [==============================] - 0s 45us/step - loss: 0.3965 - acc: 0.8029 - val_loss: 0.3855 - val_acc: 0.8442
Epoch 305/400
614/614 [==============================] - 0s 49us/step - loss: 0.3956 - acc: 0.8029 - val_loss: 0.3843 - val_acc: 0.8442
Epoch 306/400
614/614 [==============================] - 0s 55us/step - loss: 0.3958 - acc: 0.8111 - val_loss: 0.3859 - val_acc: 0.8442
Epoch 307/400
614/614 [==============================] - 0s 49us/step - loss: 0.3946 - acc: 0.8127 - val_loss: 0.3881 - val_acc: 0.8312
Epoch 308/400
614/614 [==============================] - 0s 54us/step - loss: 0.3952 - acc: 0.8094 - val_loss: 0.3865 - val_acc: 0.8442
Epoch 309/400
614/614 [==============================] - 0s 50us/step - loss: 0.3951 - acc: 0.8078 - val_loss: 0.3872 - val_acc: 0.8442
Epoch 310/400
614/614 [==============================] - 0s 49us/step - loss: 0.3936 - acc: 0.8062 - val_loss: 0.3897 - val_acc: 0.8506
Epoch 311/400
614/614 [==============================] - 0s 41us/step - loss: 0.3954 - acc: 0.8062 - val_loss: 0.3890 - val_acc: 0.8442
Epoch 312/400
614/614 [==============================] - 0s 62us/step - loss: 0.3938 - acc: 0.8078 - val_loss: 0.3869 - val_acc: 0.8377
Epoch 313/400
614/614 [==============================] - 0s 55us/step - loss: 0.3943 - acc: 0.8029 - val_loss: 0.3908 - val_acc: 0.8312
Epoch 314/400
614/614 [==============================] - 0s 52us/step - loss: 0.3945 - acc: 0.8078 - val_loss: 0.3897 - val_acc: 0.8247
Epoch 315/400
614/614 [==============================] - 0s 47us/step - loss: 0.3937 - acc: 0.8078 - val_loss: 0.3847 - val_acc: 0.8442
Epoch 316/400
614/614 [==============================] - 0s 58us/step - loss: 0.3913 - acc: 0.8160 - val_loss: 0.3860 - val_acc: 0.8377
Epoch 317/400
614/614 [==============================] - 0s 52us/step - loss: 0.3931 - acc: 0.8111 - val_loss: 0.3866 - val_acc: 0.8377
Epoch 318/400
614/614 [==============================] - 0s 49us/step - loss: 0.3929 - acc: 0.8062 - val_loss: 0.3863 - val_acc: 0.8506
Epoch 319/400
614/614 [==============================] - 0s 49us/step - loss: 0.3943 - acc: 0.8062 - val_loss: 0.3866 - val_acc: 0.8442
Epoch 320/400
614/614 [==============================] - 0s 71us/step - loss: 0.3927 - acc: 0.8029 - val_loss: 0.3848 - val_acc: 0.8442
Epoch 321/400
614/614 [==============================] - 0s 54us/step - loss: 0.3918 - acc: 0.8046 - val_loss: 0.3845 - val_acc: 0.8377
Epoch 322/400
614/614 [==============================] - 0s 42us/step - loss: 0.3907 - acc: 0.8078 - val_loss: 0.3861 - val_acc: 0.8506
Epoch 323/400
614/614 [==============================] - 0s 54us/step - loss: 0.3928 - acc: 0.8078 - val_loss: 0.3829 - val_acc: 0.8506
Epoch 324/400
614/614 [==============================] - 0s 54us/step - loss: 0.3934 - acc: 0.8046 - val_loss: 0.3850 - val_acc: 0.8571
Epoch 325/400
614/614 [==============================] - 0s 57us/step - loss: 0.3913 - acc: 0.8062 - val_loss: 0.3834 - val_acc: 0.8442
Epoch 326/400
614/614 [==============================] - 0s 57us/step - loss: 0.3927 - acc: 0.8046 - val_loss: 0.3870 - val_acc: 0.8247
Epoch 327/400
614/614 [==============================] - 0s 44us/step - loss: 0.3909 - acc: 0.8127 - val_loss: 0.3838 - val_acc: 0.8442
Epoch 328/400
614/614 [==============================] - 0s 49us/step - loss: 0.3922 - acc: 0.8062 - val_loss: 0.3833 - val_acc: 0.8442
Epoch 329/400
614/614 [==============================] - 0s 52us/step - loss: 0.3908 - acc: 0.8094 - val_loss: 0.3822 - val_acc: 0.8636
Epoch 330/400
614/614 [==============================] - 0s 37us/step - loss: 0.3911 - acc: 0.8127 - val_loss: 0.3809 - val_acc: 0.8571
Epoch 331/400
614/614 [==============================] - 0s 55us/step - loss: 0.3920 - acc: 0.8111 - val_loss: 0.3848 - val_acc: 0.8312
Epoch 332/400
614/614 [==============================] - 0s 58us/step - loss: 0.3917 - acc: 0.8029 - val_loss: 0.3896 - val_acc: 0.8117
Epoch 333/400
614/614 [==============================] - 0s 55us/step - loss: 0.3902 - acc: 0.8143 - val_loss: 0.3834 - val_acc: 0.8442
Epoch 334/400
614/614 [==============================] - 0s 49us/step - loss: 0.3917 - acc: 0.8094 - val_loss: 0.3823 - val_acc: 0.8506
Epoch 335/400
614/614 [==============================] - 0s 50us/step - loss: 0.3897 - acc: 0.8127 - val_loss: 0.3848 - val_acc: 0.8571
Epoch 336/400
614/614 [==============================] - 0s 63us/step - loss: 0.3923 - acc: 0.8111 - val_loss: 0.3831 - val_acc: 0.8442
Epoch 337/400
614/614 [==============================] - 0s 52us/step - loss: 0.3905 - acc: 0.8094 - val_loss: 0.3814 - val_acc: 0.8506
Epoch 338/400
614/614 [==============================] - 0s 63us/step - loss: 0.3903 - acc: 0.8111 - val_loss: 0.3815 - val_acc: 0.8442
Epoch 339/400
614/614 [==============================] - 0s 60us/step - loss: 0.3899 - acc: 0.8062 - val_loss: 0.3839 - val_acc: 0.8312
Epoch 340/400
614/614 [==============================] - 0s 67us/step - loss: 0.3891 - acc: 0.8127 - val_loss: 0.3820 - val_acc: 0.8442
Epoch 341/400
614/614 [==============================] - 0s 62us/step - loss: 0.3892 - acc: 0.8078 - val_loss: 0.3847 - val_acc: 0.8377
Epoch 342/400
614/614 [==============================] - 0s 76us/step - loss: 0.3919 - acc: 0.8094 - val_loss: 0.3817 - val_acc: 0.8377
Epoch 343/400
614/614 [==============================] - 0s 62us/step - loss: 0.3898 - acc: 0.8094 - val_loss: 0.3829 - val_acc: 0.8442
Epoch 344/400
614/614 [==============================] - 0s 54us/step - loss: 0.3909 - acc: 0.8143 - val_loss: 0.3837 - val_acc: 0.8377
Epoch 345/400
614/614 [==============================] - 0s 50us/step - loss: 0.3902 - acc: 0.8062 - val_loss: 0.3788 - val_acc: 0.8506
Epoch 346/400
614/614 [==============================] - 0s 52us/step - loss: 0.3877 - acc: 0.8127 - val_loss: 0.3819 - val_acc: 0.8377
Epoch 347/400
614/614 [==============================] - 0s 57us/step - loss: 0.3900 - acc: 0.8143 - val_loss: 0.3819 - val_acc: 0.8701
Epoch 348/400
614/614 [==============================] - 0s 52us/step - loss: 0.3897 - acc: 0.8094 - val_loss: 0.3805 - val_acc: 0.8442
Epoch 349/400
614/614 [==============================] - 0s 63us/step - loss: 0.3900 - acc: 0.8111 - val_loss: 0.3867 - val_acc: 0.8312
Epoch 350/400
614/614 [==============================] - 0s 45us/step - loss: 0.3886 - acc: 0.8094 - val_loss: 0.3822 - val_acc: 0.8442
Epoch 351/400
614/614 [==============================] - 0s 58us/step - loss: 0.3885 - acc: 0.8111 - val_loss: 0.3814 - val_acc: 0.8506
Epoch 352/400
614/614 [==============================] - 0s 47us/step - loss: 0.3884 - acc: 0.8094 - val_loss: 0.3791 - val_acc: 0.8506
Epoch 353/400
614/614 [==============================] - 0s 58us/step - loss: 0.3882 - acc: 0.8046 - val_loss: 0.3817 - val_acc: 0.8636
Epoch 354/400
614/614 [==============================] - 0s 58us/step - loss: 0.3879 - acc: 0.8111 - val_loss: 0.3840 - val_acc: 0.8312
Epoch 355/400
614/614 [==============================] - 0s 49us/step - loss: 0.3889 - acc: 0.8094 - val_loss: 0.3797 - val_acc: 0.8636
Epoch 356/400
614/614 [==============================] - 0s 47us/step - loss: 0.3887 - acc: 0.8094 - val_loss: 0.3809 - val_acc: 0.8506
Epoch 357/400
614/614 [==============================] - 0s 49us/step - loss: 0.3870 - acc: 0.8127 - val_loss: 0.3824 - val_acc: 0.8636
Epoch 358/400
614/614 [==============================] - 0s 47us/step - loss: 0.3880 - acc: 0.8078 - val_loss: 0.3802 - val_acc: 0.8636
Epoch 359/400
614/614 [==============================] - 0s 49us/step - loss: 0.3878 - acc: 0.8143 - val_loss: 0.3905 - val_acc: 0.8117
Epoch 360/400
614/614 [==============================] - 0s 67us/step - loss: 0.3892 - acc: 0.8046 - val_loss: 0.3812 - val_acc: 0.8506
Epoch 361/400
614/614 [==============================] - 0s 54us/step - loss: 0.3885 - acc: 0.8111 - val_loss: 0.3883 - val_acc: 0.8247
Epoch 362/400
614/614 [==============================] - 0s 45us/step - loss: 0.3863 - acc: 0.8078 - val_loss: 0.3794 - val_acc: 0.8636
Epoch 363/400
614/614 [==============================] - 0s 55us/step - loss: 0.3876 - acc: 0.8062 - val_loss: 0.3799 - val_acc: 0.8636
Epoch 364/400
614/614 [==============================] - 0s 52us/step - loss: 0.3872 - acc: 0.8078 - val_loss: 0.3853 - val_acc: 0.8377
Epoch 365/400
614/614 [==============================] - 0s 63us/step - loss: 0.3866 - acc: 0.8143 - val_loss: 0.3815 - val_acc: 0.8377
Epoch 366/400
614/614 [==============================] - 0s 55us/step - loss: 0.3875 - acc: 0.8111 - val_loss: 0.3925 - val_acc: 0.8117
Epoch 367/400
614/614 [==============================] - 0s 54us/step - loss: 0.3884 - acc: 0.8127 - val_loss: 0.3802 - val_acc: 0.8506
Epoch 368/400
614/614 [==============================] - 0s 71us/step - loss: 0.3873 - acc: 0.8094 - val_loss: 0.3796 - val_acc: 0.8636
Epoch 369/400
614/614 [==============================] - 0s 70us/step - loss: 0.3885 - acc: 0.8127 - val_loss: 0.3783 - val_acc: 0.8636
Epoch 370/400
614/614 [==============================] - 0s 63us/step - loss: 0.3879 - acc: 0.8127 - val_loss: 0.3843 - val_acc: 0.8377
Epoch 371/400
614/614 [==============================] - 0s 67us/step - loss: 0.3868 - acc: 0.8062 - val_loss: 0.3809 - val_acc: 0.8636
Epoch 372/400
614/614 [==============================] - 0s 63us/step - loss: 0.3862 - acc: 0.8127 - val_loss: 0.3805 - val_acc: 0.8636
Epoch 373/400
614/614 [==============================] - 0s 54us/step - loss: 0.3874 - acc: 0.8062 - val_loss: 0.3824 - val_acc: 0.8442
Epoch 374/400
614/614 [==============================] - 0s 44us/step - loss: 0.3866 - acc: 0.8111 - val_loss: 0.3816 - val_acc: 0.8377
Epoch 375/400
614/614 [==============================] - 0s 62us/step - loss: 0.3875 - acc: 0.8143 - val_loss: 0.3792 - val_acc: 0.8442
Epoch 376/400
614/614 [==============================] - 0s 67us/step - loss: 0.3852 - acc: 0.8062 - val_loss: 0.3800 - val_acc: 0.8701
Epoch 377/400
614/614 [==============================] - 0s 57us/step - loss: 0.3854 - acc: 0.8078 - val_loss: 0.3818 - val_acc: 0.8506
Epoch 378/400
614/614 [==============================] - 0s 55us/step - loss: 0.3860 - acc: 0.8111 - val_loss: 0.3793 - val_acc: 0.8377
Epoch 379/400
614/614 [==============================] - 0s 57us/step - loss: 0.3854 - acc: 0.8046 - val_loss: 0.3791 - val_acc: 0.8636
Epoch 380/400
614/614 [==============================] - 0s 67us/step - loss: 0.3862 - acc: 0.8111 - val_loss: 0.3784 - val_acc: 0.8701
Epoch 381/400
614/614 [==============================] - 0s 63us/step - loss: 0.3856 - acc: 0.8176 - val_loss: 0.3826 - val_acc: 0.8442
Epoch 382/400
614/614 [==============================] - 0s 67us/step - loss: 0.3869 - acc: 0.8062 - val_loss: 0.3780 - val_acc: 0.8636
Epoch 383/400
614/614 [==============================] - 0s 47us/step - loss: 0.3851 - acc: 0.8127 - val_loss: 0.3777 - val_acc: 0.8506
Epoch 384/400
614/614 [==============================] - 0s 54us/step - loss: 0.3849 - acc: 0.8111 - val_loss: 0.3762 - val_acc: 0.8506
Epoch 385/400
614/614 [==============================] - 0s 52us/step - loss: 0.3859 - acc: 0.8127 - val_loss: 0.3809 - val_acc: 0.8377
Epoch 386/400
614/614 [==============================] - 0s 52us/step - loss: 0.3847 - acc: 0.8078 - val_loss: 0.3787 - val_acc: 0.8442
Epoch 387/400
614/614 [==============================] - 0s 65us/step - loss: 0.3831 - acc: 0.8143 - val_loss: 0.3795 - val_acc: 0.8442
Epoch 388/400
614/614 [==============================] - 0s 55us/step - loss: 0.3846 - acc: 0.8111 - val_loss: 0.3777 - val_acc: 0.8442
Epoch 389/400
614/614 [==============================] - 0s 50us/step - loss: 0.3851 - acc: 0.8094 - val_loss: 0.3811 - val_acc: 0.8506
Epoch 390/400
614/614 [==============================] - 0s 50us/step - loss: 0.3842 - acc: 0.8078 - val_loss: 0.3810 - val_acc: 0.8377
Epoch 391/400
614/614 [==============================] - 0s 62us/step - loss: 0.3859 - acc: 0.8111 - val_loss: 0.3838 - val_acc: 0.8442
Epoch 392/400
614/614 [==============================] - 0s 57us/step - loss: 0.3854 - acc: 0.8127 - val_loss: 0.3820 - val_acc: 0.8442
Epoch 393/400
614/614 [==============================] - 0s 47us/step - loss: 0.3846 - acc: 0.8094 - val_loss: 0.3791 - val_acc: 0.8442
Epoch 394/400
614/614 [==============================] - 0s 52us/step - loss: 0.3851 - acc: 0.8127 - val_loss: 0.3820 - val_acc: 0.8506
Epoch 395/400
614/614 [==============================] - 0s 50us/step - loss: 0.3861 - acc: 0.8094 - val_loss: 0.3812 - val_acc: 0.8442
Epoch 396/400
614/614 [==============================] - 0s 47us/step - loss: 0.3858 - acc: 0.8143 - val_loss: 0.3780 - val_acc: 0.8701
Epoch 397/400
614/614 [==============================] - 0s 49us/step - loss: 0.3839 - acc: 0.8143 - val_loss: 0.3788 - val_acc: 0.8636
Epoch 398/400
614/614 [==============================] - 0s 60us/step - loss: 0.3847 - acc: 0.8143 - val_loss: 0.3765 - val_acc: 0.8506
Epoch 399/400
614/614 [==============================] - 0s 57us/step - loss: 0.3838 - acc: 0.8111 - val_loss: 0.3781 - val_acc: 0.8442
Epoch 400/400
614/614 [==============================] - 0s 52us/step - loss: 0.3841 - acc: 0.8143 - val_loss: 0.3851 - val_acc: 0.8506
Out[29]:
<keras.callbacks.History at 0x1affd942ef0>
In [30]:
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_4 (Dense)              (None, 10)                90        
_________________________________________________________________
dense_5 (Dense)              (None, 10)                110       
_________________________________________________________________
dense_6 (Dense)              (None, 1)                 11        
=================================================================
Total params: 211
Trainable params: 211
Non-trainable params: 0
_________________________________________________________________

As evident from this highly unscientific test, the common wisdom that mean imputation is just as good is not necessarily true. Even with this overkill of a model, KNN imputed data performs significantly better than mean imputed data(0.8701 - epoch 396 vs 0.7987 - epoch 324 in this run)

Summary

Missing data is broadly classified into three categories: MCAR, MAR and MNAR. We show the abysmal performance of mean imputation and median imputation with a toy example. Next, we create an intuitive understanding of KNN imputation and write sample code for its implementation.

Finally, we apply the techniques to Pima Indian Diabetes set and use four different imputation strategies. We show the superiority of KNN imputation technique over other imputation strategies for both logistic regression and neural networks, discrediting a common belief about imputation techniques.

Further Reading