1.1 The Problem with BPTT (Backpropagation Through Time)
As covered last time, BPTT backpropagates through every time step, from the end of the sequence all the way back to the beginning, as shown in the figure below.
When the sequence is long, the backpropagation path grows with it; in effect, a very deep network is formed.
Deep networks are prone to the gradient vanishing & exploding problem.
Deep networks also require a large amount of backpropagation computation, so training takes a long time.
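To see why this matters, here is a toy computation (not from the lecture): when an RNN is unrolled, backpropagating through T time steps multiplies the gradient by roughly the same recurrent weight w at every step, so the gradient shrinks or blows up exponentially in T.

# A weight slightly below 1 makes the gradient vanish over 100 steps,
# and a weight slightly above 1 makes it explode.
for w in [0.9, 1.1]:
    grad = 1.0
    for _ in range(100):
        grad *= w        # each time step contributes a factor of about w
    print(w, grad)       # 0.9 -> ~2.7e-05 (vanishing), 1.1 -> ~1.4e+04 (exploding)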
1.2 The Long-Term Dependency Problem
An RNN at time step t takes the state of the previous time step (t - 1) as input, roughly ht = tanh(Wx xt + Wh h(t-1) + b) --> information from earlier steps can influence the current time step t
Long-term dependency: in theory every earlier time step has an influence, but the early time steps effectively stop contributing as the sequence grows longer
LSTM (Long Short-Term Memory) is an architecture proposed in 1997 that resolves the RNN's long-term dependency problem and speeds up learning
All RNNs have the form of a chain of repeating neural-network modules
LSTM has this same chain structure, but instead of a single neural-network layer, each repeating module contains four
The LSTM process
An LSTM cell carries two state vectors, ht and ct: ct stores long-term memory, and ht stores short-term memory
The core of the LSTM is learning which parts of ct to keep, which to erase, and which to read out
As ct passes through the cell from left to right, it loses some memories at the forget gate and gains some at the input gate
The updated ct is then passed through a tanh function and forms the basis for ht and yt (see the NumPy sketch after the gate descriptions below)
forget gate layer
The step that chooses which information to throw away
Controlled by ft, which decides how much of the long-term state ct to erase
input gate layer
Controlled by it, which decides which parts of gt should be added to the long-term state ct
gt itself has the same form as a basic RNN cell
output gate layer
ot controls which parts of the long-term state ct are read out into ht and yt
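Putting the gates together, one LSTM time step can be sketched in NumPy as below. This is only an illustrative sketch: the weight matrices W['f'], W['i'], W['g'], W['o'] and the biases are hypothetical stand-ins, and Keras' LSTM layer implements all of this internally.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    z = np.concatenate([x_t, h_prev])     # stacked input [xt; h(t-1)]
    f_t = sigmoid(W['f'] @ z + b['f'])    # forget gate: how much of ct to erase
    i_t = sigmoid(W['i'] @ z + b['i'])    # input gate: how much of gt to add
    g_t = np.tanh(W['g'] @ z + b['g'])    # candidate values (same form as a basic RNN cell)
    o_t = sigmoid(W['o'] @ z + b['o'])    # output gate: how much of ct to read out
    c_t = f_t * c_prev + i_t * g_t        # long-term state update
    h_t = o_t * np.tanh(c_t)              # short-term state (also the output yt)
    return h_t, c_t

# smoke test with random weights; each W maps [x; h] -> h
np.random.seed(0)
n_in, n_h = 3, 4
W = {k: np.random.randn(n_h, n_in + n_h) for k in 'figo'}
b = {k: np.zeros(n_h) for k in 'figo'}
h_t, c_t = lstm_step(np.random.randn(n_in), np.zeros(n_h), np.zeros(n_h), W, b)
print(h_t.shape, c_t.shape)   # (4,) (4,)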
The blue boxes are the inputs, and the red boxes are the outputs we want.
Notes 1-4 are used as the data and the 5th note as the label for training.
Next, notes 2-5 are the data and the 6th note is the label.
We then slide forward one note at a time and train through to the end of the song.
code2idx = {'c4':0, 'd4':1, 'e4':2, 'f4':3, 'g4':4, 'a4':5, 'b4':6,
'c8':7, 'd8':8, 'e8':9, 'f8':10, 'g8':11, 'a8':12, 'b8':13}
idx2code = {0:'c4', 1:'d4', 2:'e4', 3:'f4', 4:'g4', 5:'a4', 6:'b4',
7:'c8', 8:'d8', 9:'e8', 10:'f8', 11:'g8', 12:'a8', 13:'b8'}
import numpy as np
def seq2dataset(seq, window_size):
    dataset = []
    for i in range(len(seq) - window_size):
        subset = seq[i:(i + window_size + 1)]
        dataset.append([code2idx[item] for item in subset])
    return np.array(dataset)
seq = ['g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'd8', 'e8', 'f8', 'g8', 'g8', 'g4',
'g8', 'e8', 'e8', 'e8', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4',
'd8', 'd8', 'd8', 'd8', 'd8', 'e8', 'f4', 'e8', 'e8', 'e8', 'e8', 'e8', 'f8', 'g4',
'g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4']
dataset = seq2dataset(seq, window_size = 4)
print(dataset.shape)
print(dataset)
(50, 5)
[[11  9  2 10  8]
 [ 9  2 10  8  1]
 [ 2 10  8  1  7]
 [10  8  1  7  8]
 [ 8  1  7  8  9]
 [ 1  7  8  9 10]
 [ 7  8  9 10 11]
 [ 8  9 10 11 11]
 [ 9 10 11 11  4]
 [10 11 11  4 11]
 [11 11  4 11  9]
 [11  4 11  9  9]
 [ 4 11  9  9  9]
 [11  9  9  9 10]
 [ 9  9  9 10  8]
 [ 9  9 10  8  1]
 [ 9 10  8  1  7]
 [10  8  1  7  9]
 [ 8  1  7  9 11]
 [ 1  7  9 11 11]
 [ 7  9 11 11  9]
 [ 9 11 11  9  9]
 [11 11  9  9  2]
 [11  9  9  2  8]
 [ 9  9  2  8  8]
 [ 9  2  8  8  8]
 [ 2  8  8  8  8]
 [ 8  8  8  8  8]
 [ 8  8  8  8  9]
 [ 8  8  8  9  3]
 [ 8  8  9  3  9]
 [ 8  9  3  9  9]
 [ 9  3  9  9  9]
 [ 3  9  9  9  9]
 [ 9  9  9  9  9]
 [ 9  9  9  9 10]
 [ 9  9  9 10  4]
 [ 9  9 10  4 11]
 [ 9 10  4 11  9]
 [10  4 11  9  2]
 [ 4 11  9  2 10]
 [11  9  2 10  8]
 [ 9  2 10  8  1]
 [ 2 10  8  1  7]
 [10  8  1  7  9]
 [ 8  1  7  9 11]
 [ 1  7  9 11 11]
 [ 7  9 11 11  9]
 [ 9 11 11  9  9]
 [11 11  9  9  2]]
# 0. Import the packages we will use
import keras
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from keras.utils import np_utils

# Fix the random seed
np.random.seed(5)
# Define a loss-history callback class
class LossHistory(keras.callbacks.Callback):
    def init(self):                      # plain init (not __init__); called explicitly before training
        self.losses = []

    def on_epoch_end(self, epoch, logs={}):
        self.losses.append(logs.get('loss'))
# Dataset-building function (same as the one defined above)
def seq2dataset(seq, window_size):
    dataset = []
    for i in range(len(seq) - window_size):
        subset = seq[i:(i + window_size + 1)]   # window_size inputs plus 1 label
        dataset.append([code2idx[item] for item in subset])
    return np.array(dataset)
# 1. Prepare the data
# Define the code dictionaries
code2idx = {'c4':0, 'd4':1, 'e4':2, 'f4':3, 'g4':4, 'a4':5, 'b4':6,
'c8':7, 'd8':8, 'e8':9, 'f8':10, 'g8':11, 'a8':12, 'b8':13}
idx2code = {0:'c4', 1:'d4', 2:'e4', 3:'f4', 4:'g4', 5:'a4', 6:'b4',
7:'c8', 8:'d8', 9:'e8', 10:'f8', 11:'g8', 12:'a8', 13:'b8'}
# Define the sequence data
seq = ['g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'd8', 'e8', 'f8', 'g8', 'g8', 'g4',
'g8', 'e8', 'e8', 'e8', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4',
'd8', 'd8', 'd8', 'd8', 'd8', 'e8', 'f4', 'e8', 'e8', 'e8', 'e8', 'e8', 'f8', 'g4',
'g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4']
# 2. Build the dataset
dataset = seq2dataset(seq, window_size = 4)
print(dataset.shape)
# Split into input (X) and output (Y) variables
x_train = dataset[:,0:4]
y_train = dataset[:,4]
max_idx_value = 13
# Normalize the input values
x_train = x_train / float(max_idx_value)
# Reshape the input to (samples, time steps, features); Keras LSTM layers expect 3-D input
x_train = np.reshape(x_train, (50, 4, 1))
# One-hot encode the labels
y_train = np_utils.to_categorical(y_train)
one_hot_vec_size = y_train.shape[1]
print("one hot encoding vector size is ", one_hot_vec_size)
# 3. Define the model
model = Sequential()
model.add(LSTM(128, input_shape = (4, 1)))
model.add(Dense(one_hot_vec_size, activation='softmax'))
# 4. Configure the training process
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
history = LossHistory() # create the loss-history object
history.init()
# 5. Train the model
model.fit(x_train, y_train, epochs=2000, batch_size=14, verbose=2, callbacks=[history])
# 6. Inspect the training process
%matplotlib inline
import matplotlib.pyplot as plt
plt.plot(history.losses)
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train'], loc='upper left')
plt.show()
# 7. Evaluate the model
scores = model.evaluate(x_train, y_train)
print("%s: %.2f%%" %(model.metrics_names[1], scores[1]*100))
# 8. Use the model
pred_count = 50 # define the maximum number of predictions
# One-step prediction
seq_out = ['g8', 'e8', 'e4', 'f8']
pred_out = model.predict(x_train)
for i in range(pred_count):
    idx = np.argmax(pred_out[i])    # convert the one-hot output back to an index
    seq_out.append(idx2code[idx])   # seq_out is the final score, so convert the index back to a code
print("one step prediction : ", seq_out)
# Full-song prediction
seq_in = ['g8', 'e8', 'e4', 'f8']
seq_out = seq_in
seq_in = [code2idx[it] / float(max_idx_value) for it in seq_in]  # convert the codes to normalized index values

for i in range(pred_count):
    sample_in = np.array(seq_in)
    sample_in = np.reshape(sample_in, (1, 4, 1))  # (samples, time steps, features)
    pred_out = model.predict(sample_in)
    idx = np.argmax(pred_out)
    seq_out.append(idx2code[idx])
    seq_in.append(idx / float(max_idx_value))     # feed the prediction back in as the next input
    seq_in.pop(0)

print("full song prediction : ", seq_out)
Using TensorFlow backend.
(50, 5)
one hot encoding vector size is  12
Epoch 1/2000
 - 1s - loss: 2.4859 - acc: 0.0400
Epoch 2/2000
 - 0s - loss: 2.4483 - acc: 0.2400
Epoch 3/2000
 - 0s - loss: 2.4081 - acc: 0.3400
...
Epoch 1228/2000
 - 0s - loss: 0.4359 - acc: 0.8600
Epoch 1229/2000
 - 0s - loss: 0.4542 - acc: 0.8200
[training log truncated at epoch 1229; over the run the loss falls steadily from about 2.49 to about 0.45 and training accuracy rises from 0.04 to roughly 0.85]
Epoch 1230/2000 - 0s - loss: 0.4768 - acc: 0.8200 Epoch 1231/2000 - 0s - loss: 0.3907 - acc: 0.8800 Epoch 1232/2000 - 0s - loss: 0.3934 - acc: 0.8600 Epoch 1233/2000 - 0s - loss: 0.3723 - acc: 0.8600 Epoch 1234/2000 - 0s - loss: 0.3726 - acc: 0.8600 Epoch 1235/2000 - 0s - loss: 0.3697 - acc: 0.9000 Epoch 1236/2000 - 0s - loss: 0.3997 - acc: 0.8600 Epoch 1237/2000 - 0s - loss: 0.3682 - acc: 0.8600 Epoch 1238/2000 - 0s - loss: 0.3692 - acc: 0.8600 Epoch 1239/2000 - 0s - loss: 0.4054 - acc: 0.8600 Epoch 1240/2000 - 0s - loss: 0.3720 - acc: 0.8600 Epoch 1241/2000 - 0s - loss: 0.3847 - acc: 0.8600 Epoch 1242/2000 - 0s - loss: 0.3970 - acc: 0.8600 Epoch 1243/2000 - 0s - loss: 0.3877 - acc: 0.8600 Epoch 1244/2000 - 0s - loss: 0.3764 - acc: 0.8600 Epoch 1245/2000 - 0s - loss: 0.3611 - acc: 0.8800 Epoch 1246/2000 - 0s - loss: 0.3750 - acc: 0.8600 Epoch 1247/2000 - 0s - loss: 0.3823 - acc: 0.8600 Epoch 1248/2000 - 0s - loss: 0.3700 - acc: 0.8600 Epoch 1249/2000 - 0s - loss: 0.3794 - acc: 0.8400 Epoch 1250/2000 - 0s - loss: 0.3771 - acc: 0.8400 Epoch 1251/2000 - 0s - loss: 0.3643 - acc: 0.8800 Epoch 1252/2000 - 0s - loss: 0.3795 - acc: 0.8600 Epoch 1253/2000 - 0s - loss: 0.3722 - acc: 0.8400 Epoch 1254/2000 - 0s - loss: 0.3642 - acc: 0.8800 Epoch 1255/2000 - 0s - loss: 0.3776 - acc: 0.8600 Epoch 1256/2000 - 0s - loss: 0.3724 - acc: 0.8600 Epoch 1257/2000 - 0s - loss: 0.3773 - acc: 0.8400 Epoch 1258/2000 - 0s - loss: 0.3746 - acc: 0.8800 Epoch 1259/2000 - 0s - loss: 0.3689 - acc: 0.8400 Epoch 1260/2000 - 0s - loss: 0.3786 - acc: 0.8200 Epoch 1261/2000 - 0s - loss: 0.3632 - acc: 0.8600 Epoch 1262/2000 - 0s - loss: 0.3622 - acc: 0.8600 Epoch 1263/2000 - 0s - loss: 0.3605 - acc: 0.8600 Epoch 1264/2000 - 0s - loss: 0.3683 - acc: 0.8400 Epoch 1265/2000 - 0s - loss: 0.3591 - acc: 0.8800 Epoch 1266/2000 - 0s - loss: 0.3635 - acc: 0.8600 Epoch 1267/2000 - 0s - loss: 0.3779 - acc: 0.8600 Epoch 1268/2000 - 0s - loss: 0.3616 - acc: 0.9000 Epoch 1269/2000 - 0s - loss: 0.3668 - acc: 0.8800 Epoch 1270/2000 - 0s - loss: 0.3875 - acc: 0.8400 Epoch 1271/2000 - 0s - loss: 0.3409 - acc: 0.8600 Epoch 1272/2000 - 0s - loss: 0.3906 - acc: 0.8600 Epoch 1273/2000 - 0s - loss: 0.3887 - acc: 0.8400 Epoch 1274/2000 - 0s - loss: 0.3695 - acc: 0.8600 Epoch 1275/2000 - 0s - loss: 0.3649 - acc: 0.8600 Epoch 1276/2000 - 0s - loss: 0.3641 - acc: 0.8800 Epoch 1277/2000 - 0s - loss: 0.3562 - acc: 0.8600 Epoch 1278/2000 - 0s - loss: 0.3697 - acc: 0.8600 Epoch 1279/2000 - 0s - loss: 0.3530 - acc: 0.8400 Epoch 1280/2000 - 0s - loss: 0.3643 - acc: 0.8600 Epoch 1281/2000 - 0s - loss: 0.3594 - acc: 0.8600 Epoch 1282/2000 - 0s - loss: 0.3571 - acc: 0.8400 Epoch 1283/2000 - 0s - loss: 0.3584 - acc: 0.8800 Epoch 1284/2000 - 0s - loss: 0.3604 - acc: 0.8600 Epoch 1285/2000 - 0s - loss: 0.3546 - acc: 0.8600 Epoch 1286/2000 - 0s - loss: 0.3758 - acc: 0.8600 Epoch 1287/2000 - 0s - loss: 0.3522 - acc: 0.8400 Epoch 1288/2000 - 0s - loss: 0.3577 - acc: 0.8600 Epoch 1289/2000 - 0s - loss: 0.3561 - acc: 0.8400 Epoch 1290/2000 - 0s - loss: 0.3544 - acc: 0.8400 Epoch 1291/2000 - 0s - loss: 0.3542 - acc: 0.8400 Epoch 1292/2000 - 0s - loss: 0.3574 - acc: 0.8800 Epoch 1293/2000 - 0s - loss: 0.3531 - acc: 0.8800 Epoch 1294/2000 - 0s - loss: 0.3519 - acc: 0.8600 Epoch 1295/2000 - 0s - loss: 0.4034 - acc: 0.8400 Epoch 1296/2000 - 0s - loss: 0.3723 - acc: 0.8600 Epoch 1297/2000 - 0s - loss: 0.3773 - acc: 0.8600 Epoch 1298/2000 - 0s - loss: 0.3626 - acc: 0.8600 Epoch 1299/2000 - 0s - loss: 0.3913 - acc: 0.8600 Epoch 1300/2000 - 0s - loss: 0.3531 - acc: 0.8800 
Epoch 1301/2000 - 0s - loss: 0.3650 - acc: 0.8600 Epoch 1302/2000 - 0s - loss: 0.3564 - acc: 0.8600 Epoch 1303/2000 - 0s - loss: 0.3546 - acc: 0.8600 Epoch 1304/2000 - 0s - loss: 0.3460 - acc: 0.8800 Epoch 1305/2000 - 0s - loss: 0.3553 - acc: 0.8600 Epoch 1306/2000 - 0s - loss: 0.3651 - acc: 0.8400 Epoch 1307/2000 - 0s - loss: 0.3518 - acc: 0.8600 Epoch 1308/2000 - 0s - loss: 0.3687 - acc: 0.8600 Epoch 1309/2000 - 0s - loss: 0.3534 - acc: 0.8600 Epoch 1310/2000 - 0s - loss: 0.3453 - acc: 0.8800 Epoch 1311/2000 - 0s - loss: 0.3595 - acc: 0.8800 Epoch 1312/2000 - 0s - loss: 0.3505 - acc: 0.8800 Epoch 1313/2000 - 0s - loss: 0.3655 - acc: 0.8400 Epoch 1314/2000 - 0s - loss: 0.3483 - acc: 0.8600 Epoch 1315/2000 - 0s - loss: 0.3600 - acc: 0.8600 Epoch 1316/2000 - 0s - loss: 0.3435 - acc: 0.8800 Epoch 1317/2000 - 0s - loss: 0.3455 - acc: 0.8800 Epoch 1318/2000 - 0s - loss: 0.3530 - acc: 0.8600 Epoch 1319/2000 - 0s - loss: 0.3591 - acc: 0.8600 Epoch 1320/2000 - 0s - loss: 0.3410 - acc: 0.8600 Epoch 1321/2000 - 0s - loss: 0.3669 - acc: 0.8400 Epoch 1322/2000 - 0s - loss: 0.3460 - acc: 0.8800 Epoch 1323/2000 - 0s - loss: 0.3631 - acc: 0.8800 Epoch 1324/2000 - 0s - loss: 0.3669 - acc: 0.8600 Epoch 1325/2000 - 0s - loss: 0.3367 - acc: 0.9000 Epoch 1326/2000 - 0s - loss: 0.3635 - acc: 0.8800 Epoch 1327/2000 - 0s - loss: 0.3469 - acc: 0.8600 Epoch 1328/2000 - 0s - loss: 0.3525 - acc: 0.8600 Epoch 1329/2000 - 0s - loss: 0.3566 - acc: 0.8800 Epoch 1330/2000 - 0s - loss: 0.3464 - acc: 0.8600 Epoch 1331/2000 - 0s - loss: 0.3527 - acc: 0.8600 Epoch 1332/2000 - 0s - loss: 0.3507 - acc: 0.8600 Epoch 1333/2000 - 0s - loss: 0.3656 - acc: 0.8600 Epoch 1334/2000 - 0s - loss: 0.3925 - acc: 0.8600 Epoch 1335/2000 - 0s - loss: 0.4435 - acc: 0.8400 Epoch 1336/2000 - 0s - loss: 0.3692 - acc: 0.8600 Epoch 1337/2000 - 0s - loss: 0.3849 - acc: 0.8400 Epoch 1338/2000 - 0s - loss: 0.3892 - acc: 0.8800 Epoch 1339/2000 - 0s - loss: 0.3513 - acc: 0.8600 Epoch 1340/2000 - 0s - loss: 0.3516 - acc: 0.8800 Epoch 1341/2000 - 0s - loss: 0.3641 - acc: 0.8600 Epoch 1342/2000 - 0s - loss: 0.3569 - acc: 0.8600 Epoch 1343/2000 - 0s - loss: 0.3486 - acc: 0.8600 Epoch 1344/2000 - 0s - loss: 0.3745 - acc: 0.8600 Epoch 1345/2000 - 0s - loss: 0.3595 - acc: 0.8800 Epoch 1346/2000 - 0s - loss: 0.3963 - acc: 0.8400 Epoch 1347/2000 - 0s - loss: 0.3363 - acc: 0.8600 Epoch 1348/2000 - 0s - loss: 0.4110 - acc: 0.8200 Epoch 1349/2000 - 0s - loss: 0.3776 - acc: 0.8800 Epoch 1350/2000 - 0s - loss: 0.3783 - acc: 0.8400 Epoch 1351/2000 - 0s - loss: 0.3732 - acc: 0.8200 Epoch 1352/2000 - 0s - loss: 0.3348 - acc: 0.8800 Epoch 1353/2000 - 0s - loss: 0.3537 - acc: 0.8800 Epoch 1354/2000 - 0s - loss: 0.3441 - acc: 0.8800 Epoch 1355/2000 - 0s - loss: 0.3482 - acc: 0.8400 Epoch 1356/2000 - 0s - loss: 0.3359 - acc: 0.8600 Epoch 1357/2000 - 0s - loss: 0.3667 - acc: 0.8800 Epoch 1358/2000 - 0s - loss: 0.3254 - acc: 0.9000 Epoch 1359/2000 - 0s - loss: 0.3535 - acc: 0.8400 Epoch 1360/2000 - 0s - loss: 0.3491 - acc: 0.8600 Epoch 1361/2000 - 0s - loss: 0.3399 - acc: 0.8600 Epoch 1362/2000 - 0s - loss: 0.3585 - acc: 0.8600 Epoch 1363/2000 - 0s - loss: 0.3517 - acc: 0.8400 Epoch 1364/2000 - 0s - loss: 0.3493 - acc: 0.8800 Epoch 1365/2000 - 0s - loss: 0.3500 - acc: 0.8600 Epoch 1366/2000 - 0s - loss: 0.3432 - acc: 0.8400 Epoch 1367/2000 - 0s - loss: 0.3362 - acc: 0.8600 Epoch 1368/2000 - 0s - loss: 0.3351 - acc: 0.8600 Epoch 1369/2000 - 0s - loss: 0.3395 - acc: 0.8400 Epoch 1370/2000 - 0s - loss: 0.3403 - acc: 0.8400 Epoch 1371/2000 - 0s - loss: 0.3400 - acc: 0.8600 
Epoch 1372/2000 - 0s - loss: 0.3436 - acc: 0.8800 Epoch 1373/2000 - 0s - loss: 0.3365 - acc: 0.8600 Epoch 1374/2000 - 0s - loss: 0.3518 - acc: 0.8800 Epoch 1375/2000 - 0s - loss: 0.3519 - acc: 0.8800 Epoch 1376/2000 - 0s - loss: 0.3329 - acc: 0.8600 Epoch 1377/2000 - 0s - loss: 0.3503 - acc: 0.8800 Epoch 1378/2000 - 0s - loss: 0.3432 - acc: 0.8800 Epoch 1379/2000 - 0s - loss: 0.3462 - acc: 0.8800 Epoch 1380/2000 - 0s - loss: 0.3395 - acc: 0.8400 Epoch 1381/2000 - 0s - loss: 0.3381 - acc: 0.8400 Epoch 1382/2000 - 0s - loss: 0.3455 - acc: 0.9000 Epoch 1383/2000 - 0s - loss: 0.3426 - acc: 0.8600 Epoch 1384/2000 - 0s - loss: 0.3444 - acc: 0.8800 Epoch 1385/2000 - 0s - loss: 0.3474 - acc: 0.8600 Epoch 1386/2000 - 0s - loss: 0.3334 - acc: 0.8800 Epoch 1387/2000 - 0s - loss: 0.3503 - acc: 0.8600 Epoch 1388/2000 - 0s - loss: 0.3379 - acc: 0.8600 Epoch 1389/2000 - 0s - loss: 0.3469 - acc: 0.8000 Epoch 1390/2000 - 0s - loss: 0.3467 - acc: 0.8400 Epoch 1391/2000 - 0s - loss: 0.3388 - acc: 0.8600 Epoch 1392/2000 - 0s - loss: 0.3383 - acc: 0.8400 Epoch 1393/2000 - 0s - loss: 0.3489 - acc: 0.8400 Epoch 1394/2000 - 0s - loss: 0.3368 - acc: 0.8400 Epoch 1395/2000 - 0s - loss: 0.3344 - acc: 0.8800 Epoch 1396/2000 - 0s - loss: 0.3421 - acc: 0.8800 Epoch 1397/2000 - 0s - loss: 0.3380 - acc: 0.8400 Epoch 1398/2000 - 0s - loss: 0.3307 - acc: 0.8800 Epoch 1399/2000 - 0s - loss: 0.3348 - acc: 0.8800 Epoch 1400/2000 - 0s - loss: 0.3489 - acc: 0.8600 Epoch 1401/2000 - 0s - loss: 0.3399 - acc: 0.8800 Epoch 1402/2000 - 0s - loss: 0.3415 - acc: 0.8600 Epoch 1403/2000 - 0s - loss: 0.3375 - acc: 0.8600 Epoch 1404/2000 - 0s - loss: 0.3436 - acc: 0.8400 Epoch 1405/2000 - 0s - loss: 0.3615 - acc: 0.8800 Epoch 1406/2000 - 0s - loss: 0.3322 - acc: 0.8800 Epoch 1407/2000 - 0s - loss: 0.3429 - acc: 0.8600 Epoch 1408/2000 - 0s - loss: 0.3450 - acc: 0.8600 Epoch 1409/2000 - 0s - loss: 0.3208 - acc: 0.8800 Epoch 1410/2000 - 0s - loss: 0.3514 - acc: 0.8800 Epoch 1411/2000 - 0s - loss: 0.3579 - acc: 0.8400 Epoch 1412/2000 - 0s - loss: 0.3533 - acc: 0.8600 Epoch 1413/2000 - 0s - loss: 0.3440 - acc: 0.8800 Epoch 1414/2000 - 0s - loss: 0.3235 - acc: 0.8600 Epoch 1415/2000 - 0s - loss: 0.3737 - acc: 0.8600 Epoch 1416/2000 - 0s - loss: 0.3908 - acc: 0.8400 Epoch 1417/2000 - 0s - loss: 0.3493 - acc: 0.8400 Epoch 1418/2000 - 0s - loss: 0.3701 - acc: 0.8800 Epoch 1419/2000 - 0s - loss: 0.3682 - acc: 0.8800 Epoch 1420/2000 - 0s - loss: 0.3843 - acc: 0.8600 Epoch 1421/2000 - 0s - loss: 0.3284 - acc: 0.8800 Epoch 1422/2000 - 0s - loss: 0.3441 - acc: 0.8800 Epoch 1423/2000 - 0s - loss: 0.3513 - acc: 0.8600 Epoch 1424/2000 - 0s - loss: 0.3236 - acc: 0.8800 Epoch 1425/2000 - 0s - loss: 0.3445 - acc: 0.8600 Epoch 1426/2000 - 0s - loss: 0.3418 - acc: 0.8400 Epoch 1427/2000 - 0s - loss: 0.3529 - acc: 0.8800 Epoch 1428/2000 - 0s - loss: 0.3363 - acc: 0.8600 Epoch 1429/2000 - 0s - loss: 0.3293 - acc: 0.8600 Epoch 1430/2000 - 0s - loss: 0.3348 - acc: 0.8400 Epoch 1431/2000 - 0s - loss: 0.3264 - acc: 0.8600 Epoch 1432/2000 - 0s - loss: 0.3280 - acc: 0.9000 Epoch 1433/2000 - 0s - loss: 0.3442 - acc: 0.8600 Epoch 1434/2000 - 0s - loss: 0.3582 - acc: 0.8400 Epoch 1435/2000 - 0s - loss: 0.3445 - acc: 0.8400 Epoch 1436/2000 - 0s - loss: 0.3476 - acc: 0.8800 Epoch 1437/2000 - 0s - loss: 0.3529 - acc: 0.8200 Epoch 1438/2000 - 0s - loss: 0.3290 - acc: 0.8200 Epoch 1439/2000 - 0s - loss: 0.3345 - acc: 0.8800 Epoch 1440/2000 - 0s - loss: 0.3291 - acc: 0.8600 Epoch 1441/2000 - 0s - loss: 0.3233 - acc: 0.8600 Epoch 1442/2000 - 0s - loss: 0.3311 - acc: 0.8600 
Epoch 1443/2000 - 0s - loss: 0.3331 - acc: 0.8800 Epoch 1444/2000 - 0s - loss: 0.3227 - acc: 0.8800 Epoch 1445/2000 - 0s - loss: 0.3256 - acc: 0.8800 Epoch 1446/2000 - 0s - loss: 0.3223 - acc: 0.8600 Epoch 1447/2000 - 0s - loss: 0.3271 - acc: 0.8800 Epoch 1448/2000 - 0s - loss: 0.3267 - acc: 0.8600 Epoch 1449/2000 - 0s - loss: 0.3232 - acc: 0.8800 Epoch 1450/2000 - 0s - loss: 0.3393 - acc: 0.8600 Epoch 1451/2000 - 0s - loss: 0.3713 - acc: 0.8200 Epoch 1452/2000 - 0s - loss: 0.3444 - acc: 0.8600 Epoch 1453/2000 - 0s - loss: 0.3646 - acc: 0.8800 Epoch 1454/2000 - 0s - loss: 0.3309 - acc: 0.8600 Epoch 1455/2000 - 0s - loss: 0.3442 - acc: 0.8600 Epoch 1456/2000 - 0s - loss: 0.3307 - acc: 0.8800 Epoch 1457/2000 - 0s - loss: 0.3355 - acc: 0.8600 Epoch 1458/2000 - 0s - loss: 0.3194 - acc: 0.8600 Epoch 1459/2000 - 0s - loss: 0.3261 - acc: 0.8600 Epoch 1460/2000 - 0s - loss: 0.3312 - acc: 0.8800 Epoch 1461/2000 - 0s - loss: 0.3183 - acc: 0.8800 Epoch 1462/2000 - 0s - loss: 0.3456 - acc: 0.8600 Epoch 1463/2000 - 0s - loss: 0.3241 - acc: 0.9200 Epoch 1464/2000 - 0s - loss: 0.3437 - acc: 0.8600 Epoch 1465/2000 - 0s - loss: 0.3185 - acc: 0.8800 Epoch 1466/2000 - 0s - loss: 0.3422 - acc: 0.8600 Epoch 1467/2000 - 0s - loss: 0.3499 - acc: 0.8800 Epoch 1468/2000 - 0s - loss: 0.3297 - acc: 0.8600 Epoch 1469/2000 - 0s - loss: 0.3893 - acc: 0.8600 Epoch 1470/2000 - 0s - loss: 0.3395 - acc: 0.8800 Epoch 1471/2000 - 0s - loss: 0.3731 - acc: 0.8600 Epoch 1472/2000 - 0s - loss: 0.3669 - acc: 0.8200 Epoch 1473/2000 - 0s - loss: 0.3458 - acc: 0.9000 Epoch 1474/2000 - 0s - loss: 0.3333 - acc: 0.8800 Epoch 1475/2000 - 0s - loss: 0.3382 - acc: 0.8800 Epoch 1476/2000 - 0s - loss: 0.3369 - acc: 0.8400 Epoch 1477/2000 - 0s - loss: 0.3338 - acc: 0.8600 Epoch 1478/2000 - 0s - loss: 0.3411 - acc: 0.8600 Epoch 1479/2000 - 0s - loss: 0.3293 - acc: 0.8400 Epoch 1480/2000 - 0s - loss: 0.3257 - acc: 0.8400 Epoch 1481/2000 - 0s - loss: 0.3187 - acc: 0.8800 Epoch 1482/2000 - 0s - loss: 0.3223 - acc: 0.8600 Epoch 1483/2000 - 0s - loss: 0.3233 - acc: 0.8600 Epoch 1484/2000 - 0s - loss: 0.3207 - acc: 0.8600 Epoch 1485/2000 - 0s - loss: 0.3186 - acc: 0.8600 Epoch 1486/2000 - 0s - loss: 0.3268 - acc: 0.8600 Epoch 1487/2000 - 0s - loss: 0.3236 - acc: 0.8600 Epoch 1488/2000 - 0s - loss: 0.3250 - acc: 0.8600 Epoch 1489/2000 - 0s - loss: 0.3424 - acc: 0.8600 Epoch 1490/2000 - 0s - loss: 0.3301 - acc: 0.8800 Epoch 1491/2000 - 0s - loss: 0.3263 - acc: 0.8400 Epoch 1492/2000 - 0s - loss: 0.3334 - acc: 0.8600 Epoch 1493/2000 - 0s - loss: 0.3343 - acc: 0.8600 Epoch 1494/2000 - 0s - loss: 0.3377 - acc: 0.8400 Epoch 1495/2000 - 0s - loss: 0.3123 - acc: 0.8800 Epoch 1496/2000 - 0s - loss: 0.3558 - acc: 0.8600 Epoch 1497/2000 - 0s - loss: 0.3068 - acc: 0.9200 Epoch 1498/2000 - 0s - loss: 0.4564 - acc: 0.7600 Epoch 1499/2000 - 0s - loss: 0.3474 - acc: 0.8800 Epoch 1500/2000 - 0s - loss: 0.3741 - acc: 0.8600 Epoch 1501/2000 - 0s - loss: 0.3419 - acc: 0.8600 Epoch 1502/2000 - 0s - loss: 0.3307 - acc: 0.8600 Epoch 1503/2000 - 0s - loss: 0.3456 - acc: 0.8800 Epoch 1504/2000 - 0s - loss: 0.3362 - acc: 0.8600 Epoch 1505/2000 - 0s - loss: 0.3292 - acc: 0.8600 Epoch 1506/2000 - 0s - loss: 0.3272 - acc: 0.8800 Epoch 1507/2000 - 0s - loss: 0.3388 - acc: 0.8600 Epoch 1508/2000 - 0s - loss: 0.3239 - acc: 0.8600 Epoch 1509/2000 - 0s - loss: 0.3155 - acc: 0.9000 Epoch 1510/2000 - 0s - loss: 0.3321 - acc: 0.8600 Epoch 1511/2000 - 0s - loss: 0.3254 - acc: 0.8800 Epoch 1512/2000 - 0s - loss: 0.3292 - acc: 0.9000 Epoch 1513/2000 - 0s - loss: 0.3301 - acc: 0.8800 
Epoch 1514/2000 - 0s - loss: 0.3355 - acc: 0.8600 Epoch 1515/2000 - 0s - loss: 0.3397 - acc: 0.8600 Epoch 1516/2000 - 0s - loss: 0.3188 - acc: 0.8800 Epoch 1517/2000 - 0s - loss: 0.3114 - acc: 0.8800 Epoch 1518/2000 - 0s - loss: 0.3264 - acc: 0.8600 Epoch 1519/2000 - 0s - loss: 0.3184 - acc: 0.8600 Epoch 1520/2000 - 0s - loss: 0.3145 - acc: 0.8600 Epoch 1521/2000 - 0s - loss: 0.3189 - acc: 0.8800 Epoch 1522/2000 - 0s - loss: 0.3121 - acc: 0.8600 Epoch 1523/2000 - 0s - loss: 0.3133 - acc: 0.8600 Epoch 1524/2000 - 0s - loss: 0.3153 - acc: 0.8600 Epoch 1525/2000 - 0s - loss: 0.3190 - acc: 0.8400 Epoch 1526/2000 - 0s - loss: 0.3099 - acc: 0.9000 Epoch 1527/2000 - 0s - loss: 0.3197 - acc: 0.8800 Epoch 1528/2000 - 0s - loss: 0.3217 - acc: 0.8400 Epoch 1529/2000 - 0s - loss: 0.3135 - acc: 0.8600 Epoch 1530/2000 - 0s - loss: 0.3257 - acc: 0.8800 Epoch 1531/2000 - 0s - loss: 0.3183 - acc: 0.8800 Epoch 1532/2000 - 0s - loss: 0.3194 - acc: 0.8600 Epoch 1533/2000 - 0s - loss: 0.3287 - acc: 0.8400 Epoch 1534/2000 - 0s - loss: 0.3198 - acc: 0.8800 Epoch 1535/2000 - 0s - loss: 0.3161 - acc: 0.8800 Epoch 1536/2000 - 0s - loss: 0.3104 - acc: 0.8600 Epoch 1537/2000 - 0s - loss: 0.3142 - acc: 0.8600 Epoch 1538/2000 - 0s - loss: 0.3165 - acc: 0.8600 Epoch 1539/2000 - 0s - loss: 0.3180 - acc: 0.8800 Epoch 1540/2000 - 0s - loss: 0.3185 - acc: 0.8400 Epoch 1541/2000 - 0s - loss: 0.3145 - acc: 0.9000 Epoch 1542/2000 - 0s - loss: 0.3111 - acc: 0.8600 Epoch 1543/2000 - 0s - loss: 0.3114 - acc: 0.8400 Epoch 1544/2000 - 0s - loss: 0.3201 - acc: 0.8600 Epoch 1545/2000 - 0s - loss: 0.3105 - acc: 0.8800 Epoch 1546/2000 - 0s - loss: 0.3477 - acc: 0.8600 Epoch 1547/2000 - 0s - loss: 0.3065 - acc: 0.8800 Epoch 1548/2000 - 0s - loss: 0.3624 - acc: 0.8600 Epoch 1549/2000 - 0s - loss: 0.3261 - acc: 0.8200 Epoch 1550/2000 - 0s - loss: 0.3174 - acc: 0.9000 Epoch 1551/2000 - 0s - loss: 0.3209 - acc: 0.8600 Epoch 1552/2000 - 0s - loss: 0.3156 - acc: 0.8600 Epoch 1553/2000 - 0s - loss: 0.3232 - acc: 0.8800 Epoch 1554/2000 - 0s - loss: 0.3022 - acc: 0.8800 Epoch 1555/2000 - 0s - loss: 0.3217 - acc: 0.8800 Epoch 1556/2000 - 0s - loss: 0.3141 - acc: 0.8800 Epoch 1557/2000 - 0s - loss: 0.3135 - acc: 0.8800 Epoch 1558/2000 - 0s - loss: 0.3084 - acc: 0.8800 Epoch 1559/2000 - 0s - loss: 0.3148 - acc: 0.9000 Epoch 1560/2000 - 0s - loss: 0.3151 - acc: 0.8600 Epoch 1561/2000 - 0s - loss: 0.3111 - acc: 0.8800 Epoch 1562/2000 - 0s - loss: 0.3125 - acc: 0.8800 Epoch 1563/2000 - 0s - loss: 0.3078 - acc: 0.8600 Epoch 1564/2000 - 0s - loss: 0.3172 - acc: 0.8800 Epoch 1565/2000 - 0s - loss: 0.3084 - acc: 0.8800 Epoch 1566/2000 - 0s - loss: 0.3055 - acc: 0.8800 Epoch 1567/2000 - 0s - loss: 0.3236 - acc: 0.8800 Epoch 1568/2000 - 0s - loss: 0.3189 - acc: 0.8600 Epoch 1569/2000 - 0s - loss: 0.3235 - acc: 0.8600 Epoch 1570/2000 - 0s - loss: 0.3082 - acc: 0.8600 Epoch 1571/2000 - 0s - loss: 0.3184 - acc: 0.8600 Epoch 1572/2000 - 0s - loss: 0.3387 - acc: 0.8200 Epoch 1573/2000 - 0s - loss: 0.3104 - acc: 0.8800 Epoch 1574/2000 - 0s - loss: 0.3138 - acc: 0.8800 Epoch 1575/2000 - 0s - loss: 0.3132 - acc: 0.8600 Epoch 1576/2000 - 0s - loss: 0.3277 - acc: 0.8600 Epoch 1577/2000 - 0s - loss: 0.3048 - acc: 0.8800 Epoch 1578/2000 - 0s - loss: 0.3253 - acc: 0.8600 Epoch 1579/2000 - 0s - loss: 0.3237 - acc: 0.8600 Epoch 1580/2000 - 0s - loss: 0.3300 - acc: 0.8400 Epoch 1581/2000 - 0s - loss: 0.3258 - acc: 0.8400 Epoch 1582/2000 - 0s - loss: 0.3325 - acc: 0.8600 Epoch 1583/2000 - 0s - loss: 0.3113 - acc: 0.8600 Epoch 1584/2000 - 0s - loss: 0.3490 - acc: 0.8600 
Epoch 1585/2000 - 0s - loss: 0.3269 - acc: 0.8600 Epoch 1586/2000 - 0s - loss: 0.3208 - acc: 0.8600 Epoch 1587/2000 - 0s - loss: 0.3116 - acc: 0.9000 Epoch 1588/2000 - 0s - loss: 0.3088 - acc: 0.8800 Epoch 1589/2000 - 0s - loss: 0.3090 - acc: 0.8800 Epoch 1590/2000 - 0s - loss: 0.3060 - acc: 0.8600 Epoch 1591/2000 - 0s - loss: 0.3054 - acc: 0.9000 Epoch 1592/2000 - 0s - loss: 0.3291 - acc: 0.8800 Epoch 1593/2000 - 0s - loss: 0.2922 - acc: 0.8800 Epoch 1594/2000 - 0s - loss: 0.3320 - acc: 0.8800 Epoch 1595/2000 - 0s - loss: 0.3142 - acc: 0.8800 Epoch 1596/2000 - 0s - loss: 0.3115 - acc: 0.8600 Epoch 1597/2000 - 0s - loss: 0.3341 - acc: 0.8600 Epoch 1598/2000 - 0s - loss: 0.3169 - acc: 0.8800 Epoch 1599/2000 - 0s - loss: 0.3073 - acc: 0.8800 Epoch 1600/2000 - 0s - loss: 0.3157 - acc: 0.8600 Epoch 1601/2000 - 0s - loss: 0.3096 - acc: 0.8400 Epoch 1602/2000 - 0s - loss: 0.2993 - acc: 0.8600 Epoch 1603/2000 - 0s - loss: 0.3113 - acc: 0.8800 Epoch 1604/2000 - 0s - loss: 0.3022 - acc: 0.8800 Epoch 1605/2000 - 0s - loss: 0.3108 - acc: 0.8800 Epoch 1606/2000 - 0s - loss: 0.3284 - acc: 0.9000 Epoch 1607/2000 - 0s - loss: 0.3254 - acc: 0.8800 Epoch 1608/2000 - 0s - loss: 0.3196 - acc: 0.8600 Epoch 1609/2000 - 0s - loss: 0.3116 - acc: 0.8600 Epoch 1610/2000 - 0s - loss: 0.3253 - acc: 0.8400 Epoch 1611/2000 - 0s - loss: 0.3157 - acc: 0.8400 Epoch 1612/2000 - 0s - loss: 0.3200 - acc: 0.9000 Epoch 1613/2000 - 0s - loss: 0.3208 - acc: 0.8800 Epoch 1614/2000 - 0s - loss: 0.3177 - acc: 0.8600 Epoch 1615/2000 - 0s - loss: 0.3317 - acc: 0.8800 Epoch 1616/2000 - 0s - loss: 0.3196 - acc: 0.8800 Epoch 1617/2000 - 0s - loss: 0.3225 - acc: 0.8800 Epoch 1618/2000 - 0s - loss: 0.3154 - acc: 0.8600 Epoch 1619/2000 - 0s - loss: 0.3089 - acc: 0.8800 Epoch 1620/2000 - 0s - loss: 0.3092 - acc: 0.8600 Epoch 1621/2000 - 0s - loss: 0.3115 - acc: 0.8800 Epoch 1622/2000 - 0s - loss: 0.3021 - acc: 0.8600 Epoch 1623/2000 - 0s - loss: 0.3044 - acc: 0.8400 Epoch 1624/2000 - 0s - loss: 0.3092 - acc: 0.8400 Epoch 1625/2000 - 0s - loss: 0.3064 - acc: 0.9000 Epoch 1626/2000 - 0s - loss: 0.3078 - acc: 0.8800 Epoch 1627/2000 - 0s - loss: 0.2989 - acc: 0.8800 Epoch 1628/2000 - 0s - loss: 0.3068 - acc: 0.8800 Epoch 1629/2000 - 0s - loss: 0.3024 - acc: 0.8600 Epoch 1630/2000 - 0s - loss: 0.3163 - acc: 0.8800 Epoch 1631/2000 - 0s - loss: 0.3112 - acc: 0.8800 Epoch 1632/2000 - 0s - loss: 0.3108 - acc: 0.8600 Epoch 1633/2000 - 0s - loss: 0.3037 - acc: 0.8400 Epoch 1634/2000 - 0s - loss: 0.3316 - acc: 0.8600 Epoch 1635/2000 - 0s - loss: 0.2877 - acc: 0.9000 Epoch 1636/2000 - 0s - loss: 0.3264 - acc: 0.8400 Epoch 1637/2000 - 0s - loss: 0.3005 - acc: 0.9000 Epoch 1638/2000 - 0s - loss: 0.3234 - acc: 0.8800 Epoch 1639/2000 - 0s - loss: 0.2893 - acc: 0.8800 Epoch 1640/2000 - 0s - loss: 0.3232 - acc: 0.8600 Epoch 1641/2000 - 0s - loss: 0.3361 - acc: 0.8600 Epoch 1642/2000 - 0s - loss: 0.3075 - acc: 0.8600 Epoch 1643/2000 - 0s - loss: 0.3262 - acc: 0.9200 Epoch 1644/2000 - 0s - loss: 0.3263 - acc: 0.8800 Epoch 1645/2000 - 0s - loss: 0.3304 - acc: 0.8600 Epoch 1646/2000 - 0s - loss: 0.3203 - acc: 0.8800 Epoch 1647/2000 - 0s - loss: 0.3203 - acc: 0.8600 Epoch 1648/2000 - 0s - loss: 0.3217 - acc: 0.8400 Epoch 1649/2000 - 0s - loss: 0.3106 - acc: 0.8600 Epoch 1650/2000 - 0s - loss: 0.3118 - acc: 0.8800 Epoch 1651/2000 - 0s - loss: 0.3201 - acc: 0.8800 Epoch 1652/2000 - 0s - loss: 0.3363 - acc: 0.8800 Epoch 1653/2000 - 0s - loss: 0.3224 - acc: 0.8400 Epoch 1654/2000 - 0s - loss: 0.3874 - acc: 0.8400 Epoch 1655/2000 - 0s - loss: 0.3190 - acc: 0.8200 
Epoch 1656/2000 - 0s - loss: 0.3123 - acc: 0.8400 Epoch 1657/2000 - 0s - loss: 0.3050 - acc: 0.8600 Epoch 1658/2000 - 0s - loss: 0.3057 - acc: 0.8600 Epoch 1659/2000 - 0s - loss: 0.2977 - acc: 0.8800 Epoch 1660/2000 - 0s - loss: 0.3052 - acc: 0.8800 Epoch 1661/2000 - 0s - loss: 0.3053 - acc: 0.8800 Epoch 1662/2000 - 0s - loss: 0.3170 - acc: 0.8600 Epoch 1663/2000 - 0s - loss: 0.3116 - acc: 0.8400 Epoch 1664/2000 - 0s - loss: 0.3084 - acc: 0.8600 Epoch 1665/2000 - 0s - loss: 0.3033 - acc: 0.8600 Epoch 1666/2000 - 0s - loss: 0.3073 - acc: 0.8200 Epoch 1667/2000 - 0s - loss: 0.3058 - acc: 0.8400 Epoch 1668/2000 - 0s - loss: 0.2972 - acc: 0.8800 Epoch 1669/2000 - 0s - loss: 0.2884 - acc: 0.9200 Epoch 1670/2000 - 0s - loss: 0.3016 - acc: 0.8600 Epoch 1671/2000 - 0s - loss: 0.3076 - acc: 0.8600 Epoch 1672/2000 - 0s - loss: 0.3027 - acc: 0.8400 Epoch 1673/2000 - 0s - loss: 0.2981 - acc: 0.8800 Epoch 1674/2000 - 0s - loss: 0.3054 - acc: 0.8600 Epoch 1675/2000 - 0s - loss: 0.3103 - acc: 0.8800 Epoch 1676/2000 - 0s - loss: 0.3291 - acc: 0.8600 Epoch 1677/2000 - 0s - loss: 0.3038 - acc: 0.8800 Epoch 1678/2000 - 0s - loss: 0.3260 - acc: 0.9000 Epoch 1679/2000 - 0s - loss: 0.3012 - acc: 0.8800 Epoch 1680/2000 - 0s - loss: 0.3522 - acc: 0.8600 Epoch 1681/2000 - 0s - loss: 0.2928 - acc: 0.8800 Epoch 1682/2000 - 0s - loss: 0.3229 - acc: 0.8600 Epoch 1683/2000 - 0s - loss: 0.3033 - acc: 0.9000 Epoch 1684/2000 - 0s - loss: 0.3105 - acc: 0.8800 Epoch 1685/2000 - 0s - loss: 0.3043 - acc: 0.8800 Epoch 1686/2000 - 0s - loss: 0.2995 - acc: 0.8600 Epoch 1687/2000 - 0s - loss: 0.3197 - acc: 0.8200 Epoch 1688/2000 - 0s - loss: 0.3251 - acc: 0.8800 Epoch 1689/2000 - 0s - loss: 0.2893 - acc: 0.8800 Epoch 1690/2000 - 0s - loss: 0.3074 - acc: 0.8800 Epoch 1691/2000 - 0s - loss: 0.3005 - acc: 0.8400 Epoch 1692/2000 - 0s - loss: 0.2981 - acc: 0.8800 Epoch 1693/2000 - 0s - loss: 0.3011 - acc: 0.8400 Epoch 1694/2000 - 0s - loss: 0.3025 - acc: 0.8200 Epoch 1695/2000 - 0s - loss: 0.3024 - acc: 0.8600 Epoch 1696/2000 - 0s - loss: 0.2944 - acc: 0.9000 Epoch 1697/2000 - 0s - loss: 0.2933 - acc: 0.8800 Epoch 1698/2000 - 0s - loss: 0.2997 - acc: 0.8600 Epoch 1699/2000 - 0s - loss: 0.2998 - acc: 0.8600 Epoch 1700/2000 - 0s - loss: 0.2942 - acc: 0.8800 Epoch 1701/2000 - 0s - loss: 0.3129 - acc: 0.8600 Epoch 1702/2000 - 0s - loss: 0.2936 - acc: 0.8800 Epoch 1703/2000 - 0s - loss: 0.3150 - acc: 0.8600 Epoch 1704/2000 - 0s - loss: 0.2984 - acc: 0.8600 Epoch 1705/2000 - 0s - loss: 0.3069 - acc: 0.8600 Epoch 1706/2000 - 0s - loss: 0.3023 - acc: 0.8600 Epoch 1707/2000 - 0s - loss: 0.3035 - acc: 0.8800 Epoch 1708/2000 - 0s - loss: 0.3020 - acc: 0.8600 Epoch 1709/2000 - 0s - loss: 0.2921 - acc: 0.8600 Epoch 1710/2000 - 0s - loss: 0.2986 - acc: 0.8800 Epoch 1711/2000 - 0s - loss: 0.2971 - acc: 0.9000 Epoch 1712/2000 - 0s - loss: 0.2880 - acc: 0.9000 Epoch 1713/2000 - 0s - loss: 0.2897 - acc: 0.9000 Epoch 1714/2000 - 0s - loss: 0.2886 - acc: 0.9000 Epoch 1715/2000 - 0s - loss: 0.3046 - acc: 0.9000 Epoch 1716/2000 - 0s - loss: 0.2920 - acc: 0.8800 Epoch 1717/2000 - 0s - loss: 0.3026 - acc: 0.8600 Epoch 1718/2000 - 0s - loss: 0.2958 - acc: 0.8600 Epoch 1719/2000 - 0s - loss: 0.3032 - acc: 0.8800 Epoch 1720/2000 - 0s - loss: 0.2936 - acc: 0.8400 Epoch 1721/2000 - 0s - loss: 0.2969 - acc: 0.8800 Epoch 1722/2000 - 0s - loss: 0.2971 - acc: 0.8800 Epoch 1723/2000 - 0s - loss: 0.2965 - acc: 0.8800 Epoch 1724/2000 - 0s - loss: 0.2900 - acc: 0.9000 Epoch 1725/2000 - 0s - loss: 0.2993 - acc: 0.8800 Epoch 1726/2000 - 0s - loss: 0.2943 - acc: 0.8600 
Epoch 1727/2000 - 0s - loss: 0.2861 - acc: 0.8600 Epoch 1728/2000 - 0s - loss: 0.3040 - acc: 0.8800 Epoch 1729/2000 - 0s - loss: 0.2877 - acc: 0.8800 Epoch 1730/2000 - 0s - loss: 0.2935 - acc: 0.8800 Epoch 1731/2000 - 0s - loss: 0.2890 - acc: 0.8800 Epoch 1732/2000 - 0s - loss: 0.2881 - acc: 0.8800 Epoch 1733/2000 - 0s - loss: 0.3010 - acc: 0.8600 Epoch 1734/2000 - 0s - loss: 0.3096 - acc: 0.8400 Epoch 1735/2000 - 0s - loss: 0.2924 - acc: 0.8800 Epoch 1736/2000 - 0s - loss: 0.3021 - acc: 0.8600 Epoch 1737/2000 - 0s - loss: 0.3064 - acc: 0.8800 Epoch 1738/2000 - 0s - loss: 0.2945 - acc: 0.8800 Epoch 1739/2000 - 0s - loss: 0.3044 - acc: 0.8600 Epoch 1740/2000 - 0s - loss: 0.3100 - acc: 0.8800 Epoch 1741/2000 - 0s - loss: 0.3203 - acc: 0.8800 Epoch 1742/2000 - 0s - loss: 0.3324 - acc: 0.8400 Epoch 1743/2000 - 0s - loss: 0.3079 - acc: 0.8800 Epoch 1744/2000 - 0s - loss: 0.3002 - acc: 0.8800 Epoch 1745/2000 - 0s - loss: 0.3108 - acc: 0.8400 Epoch 1746/2000 - 0s - loss: 0.2933 - acc: 0.8600 Epoch 1747/2000 - 0s - loss: 0.3108 - acc: 0.8800 Epoch 1748/2000 - 0s - loss: 0.2959 - acc: 0.9000 Epoch 1749/2000 - 0s - loss: 0.2987 - acc: 0.8600 Epoch 1750/2000 - 0s - loss: 0.2917 - acc: 0.8800 Epoch 1751/2000 - 0s - loss: 0.2961 - acc: 0.8800 Epoch 1752/2000 - 0s - loss: 0.2953 - acc: 0.8800 Epoch 1753/2000 - 0s - loss: 0.2903 - acc: 0.8600 Epoch 1754/2000 - 0s - loss: 0.2900 - acc: 0.8400 Epoch 1755/2000 - 0s - loss: 0.2989 - acc: 0.8600 Epoch 1756/2000 - 0s - loss: 0.2975 - acc: 0.8800 Epoch 1757/2000 - 0s - loss: 0.2905 - acc: 0.8800 Epoch 1758/2000 - 0s - loss: 0.3005 - acc: 0.8800 Epoch 1759/2000 - 0s - loss: 0.2859 - acc: 0.9000 Epoch 1760/2000 - 0s - loss: 0.3111 - acc: 0.8600 Epoch 1761/2000 - 0s - loss: 0.3141 - acc: 0.8400 Epoch 1762/2000 - 0s - loss: 0.3107 - acc: 0.8600 Epoch 1763/2000 - 0s - loss: 0.3207 - acc: 0.8600 Epoch 1764/2000 - 0s - loss: 0.2927 - acc: 0.8800 Epoch 1765/2000 - 0s - loss: 0.2995 - acc: 0.8800 Epoch 1766/2000 - 0s - loss: 0.3005 - acc: 0.8800 Epoch 1767/2000 - 0s - loss: 0.2811 - acc: 0.8800 Epoch 1768/2000 - 0s - loss: 0.3018 - acc: 0.8600 Epoch 1769/2000 - 0s - loss: 0.2974 - acc: 0.8400 Epoch 1770/2000 - 0s - loss: 0.2852 - acc: 0.8800 Epoch 1771/2000 - 0s - loss: 0.2823 - acc: 0.9000 Epoch 1772/2000 - 0s - loss: 0.2887 - acc: 0.8800 Epoch 1773/2000 - 0s - loss: 0.2930 - acc: 0.8800 Epoch 1774/2000 - 0s - loss: 0.2909 - acc: 0.8400 Epoch 1775/2000 - 0s - loss: 0.2945 - acc: 0.8600 Epoch 1776/2000 - 0s - loss: 0.2822 - acc: 0.8800 Epoch 1777/2000 - 0s - loss: 0.2894 - acc: 0.8600 Epoch 1778/2000 - 0s - loss: 0.2858 - acc: 0.9000 Epoch 1779/2000 - 0s - loss: 0.2924 - acc: 0.8600 Epoch 1780/2000 - 0s - loss: 0.2888 - acc: 0.8600 Epoch 1781/2000 - 0s - loss: 0.2900 - acc: 0.8600 Epoch 1782/2000 - 0s - loss: 0.2899 - acc: 0.8600 Epoch 1783/2000 - 0s - loss: 0.2924 - acc: 0.8800 Epoch 1784/2000 - 0s - loss: 0.2953 - acc: 0.8600 Epoch 1785/2000 - 0s - loss: 0.2873 - acc: 0.8800 Epoch 1786/2000 - 0s - loss: 0.2891 - acc: 0.8600 Epoch 1787/2000 - 0s - loss: 0.2866 - acc: 0.8600 Epoch 1788/2000 - 0s - loss: 0.2825 - acc: 0.8800 Epoch 1789/2000 - 0s - loss: 0.2840 - acc: 0.8600 Epoch 1790/2000 - 0s - loss: 0.2993 - acc: 0.8600 Epoch 1791/2000 - 0s - loss: 0.2773 - acc: 0.8800 Epoch 1792/2000 - 0s - loss: 0.3035 - acc: 0.8600 Epoch 1793/2000 - 0s - loss: 0.2906 - acc: 0.8400 Epoch 1794/2000 - 0s - loss: 0.2949 - acc: 0.8800 Epoch 1795/2000 - 0s - loss: 0.2874 - acc: 0.8600 Epoch 1796/2000 - 0s - loss: 0.2910 - acc: 0.8800 Epoch 1797/2000 - 0s - loss: 0.2816 - acc: 0.8800 
Epoch 1798/2000 - 0s - loss: 0.2815 - acc: 0.9000 Epoch 1799/2000 - 0s - loss: 0.2909 - acc: 0.8400 Epoch 1800/2000 - 0s - loss: 0.2822 - acc: 0.8600 Epoch 1801/2000 - 0s - loss: 0.2891 - acc: 0.8800 Epoch 1802/2000 - 0s - loss: 0.2923 - acc: 0.8600 Epoch 1803/2000 - 0s - loss: 0.2922 - acc: 0.8800 Epoch 1804/2000 - 0s - loss: 0.2857 - acc: 0.8800 Epoch 1805/2000 - 0s - loss: 0.2850 - acc: 0.8800 Epoch 1806/2000 - 0s - loss: 0.2878 - acc: 0.8600 Epoch 1807/2000 - 0s - loss: 0.2851 - acc: 0.8400 Epoch 1808/2000 - 0s - loss: 0.2851 - acc: 0.8600 Epoch 1809/2000 - 0s - loss: 0.2883 - acc: 0.8800 Epoch 1810/2000 - 0s - loss: 0.2885 - acc: 0.8600 Epoch 1811/2000 - 0s - loss: 0.2894 - acc: 0.8600 Epoch 1812/2000 - 0s - loss: 0.2853 - acc: 0.9000 Epoch 1813/2000 - 0s - loss: 0.3019 - acc: 0.8600 Epoch 1814/2000 - 0s - loss: 0.2879 - acc: 0.8800 Epoch 1815/2000 - 0s - loss: 0.2841 - acc: 0.8800 Epoch 1816/2000 - 0s - loss: 0.3097 - acc: 0.8800 Epoch 1817/2000 - 0s - loss: 0.2839 - acc: 0.8800 Epoch 1818/2000 - 0s - loss: 0.3350 - acc: 0.8200 Epoch 1819/2000 - 0s - loss: 0.2752 - acc: 0.9000 Epoch 1820/2000 - 0s - loss: 0.3038 - acc: 0.9000 Epoch 1821/2000 - 0s - loss: 0.2931 - acc: 0.9000 Epoch 1822/2000 - 0s - loss: 0.2839 - acc: 0.8800 Epoch 1823/2000 - 0s - loss: 0.2965 - acc: 0.8800 Epoch 1824/2000 - 0s - loss: 0.2812 - acc: 0.8800 Epoch 1825/2000 - 0s - loss: 0.3015 - acc: 0.8800 Epoch 1826/2000 - 0s - loss: 0.2956 - acc: 0.9000 Epoch 1827/2000 - 0s - loss: 0.2924 - acc: 0.8600 Epoch 1828/2000 - 0s - loss: 0.2986 - acc: 0.8600 Epoch 1829/2000 - 0s - loss: 0.2847 - acc: 0.9000 Epoch 1830/2000 - 0s - loss: 0.3003 - acc: 0.8800 Epoch 1831/2000 - 0s - loss: 0.3023 - acc: 0.8600 Epoch 1832/2000 - 0s - loss: 0.3028 - acc: 0.8800 Epoch 1833/2000 - 0s - loss: 0.3224 - acc: 0.8400 Epoch 1834/2000 - 0s - loss: 0.3211 - acc: 0.8600 Epoch 1835/2000 - 0s - loss: 0.2992 - acc: 0.8800 Epoch 1836/2000 - 0s - loss: 0.3049 - acc: 0.8600 Epoch 1837/2000 - 0s - loss: 0.2887 - acc: 0.8600 Epoch 1838/2000 - 0s - loss: 0.3153 - acc: 0.8800 Epoch 1839/2000 - 0s - loss: 0.2878 - acc: 0.8600 Epoch 1840/2000 - 0s - loss: 0.2957 - acc: 0.8600 Epoch 1841/2000 - 0s - loss: 0.2910 - acc: 0.8600 Epoch 1842/2000 - 0s - loss: 0.2874 - acc: 0.8800 Epoch 1843/2000 - 0s - loss: 0.2839 - acc: 0.8800 Epoch 1844/2000 - 0s - loss: 0.2882 - acc: 0.8800 Epoch 1845/2000 - 0s - loss: 0.2968 - acc: 0.8800 Epoch 1846/2000 - 0s - loss: 0.2853 - acc: 0.8800 Epoch 1847/2000 - 0s - loss: 0.2898 - acc: 0.8600 Epoch 1848/2000 - 0s - loss: 0.2827 - acc: 0.8800 Epoch 1849/2000 - 0s - loss: 0.2906 - acc: 0.8800 Epoch 1850/2000 - 0s - loss: 0.2851 - acc: 0.9000 Epoch 1851/2000 - 0s - loss: 0.2853 - acc: 0.8800 Epoch 1852/2000 - 0s - loss: 0.2839 - acc: 0.8400 Epoch 1853/2000 - 0s - loss: 0.2780 - acc: 0.8800 Epoch 1854/2000 - 0s - loss: 0.2951 - acc: 0.8800 Epoch 1855/2000 - 0s - loss: 0.2855 - acc: 0.8400 Epoch 1856/2000 - 0s - loss: 0.2842 - acc: 0.8800 Epoch 1857/2000 - 0s - loss: 0.2883 - acc: 0.8800 Epoch 1858/2000 - 0s - loss: 0.2800 - acc: 0.8800 Epoch 1859/2000 - 0s - loss: 0.2924 - acc: 0.8600 Epoch 1860/2000 - 0s - loss: 0.2840 - acc: 0.8600 Epoch 1861/2000 - 0s - loss: 0.2891 - acc: 0.9000 Epoch 1862/2000 - 0s - loss: 0.3048 - acc: 0.8400 Epoch 1863/2000 - 0s - loss: 0.2817 - acc: 0.8600 Epoch 1864/2000 - 0s - loss: 0.2946 - acc: 0.8600 Epoch 1865/2000 - 0s - loss: 0.2892 - acc: 0.8600 Epoch 1866/2000 - 0s - loss: 0.2868 - acc: 0.8800 Epoch 1867/2000 - 0s - loss: 0.2989 - acc: 0.8800 Epoch 1868/2000 - 0s - loss: 0.2875 - acc: 0.8800 
Epoch 1869/2000 - 0s - loss: 0.2909 - acc: 0.8400 Epoch 1870/2000 - 0s - loss: 0.2942 - acc: 0.8600 Epoch 1871/2000 - 0s - loss: 0.2946 - acc: 0.8400 Epoch 1872/2000 - 0s - loss: 0.2813 - acc: 0.8600 Epoch 1873/2000 - 0s - loss: 0.2830 - acc: 0.8600 Epoch 1874/2000 - 0s - loss: 0.2892 - acc: 0.8600 Epoch 1875/2000 - 0s - loss: 0.2937 - acc: 0.8400 Epoch 1876/2000 - 0s - loss: 0.2773 - acc: 0.8800 Epoch 1877/2000 - 0s - loss: 0.3076 - acc: 0.8800 Epoch 1878/2000 - 0s - loss: 0.3819 - acc: 0.8200 Epoch 1879/2000 - 0s - loss: 0.2872 - acc: 0.9000 Epoch 1880/2000 - 0s - loss: 0.3192 - acc: 0.8600 Epoch 1881/2000 - 0s - loss: 0.3429 - acc: 0.8400 Epoch 1882/2000 - 0s - loss: 0.2861 - acc: 0.8800 Epoch 1883/2000 - 0s - loss: 0.3409 - acc: 0.8600 Epoch 1884/2000 - 0s - loss: 0.3014 - acc: 0.8600 Epoch 1885/2000 - 0s - loss: 0.2941 - acc: 0.8800 Epoch 1886/2000 - 0s - loss: 0.2853 - acc: 0.8800 Epoch 1887/2000 - 0s - loss: 0.3011 - acc: 0.8600 Epoch 1888/2000 - 0s - loss: 0.2796 - acc: 0.8800 Epoch 1889/2000 - 0s - loss: 0.2809 - acc: 0.8800 Epoch 1890/2000 - 0s - loss: 0.2809 - acc: 0.8800 Epoch 1891/2000 - 0s - loss: 0.2791 - acc: 0.8800 Epoch 1892/2000 - 0s - loss: 0.2831 - acc: 0.8600 Epoch 1893/2000 - 0s - loss: 0.2847 - acc: 0.8800 Epoch 1894/2000 - 0s - loss: 0.2880 - acc: 0.8600 Epoch 1895/2000 - 0s - loss: 0.2765 - acc: 0.8800 Epoch 1896/2000 - 0s - loss: 0.2874 - acc: 0.8600 Epoch 1897/2000 - 0s - loss: 0.2849 - acc: 0.8600 Epoch 1898/2000 - 0s - loss: 0.2948 - acc: 0.8800 Epoch 1899/2000 - 0s - loss: 0.2868 - acc: 0.8800 Epoch 1900/2000 - 0s - loss: 0.2867 - acc: 0.8800 Epoch 1901/2000 - 0s - loss: 0.2892 - acc: 0.8600 Epoch 1902/2000 - 0s - loss: 0.2782 - acc: 0.8800 Epoch 1903/2000 - 0s - loss: 0.2837 - acc: 0.8600 Epoch 1904/2000 - 0s - loss: 0.2779 - acc: 0.8800 Epoch 1905/2000 - 0s - loss: 0.2816 - acc: 0.8800 Epoch 1906/2000 - 0s - loss: 0.2783 - acc: 0.8800 Epoch 1907/2000 - 0s - loss: 0.2787 - acc: 0.9000 Epoch 1908/2000 - 0s - loss: 0.2756 - acc: 0.8800 Epoch 1909/2000 - 0s - loss: 0.2886 - acc: 0.8600 Epoch 1910/2000 - 0s - loss: 0.2738 - acc: 0.8600 Epoch 1911/2000 - 0s - loss: 0.3007 - acc: 0.8800 Epoch 1912/2000 - 0s - loss: 0.3025 - acc: 0.8400 Epoch 1913/2000 - 0s - loss: 0.2854 - acc: 0.8600 Epoch 1914/2000 - 0s - loss: 0.2797 - acc: 0.9000 Epoch 1915/2000 - 0s - loss: 0.2776 - acc: 0.8800 Epoch 1916/2000 - 0s - loss: 0.2866 - acc: 0.8800 Epoch 1917/2000 - 0s - loss: 0.2798 - acc: 0.8600 Epoch 1918/2000 - 0s - loss: 0.3053 - acc: 0.8800 Epoch 1919/2000 - 0s - loss: 0.3035 - acc: 0.8600 Epoch 1920/2000 - 0s - loss: 0.2764 - acc: 0.8800 Epoch 1921/2000 - 0s - loss: 0.3380 - acc: 0.8400 Epoch 1922/2000 - 0s - loss: 0.3396 - acc: 0.8400 Epoch 1923/2000 - 0s - loss: 0.3198 - acc: 0.8600 Epoch 1924/2000 - 0s - loss: 0.3620 - acc: 0.8600 Epoch 1925/2000 - 0s - loss: 0.3892 - acc: 0.8200 Epoch 1926/2000 - 0s - loss: 0.3465 - acc: 0.8400 Epoch 1927/2000 - 0s - loss: 0.3278 - acc: 0.8800 Epoch 1928/2000 - 0s - loss: 0.3524 - acc: 0.8600 Epoch 1929/2000 - 0s - loss: 0.3029 - acc: 0.8800 Epoch 1930/2000 - 0s - loss: 0.2846 - acc: 0.9000 Epoch 1931/2000 - 0s - loss: 0.2936 - acc: 0.8800 Epoch 1932/2000 - 0s - loss: 0.2825 - acc: 0.8800 Epoch 1933/2000 - 0s - loss: 0.2859 - acc: 0.9000 Epoch 1934/2000 - 0s - loss: 0.2844 - acc: 0.8400 Epoch 1935/2000 - 0s - loss: 0.2975 - acc: 0.8400 Epoch 1936/2000 - 0s - loss: 0.2860 - acc: 0.9000 Epoch 1937/2000 - 0s - loss: 0.3025 - acc: 0.8800 Epoch 1938/2000 - 0s - loss: 0.2885 - acc: 0.9000 Epoch 1939/2000 - 0s - loss: 0.2830 - acc: 0.8800 
Epoch 1940/2000 - 0s - loss: 0.2892 - acc: 0.8600 Epoch 1941/2000 - 0s - loss: 0.2777 - acc: 0.8600 Epoch 1942/2000 - 0s - loss: 0.2789 - acc: 0.8800 Epoch 1943/2000 - 0s - loss: 0.2781 - acc: 0.9000 Epoch 1944/2000 - 0s - loss: 0.2819 - acc: 0.8600 Epoch 1945/2000 - 0s - loss: 0.2915 - acc: 0.8800 Epoch 1946/2000 - 0s - loss: 0.2850 - acc: 0.8800 Epoch 1947/2000 - 0s - loss: 0.2816 - acc: 0.9000 Epoch 1948/2000 - 0s - loss: 0.2800 - acc: 0.9000 Epoch 1949/2000 - 0s - loss: 0.2887 - acc: 0.8600 Epoch 1950/2000 - 0s - loss: 0.2940 - acc: 0.8400 Epoch 1951/2000 - 0s - loss: 0.2749 - acc: 0.8400 Epoch 1952/2000 - 0s - loss: 0.2921 - acc: 0.8800 Epoch 1953/2000 - 0s - loss: 0.2767 - acc: 0.9000 Epoch 1954/2000 - 0s - loss: 0.2931 - acc: 0.8600 Epoch 1955/2000 - 0s - loss: 0.2831 - acc: 0.8800 Epoch 1956/2000 - 0s - loss: 0.2726 - acc: 0.8800 Epoch 1957/2000 - 0s - loss: 0.2834 - acc: 0.9000 Epoch 1958/2000 - 0s - loss: 0.2818 - acc: 0.8800 Epoch 1959/2000 - 0s - loss: 0.2747 - acc: 0.9000 Epoch 1960/2000 - 0s - loss: 0.2706 - acc: 0.9000 Epoch 1961/2000 - 0s - loss: 0.2814 - acc: 0.8800 Epoch 1962/2000 - 0s - loss: 0.2746 - acc: 0.9000 Epoch 1963/2000 - 0s - loss: 0.2769 - acc: 0.8800 Epoch 1964/2000 - 0s - loss: 0.2910 - acc: 0.9000 Epoch 1965/2000 - 0s - loss: 0.2706 - acc: 0.9000 Epoch 1966/2000 - 0s - loss: 0.2820 - acc: 0.8600 Epoch 1967/2000 - 0s - loss: 0.2898 - acc: 0.8000 Epoch 1968/2000 - 0s - loss: 0.2822 - acc: 0.8400 Epoch 1969/2000 - 0s - loss: 0.2727 - acc: 0.9000 Epoch 1970/2000 - 0s - loss: 0.2873 - acc: 0.9000 Epoch 1971/2000 - 0s - loss: 0.2815 - acc: 0.8600 Epoch 1972/2000 - 0s - loss: 0.2966 - acc: 0.8400 Epoch 1973/2000 - 0s - loss: 0.2986 - acc: 0.8800 Epoch 1974/2000 - 0s - loss: 0.2872 - acc: 0.8800 Epoch 1975/2000 - 0s - loss: 0.2812 - acc: 0.8800 Epoch 1976/2000 - 0s - loss: 0.3044 - acc: 0.8800 Epoch 1977/2000 - 0s - loss: 0.2852 - acc: 0.8600 Epoch 1978/2000 - 0s - loss: 0.2883 - acc: 0.8800 Epoch 1979/2000 - 0s - loss: 0.2957 - acc: 0.8600 Epoch 1980/2000 - 0s - loss: 0.2780 - acc: 0.8600 Epoch 1981/2000 - 0s - loss: 0.2842 - acc: 0.8600 Epoch 1982/2000 - 0s - loss: 0.2786 - acc: 0.8600 Epoch 1983/2000 - 0s - loss: 0.2779 - acc: 0.8600 Epoch 1984/2000 - 0s - loss: 0.2842 - acc: 0.8800 Epoch 1985/2000 - 0s - loss: 0.2753 - acc: 0.9000 Epoch 1986/2000 - 0s - loss: 0.2826 - acc: 0.8800 Epoch 1987/2000 - 0s - loss: 0.2745 - acc: 0.8400 Epoch 1988/2000 - 0s - loss: 0.2785 - acc: 0.8800 Epoch 1989/2000 - 0s - loss: 0.2742 - acc: 0.8600 Epoch 1990/2000 - 0s - loss: 0.2787 - acc: 0.9000 Epoch 1991/2000 - 0s - loss: 0.2766 - acc: 0.8600 Epoch 1992/2000 - 0s - loss: 0.2975 - acc: 0.8800 Epoch 1993/2000 - 0s - loss: 0.2764 - acc: 0.8800 Epoch 1994/2000 - 0s - loss: 0.2946 - acc: 0.8600 Epoch 1995/2000 - 0s - loss: 0.2962 - acc: 0.9000 Epoch 1996/2000 - 0s - loss: 0.2824 - acc: 0.8800 Epoch 1997/2000 - 0s - loss: 0.3031 - acc: 0.8800 Epoch 1998/2000 - 0s - loss: 0.2780 - acc: 0.9000 Epoch 1999/2000 - 0s - loss: 0.2941 - acc: 0.8600 Epoch 2000/2000 - 0s - loss: 0.2838 - acc: 0.8800
50/50 [==============================] - 0s 2ms/step
acc: 90.00%
one step prediction : ['g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'f8', 'g8', 'g8', 'g4', 'g8', 'e8', 'e8', 'e8', 'f8', 'g4', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'f4', 'e8', 'e8', 'e8', 'e8', 'f8', 'f8', 'g4', 'g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4']
full song prediction : ['g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8', 'd8']
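Note the gap between the two prediction modes: one-step prediction scores 90% because every 4-note input window is taken from the ground-truth score, whereas full-song prediction feeds each predicted note back in as the next input, so a single early mistake compounds; that is why the full-song output above collapses into an endless run of 'd8' partway through.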
# 0. Import the packages we will use
import keras
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, LSTM
from keras.utils import np_utils
# Fix the random seed for reproducibility
np.random.seed(5)
# Define a loss-history callback class
# (init() is a plain method here, not __init__, so it must be called manually)
class LossHistory(keras.callbacks.Callback):
    def init(self):
        self.losses = []
    def on_epoch_end(self, batch, logs={}):
        self.losses.append(logs.get('loss'))
# Dataset generation function: slides a window of size window_size+1 over the sequence
def seq2dataset(seq, window_size):
    dataset = []
    for i in range(len(seq) - window_size):
        subset = seq[i:(i + window_size + 1)]
        dataset.append([code2idx[item] for item in subset])
    return np.array(dataset)
# 1. Prepare the data
# Define the code dictionaries
code2idx = {'c4':0, 'd4':1, 'e4':2, 'f4':3, 'g4':4, 'a4':5, 'b4':6,
'c8':7, 'd8':8, 'e8':9, 'f8':10, 'g8':11, 'a8':12, 'b8':13}
idx2code = {0:'c4', 1:'d4', 2:'e4', 3:'f4', 4:'g4', 5:'a4', 6:'b4',
7:'c8', 8:'d8', 9:'e8', 10:'f8', 11:'g8', 12:'a8', 13:'b8'}
# Define the sequence data
seq = ['g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'd8', 'e8', 'f8', 'g8', 'g8', 'g4',
'g8', 'e8', 'e8', 'e8', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4',
'd8', 'd8', 'd8', 'd8', 'd8', 'e8', 'f4', 'e8', 'e8', 'e8', 'e8', 'e8', 'f8', 'g4',
'g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4']
# 2. Generate the dataset
dataset = seq2dataset(seq, window_size = 4)
print(dataset.shape)
# Split into input (X) and label (Y) variables
x_train = dataset[:, 0:4]
y_train = dataset[:, 4]
max_idx_value = 13
# Normalize the inputs to the [0, 1] range
x_train = x_train / float(max_idx_value)
# Reshape the input to (number of samples, time steps, features)
x_train = np.reshape(x_train, (50, 4, 1))
# One-hot encode the labels
y_train = np_utils.to_categorical(y_train)
one_hot_vec_size = y_train.shape[1]
print("one hot encoding vector size is ", one_hot_vec_size)
# 3. Define the model
model = Sequential()
model.add(LSTM(128, batch_input_shape = (1, 4, 1), stateful=True))
model.add(Dense(one_hot_vec_size, activation='softmax'))
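# Note: stateful=True keeps the LSTM's hidden state ht and cell state ct across
# batches instead of resetting them after every batch. batch_input_shape=(1, 4, 1)
# therefore fixes the batch size to 1, and the state must be cleared explicitly
# with model.reset_states(); the training loop below does this once per full pass
# over the song.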
# 4. Configure the learning process
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
# 5. Train the model
num_epochs = 2000
history = LossHistory()  # create the loss-history object
history.init()
for epoch_idx in range(num_epochs):
    print('epochs : ' + str(epoch_idx))
    # one pass over all 50 samples, in order; shuffle=False keeps the carried-over state meaningful
    model.fit(x_train, y_train, epochs=1, batch_size=1, verbose=2, shuffle=False, callbacks=[history])
    model.reset_states()  # clear the state at the end of each pass
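# Note: the explicit Python loop (rather than a single fit(..., epochs=2000) call,
# as the earlier run's log suggests the first model used) exists so that
# model.reset_states() can run between passes; Keras never resets a stateful
# layer's state on its own.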
# 6. Review the training process
# (history.losses holds one entry per outer-loop pass, since each fit() call runs a single epoch)
%matplotlib inline
import matplotlib.pyplot as plt
plt.plot(history.losses)
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train'], loc='upper left')
plt.show()
# 7. Evaluate the model (batch_size must stay 1 because of batch_input_shape)
scores = model.evaluate(x_train, y_train, batch_size=1)
print("%s: %.2f%%" %(model.metrics_names[1], scores[1]*100))
model.reset_states()
# 8. Use the model
pred_count = 50  # maximum number of predictions
# One-step prediction: every input window comes from the ground-truth score
seq_out = ['g8', 'e8', 'e4', 'f8']
pred_out = model.predict(x_train, batch_size=1)
for i in range(pred_count):
    idx = np.argmax(pred_out[i])   # convert the one-hot output to an index
    seq_out.append(idx2code[idx])  # seq_out is the final score, so map the index back to a note code
model.reset_states()
print("one step prediction : ", seq_out)
# Full-song prediction: feed the model's own output back in as the next input
seq_in = ['g8', 'e8', 'e4', 'f8']
seq_out = seq_in
seq_in = [code2idx[it] / float(max_idx_value) for it in seq_in]  # convert codes to normalized indices
for i in range(pred_count):
    sample_in = np.array(seq_in)
    sample_in = np.reshape(sample_in, (1, 4, 1))  # (samples, time steps, features)
    pred_out = model.predict(sample_in)
    idx = np.argmax(pred_out)
    seq_out.append(idx2code[idx])
    seq_in.append(idx / float(max_idx_value))  # slide the window: append the new prediction...
    seq_in.pop(0)                              # ...and drop the oldest note
model.reset_states()
print("full song prediction : ", seq_out)
(50, 5)
one hot encoding vector size is  12
epochs : 0 Epoch 1/1 - 1s - loss: 2.3461 - acc: 0.1400
epochs : 1 Epoch 1/1 - 0s - loss: 2.0398 - acc: 0.3400
... (epochs 2-187 omitted: the loss falls from ~1.9 toward the 0.3-0.7 range with occasional large spikes, and accuracy climbs from 0.34 into the 0.7-0.9 range) ...
epochs : 188
Epoch 1/1 - 0s - loss: 0.5868 - acc: 0.7400 epochs : 189 Epoch 1/1 - 0s - loss: 1.0364 - acc: 0.5000 epochs : 190 Epoch 1/1 - 0s - loss: 1.2180 - acc: 0.5400 epochs : 191 Epoch 1/1 - 0s - loss: 0.9476 - acc: 0.6200 epochs : 192 Epoch 1/1 - 0s - loss: 0.8389 - acc: 0.6400 epochs : 193 Epoch 1/1 - 0s - loss: 0.7536 - acc: 0.6200 epochs : 194 Epoch 1/1 - 0s - loss: 0.7081 - acc: 0.7000 epochs : 195 Epoch 1/1 - 0s - loss: 0.8066 - acc: 0.7000 epochs : 196 Epoch 1/1 - 0s - loss: 0.7070 - acc: 0.6800 epochs : 197 Epoch 1/1 - 0s - loss: 0.5186 - acc: 0.8800 epochs : 198 Epoch 1/1 - 0s - loss: 0.5322 - acc: 0.7800 epochs : 199 Epoch 1/1 - 0s - loss: 0.4979 - acc: 0.8000 epochs : 200 Epoch 1/1 - 0s - loss: 0.3966 - acc: 0.8400 epochs : 201 Epoch 1/1 - 0s - loss: 0.5296 - acc: 0.8000 epochs : 202 Epoch 1/1 - 0s - loss: 0.5571 - acc: 0.7800 epochs : 203 Epoch 1/1 - 0s - loss: 0.3212 - acc: 0.9200 epochs : 204 Epoch 1/1 - 0s - loss: 0.2493 - acc: 0.9800 epochs : 205 Epoch 1/1 - 0s - loss: 0.2266 - acc: 0.9400 epochs : 206 Epoch 1/1 - 0s - loss: 0.2103 - acc: 0.9600 epochs : 207 Epoch 1/1 - 0s - loss: 0.2957 - acc: 0.9200 epochs : 208 Epoch 1/1 - 0s - loss: 0.6250 - acc: 0.7600 epochs : 209 Epoch 1/1 - 0s - loss: 0.7158 - acc: 0.6400 epochs : 210 Epoch 1/1 - 0s - loss: 0.3969 - acc: 0.9200 epochs : 211 Epoch 1/1 - 0s - loss: 0.6136 - acc: 0.7400 epochs : 212 Epoch 1/1 - 0s - loss: 0.3666 - acc: 0.8400 epochs : 213 Epoch 1/1 - 0s - loss: 0.3976 - acc: 0.8200 epochs : 214 Epoch 1/1 - 0s - loss: 0.6922 - acc: 0.7600 epochs : 215 Epoch 1/1 - 0s - loss: 1.9726 - acc: 0.3600 epochs : 216 Epoch 1/1 - 0s - loss: 0.8023 - acc: 0.6200 epochs : 217 Epoch 1/1 - 0s - loss: 0.8315 - acc: 0.6200 epochs : 218 Epoch 1/1 - 0s - loss: 0.8635 - acc: 0.6400 epochs : 219 Epoch 1/1 - 0s - loss: 0.4417 - acc: 0.9000 epochs : 220 Epoch 1/1 - 0s - loss: 0.3267 - acc: 0.9000 epochs : 221 Epoch 1/1 - 0s - loss: 0.2518 - acc: 0.9600 epochs : 222 Epoch 1/1 - 0s - loss: 0.1673 - acc: 1.0000 epochs : 223 Epoch 1/1 - 0s - loss: 0.1486 - acc: 1.0000 epochs : 224 Epoch 1/1 - 0s - loss: 0.1439 - acc: 0.9800 epochs : 225 Epoch 1/1 - 0s - loss: 0.1121 - acc: 1.0000 epochs : 226 Epoch 1/1 - 0s - loss: 0.0843 - acc: 1.0000 epochs : 227 Epoch 1/1 - 0s - loss: 0.1296 - acc: 0.9600 epochs : 228 Epoch 1/1 - 0s - loss: 0.1802 - acc: 1.0000 epochs : 229 Epoch 1/1 - 0s - loss: 0.2223 - acc: 0.9400 epochs : 230 Epoch 1/1 - 0s - loss: 0.4471 - acc: 0.8600 epochs : 231 Epoch 1/1 - 0s - loss: 0.3175 - acc: 0.8800 epochs : 232 Epoch 1/1 - 0s - loss: 0.6081 - acc: 0.7800 epochs : 233 Epoch 1/1 - 0s - loss: 1.3697 - acc: 0.5400 epochs : 234 Epoch 1/1 - 0s - loss: 1.5633 - acc: 0.3800 epochs : 235 Epoch 1/1 - 0s - loss: 1.0804 - acc: 0.6000 epochs : 236 Epoch 1/1 - 0s - loss: 0.7671 - acc: 0.6600 epochs : 237 Epoch 1/1 - 0s - loss: 0.4299 - acc: 0.8200 epochs : 238 Epoch 1/1 - 0s - loss: 0.6864 - acc: 0.7200 epochs : 239 Epoch 1/1 - 0s - loss: 0.5223 - acc: 0.8400 epochs : 240 Epoch 1/1 - 0s - loss: 0.3283 - acc: 0.9200 epochs : 241 Epoch 1/1 - 0s - loss: 0.2287 - acc: 0.9800 epochs : 242 Epoch 1/1 - 0s - loss: 0.1308 - acc: 0.9800 epochs : 243 Epoch 1/1 - 0s - loss: 0.1862 - acc: 0.9400 epochs : 244 Epoch 1/1 - 0s - loss: 0.0960 - acc: 1.0000 epochs : 245 Epoch 1/1 - 0s - loss: 0.0635 - acc: 1.0000 epochs : 246 Epoch 1/1 - 0s - loss: 0.0543 - acc: 1.0000 epochs : 247 Epoch 1/1 - 0s - loss: 0.0463 - acc: 1.0000 epochs : 248 Epoch 1/1 - 0s - loss: 0.0407 - acc: 1.0000 epochs : 249 Epoch 1/1 - 0s - loss: 0.0360 - acc: 1.0000 epochs : 250 Epoch 1/1 - 0s - 
loss: 0.0325 - acc: 1.0000 epochs : 251 Epoch 1/1 - 0s - loss: 0.0295 - acc: 1.0000 epochs : 252 Epoch 1/1 - 0s - loss: 0.0270 - acc: 1.0000 epochs : 253 Epoch 1/1 - 0s - loss: 0.0248 - acc: 1.0000 epochs : 254 Epoch 1/1 - 0s - loss: 0.0228 - acc: 1.0000 epochs : 255 Epoch 1/1 - 0s - loss: 0.0211 - acc: 1.0000 epochs : 256 Epoch 1/1 - 0s - loss: 0.0196 - acc: 1.0000 epochs : 257 Epoch 1/1 - 0s - loss: 0.0183 - acc: 1.0000 epochs : 258 Epoch 1/1 - 0s - loss: 0.0171 - acc: 1.0000 epochs : 259 Epoch 1/1 - 0s - loss: 0.0161 - acc: 1.0000 epochs : 260 Epoch 1/1 - 0s - loss: 0.0151 - acc: 1.0000 epochs : 261 Epoch 1/1 - 0s - loss: 0.0142 - acc: 1.0000 epochs : 262 Epoch 1/1 - 0s - loss: 0.0134 - acc: 1.0000 epochs : 263 Epoch 1/1 - 0s - loss: 0.0126 - acc: 1.0000 epochs : 264 Epoch 1/1 - 0s - loss: 0.0119 - acc: 1.0000 epochs : 265 Epoch 1/1 - 0s - loss: 0.0112 - acc: 1.0000 epochs : 266 Epoch 1/1 - 0s - loss: 0.0106 - acc: 1.0000 epochs : 267 Epoch 1/1 - 0s - loss: 0.0100 - acc: 1.0000 epochs : 268 Epoch 1/1 - 0s - loss: 0.0095 - acc: 1.0000 epochs : 269 Epoch 1/1 - 0s - loss: 0.0090 - acc: 1.0000 epochs : 270 Epoch 1/1 - 0s - loss: 0.0085 - acc: 1.0000 epochs : 271 Epoch 1/1 - 0s - loss: 0.0081 - acc: 1.0000 epochs : 272 Epoch 1/1 - 0s - loss: 0.0077 - acc: 1.0000 epochs : 273 Epoch 1/1 - 0s - loss: 0.0074 - acc: 1.0000 epochs : 274 Epoch 1/1 - 0s - loss: 0.0070 - acc: 1.0000 epochs : 275 Epoch 1/1 - 0s - loss: 0.0067 - acc: 1.0000 epochs : 276 Epoch 1/1 - 0s - loss: 0.0064 - acc: 1.0000 epochs : 277 Epoch 1/1 - 0s - loss: 0.0062 - acc: 1.0000 epochs : 278 Epoch 1/1 - 0s - loss: 0.0059 - acc: 1.0000 epochs : 279 Epoch 1/1 - 0s - loss: 0.0056 - acc: 1.0000 epochs : 280 Epoch 1/1 - 0s - loss: 0.0054 - acc: 1.0000 epochs : 281 Epoch 1/1 - 0s - loss: 0.0052 - acc: 1.0000 epochs : 282 Epoch 1/1 - 0s - loss: 0.0051 - acc: 1.0000 epochs : 283 Epoch 1/1 - 0s - loss: 0.0047 - acc: 1.0000 epochs : 284 Epoch 1/1 - 0s - loss: 0.0047 - acc: 1.0000 epochs : 285 Epoch 1/1 - 0s - loss: 0.0043 - acc: 1.0000 epochs : 286 Epoch 1/1 - 0s - loss: 0.0044 - acc: 1.0000 epochs : 287 Epoch 1/1 - 0s - loss: 0.0040 - acc: 1.0000 epochs : 288 Epoch 1/1 - 0s - loss: 0.0040 - acc: 1.0000 epochs : 289 Epoch 1/1 - 0s - loss: 0.0037 - acc: 1.0000 epochs : 290 Epoch 1/1 - 0s - loss: 0.0037 - acc: 1.0000 epochs : 291 Epoch 1/1 - 0s - loss: 0.0035 - acc: 1.0000 epochs : 292 Epoch 1/1 - 0s - loss: 0.0034 - acc: 1.0000 epochs : 293 Epoch 1/1 - 0s - loss: 0.0032 - acc: 1.0000 epochs : 294 Epoch 1/1 - 0s - loss: 0.0031 - acc: 1.0000 epochs : 295 Epoch 1/1 - 0s - loss: 0.0030 - acc: 1.0000 epochs : 296 Epoch 1/1 - 0s - loss: 0.0029 - acc: 1.0000 epochs : 297 Epoch 1/1 - 0s - loss: 0.0028 - acc: 1.0000 epochs : 298 Epoch 1/1 - 0s - loss: 0.0027 - acc: 1.0000 epochs : 299 Epoch 1/1 - 0s - loss: 0.0026 - acc: 1.0000 epochs : 300 Epoch 1/1 - 0s - loss: 0.0025 - acc: 1.0000 epochs : 301 Epoch 1/1 - 0s - loss: 0.0024 - acc: 1.0000 epochs : 302 Epoch 1/1 - 0s - loss: 0.0023 - acc: 1.0000 epochs : 303 Epoch 1/1 - 0s - loss: 0.0023 - acc: 1.0000 epochs : 304 Epoch 1/1 - 0s - loss: 0.0022 - acc: 1.0000 epochs : 305 Epoch 1/1 - 0s - loss: 0.0021 - acc: 1.0000 epochs : 306 Epoch 1/1 - 0s - loss: 0.0020 - acc: 1.0000 epochs : 307 Epoch 1/1 - 0s - loss: 0.0020 - acc: 1.0000 epochs : 308 Epoch 1/1 - 0s - loss: 0.0019 - acc: 1.0000 epochs : 309 Epoch 1/1 - 0s - loss: 0.0018 - acc: 1.0000 epochs : 310 Epoch 1/1 - 0s - loss: 0.0018 - acc: 1.0000 epochs : 311 Epoch 1/1 - 0s - loss: 0.0017 - acc: 1.0000 epochs : 312 Epoch 1/1 - 0s - loss: 0.0017 - acc: 
1.0000 epochs : 313 Epoch 1/1 - 0s - loss: 0.0016 - acc: 1.0000 epochs : 314 Epoch 1/1 - 0s - loss: 0.0016 - acc: 1.0000 epochs : 315 Epoch 1/1 - 0s - loss: 0.0015 - acc: 1.0000 epochs : 316 Epoch 1/1 - 0s - loss: 0.0015 - acc: 1.0000 epochs : 317 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 318 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 319 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 320 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 321 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 322 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 323 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 324 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 325 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 326 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 327 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 328 Epoch 1/1 - 0s - loss: 0.0010 - acc: 1.0000 epochs : 329 Epoch 1/1 - 0s - loss: 0.0010 - acc: 1.0000 epochs : 330 Epoch 1/1 - 0s - loss: 9.5758e-04 - acc: 1.0000 epochs : 331 Epoch 1/1 - 0s - loss: 9.4172e-04 - acc: 1.0000 epochs : 332 Epoch 1/1 - 0s - loss: 9.0909e-04 - acc: 1.0000 epochs : 333 Epoch 1/1 - 0s - loss: 8.9350e-04 - acc: 1.0000 epochs : 334 Epoch 1/1 - 0s - loss: 8.5414e-04 - acc: 1.0000 epochs : 335 Epoch 1/1 - 0s - loss: 8.3909e-04 - acc: 1.0000 epochs : 336 Epoch 1/1 - 0s - loss: 8.1118e-04 - acc: 1.0000 epochs : 337 Epoch 1/1 - 0s - loss: 7.9201e-04 - acc: 1.0000 epochs : 338 Epoch 1/1 - 0s - loss: 7.6579e-04 - acc: 1.0000 epochs : 339 Epoch 1/1 - 0s - loss: 7.5069e-04 - acc: 1.0000 epochs : 340 Epoch 1/1 - 0s - loss: 7.2416e-04 - acc: 1.0000 epochs : 341 Epoch 1/1 - 0s - loss: 7.0695e-04 - acc: 1.0000 epochs : 342 Epoch 1/1 - 0s - loss: 6.8908e-04 - acc: 1.0000 epochs : 343 Epoch 1/1 - 0s - loss: 6.7058e-04 - acc: 1.0000 epochs : 344 Epoch 1/1 - 0s - loss: 6.5245e-04 - acc: 1.0000 epochs : 345 Epoch 1/1 - 0s - loss: 6.3117e-04 - acc: 1.0000 epochs : 346 Epoch 1/1 - 0s - loss: 6.2339e-04 - acc: 1.0000 epochs : 347 Epoch 1/1 - 0s - loss: 5.9549e-04 - acc: 1.0000 epochs : 348 Epoch 1/1 - 0s - loss: 5.8741e-04 - acc: 1.0000 epochs : 349 Epoch 1/1 - 0s - loss: 5.6923e-04 - acc: 1.0000 epochs : 350 Epoch 1/1 - 0s - loss: 5.5276e-04 - acc: 1.0000 epochs : 351 Epoch 1/1 - 0s - loss: 5.3803e-04 - acc: 1.0000 epochs : 352 Epoch 1/1 - 0s - loss: 5.2931e-04 - acc: 1.0000 epochs : 353 Epoch 1/1 - 0s - loss: 5.1301e-04 - acc: 1.0000 epochs : 354 Epoch 1/1 - 0s - loss: 5.0280e-04 - acc: 1.0000 epochs : 355 Epoch 1/1 - 0s - loss: 4.9145e-04 - acc: 1.0000 epochs : 356 Epoch 1/1 - 0s - loss: 4.7788e-04 - acc: 1.0000 epochs : 357 Epoch 1/1 - 0s - loss: 4.6520e-04 - acc: 1.0000 epochs : 358 Epoch 1/1 - 0s - loss: 4.5781e-04 - acc: 1.0000 epochs : 359 Epoch 1/1 - 0s - loss: 4.4621e-04 - acc: 1.0000 epochs : 360 Epoch 1/1 - 0s - loss: 4.3105e-04 - acc: 1.0000 epochs : 361 Epoch 1/1 - 0s - loss: 4.3927e-04 - acc: 1.0000 epochs : 362 Epoch 1/1 - 0s - loss: 4.1523e-04 - acc: 1.0000 epochs : 363 Epoch 1/1 - 0s - loss: 4.0719e-04 - acc: 1.0000 epochs : 364 Epoch 1/1 - 0s - loss: 4.0553e-04 - acc: 1.0000 epochs : 365 Epoch 1/1 - 0s - loss: 3.9360e-04 - acc: 1.0000 epochs : 366 Epoch 1/1 - 0s - loss: 3.8518e-04 - acc: 1.0000 epochs : 367 Epoch 1/1 - 0s - loss: 3.7605e-04 - acc: 1.0000 epochs : 368 Epoch 1/1 - 0s - loss: 3.6455e-04 - acc: 1.0000 epochs : 369 Epoch 1/1 - 0s - loss: 3.5197e-04 - acc: 1.0000 epochs : 370 Epoch 1/1 - 0s - loss: 3.4798e-04 - acc: 1.0000 epochs : 371 Epoch 1/1 - 0s - loss: 3.4255e-04 - acc: 1.0000 epochs : 372 
Epoch 1/1 - 0s - loss: 3.2186e-04 - acc: 1.0000 epochs : 373 Epoch 1/1 - 0s - loss: 3.1123e-04 - acc: 1.0000 epochs : 374 Epoch 1/1 - 0s - loss: 3.1011e-04 - acc: 1.0000 epochs : 375 Epoch 1/1 - 0s - loss: 2.9428e-04 - acc: 1.0000 epochs : 376 Epoch 1/1 - 0s - loss: 2.9011e-04 - acc: 1.0000 epochs : 377 Epoch 1/1 - 0s - loss: 2.8060e-04 - acc: 1.0000 epochs : 378 Epoch 1/1 - 0s - loss: 2.6966e-04 - acc: 1.0000 epochs : 379 Epoch 1/1 - 0s - loss: 2.6186e-04 - acc: 1.0000 epochs : 380 Epoch 1/1 - 0s - loss: 2.6281e-04 - acc: 1.0000 epochs : 381 Epoch 1/1 - 0s - loss: 2.4232e-04 - acc: 1.0000 epochs : 382 Epoch 1/1 - 0s - loss: 2.4168e-04 - acc: 1.0000 epochs : 383 Epoch 1/1 - 0s - loss: 2.3180e-04 - acc: 1.0000 epochs : 384 Epoch 1/1 - 0s - loss: 2.3016e-04 - acc: 1.0000 epochs : 385 Epoch 1/1 - 0s - loss: 2.1627e-04 - acc: 1.0000 epochs : 386 Epoch 1/1 - 0s - loss: 2.1456e-04 - acc: 1.0000 epochs : 387 Epoch 1/1 - 0s - loss: 2.0728e-04 - acc: 1.0000 epochs : 388 Epoch 1/1 - 0s - loss: 1.9862e-04 - acc: 1.0000 epochs : 389 Epoch 1/1 - 0s - loss: 1.9833e-04 - acc: 1.0000 epochs : 390 Epoch 1/1 - 0s - loss: 1.8727e-04 - acc: 1.0000 epochs : 391 Epoch 1/1 - 0s - loss: 1.8488e-04 - acc: 1.0000 epochs : 392 Epoch 1/1 - 0s - loss: 1.8050e-04 - acc: 1.0000 epochs : 393 Epoch 1/1 - 0s - loss: 1.7274e-04 - acc: 1.0000 epochs : 394 Epoch 1/1 - 0s - loss: 1.7541e-04 - acc: 1.0000 epochs : 395 Epoch 1/1 - 0s - loss: 1.6302e-04 - acc: 1.0000 epochs : 396 Epoch 1/1 - 0s - loss: 1.6685e-04 - acc: 1.0000 epochs : 397 Epoch 1/1 - 0s - loss: 1.5972e-04 - acc: 1.0000 epochs : 398 Epoch 1/1 - 0s - loss: 1.5493e-04 - acc: 1.0000 epochs : 399 Epoch 1/1 - 0s - loss: 1.4947e-04 - acc: 1.0000 epochs : 400 Epoch 1/1 - 0s - loss: 1.4471e-04 - acc: 1.0000 epochs : 401 Epoch 1/1 - 0s - loss: 1.4221e-04 - acc: 1.0000 epochs : 402 Epoch 1/1 - 0s - loss: 1.3727e-04 - acc: 1.0000 epochs : 403 Epoch 1/1 - 0s - loss: 1.3277e-04 - acc: 1.0000 epochs : 404 Epoch 1/1 - 0s - loss: 1.2787e-04 - acc: 1.0000 epochs : 405 Epoch 1/1 - 0s - loss: 1.2421e-04 - acc: 1.0000 epochs : 406 Epoch 1/1 - 0s - loss: 1.1949e-04 - acc: 1.0000 epochs : 407 Epoch 1/1 - 0s - loss: 1.1714e-04 - acc: 1.0000 epochs : 408 Epoch 1/1 - 0s - loss: 1.1361e-04 - acc: 1.0000 epochs : 409 Epoch 1/1 - 0s - loss: 1.1078e-04 - acc: 1.0000 epochs : 410 Epoch 1/1 - 0s - loss: 1.0801e-04 - acc: 1.0000 epochs : 411 Epoch 1/1 - 0s - loss: 1.0394e-04 - acc: 1.0000 epochs : 412 Epoch 1/1 - 0s - loss: 9.9666e-05 - acc: 1.0000 epochs : 413 Epoch 1/1 - 0s - loss: 9.7822e-05 - acc: 1.0000 epochs : 414 Epoch 1/1 - 0s - loss: 9.4105e-05 - acc: 1.0000 epochs : 415 Epoch 1/1 - 0s - loss: 9.3001e-05 - acc: 1.0000 epochs : 416 Epoch 1/1 - 0s - loss: 8.9304e-05 - acc: 1.0000 epochs : 417 Epoch 1/1 - 0s - loss: 8.9107e-05 - acc: 1.0000 epochs : 418 Epoch 1/1 - 0s - loss: 8.5131e-05 - acc: 1.0000 epochs : 419 Epoch 1/1 - 0s - loss: 8.2942e-05 - acc: 1.0000 epochs : 420 Epoch 1/1 - 0s - loss: 7.9938e-05 - acc: 1.0000 epochs : 421 Epoch 1/1 - 0s - loss: 7.8029e-05 - acc: 1.0000 epochs : 422 Epoch 1/1 - 0s - loss: 7.7862e-05 - acc: 1.0000 epochs : 423 Epoch 1/1 - 0s - loss: 8.1461e-05 - acc: 1.0000 epochs : 424 Epoch 1/1 - 0s - loss: 7.0205e-05 - acc: 1.0000 epochs : 425 Epoch 1/1 - 0s - loss: 8.2774e-05 - acc: 1.0000 epochs : 426 Epoch 1/1 - 0s - loss: 9.9650e-05 - acc: 1.0000 epochs : 427 Epoch 1/1 - 0s - loss: 0.2215 - acc: 0.9600 epochs : 428 Epoch 1/1 - 0s - loss: 2.9793 - acc: 0.4200 epochs : 429 Epoch 1/1 - 0s - loss: 3.3169 - acc: 0.2600 epochs : 430 Epoch 1/1 - 0s - loss: 
1.9329 - acc: 0.3000 epochs : 431 Epoch 1/1 - 0s - loss: 1.7903 - acc: 0.2800 epochs : 432 Epoch 1/1 - 0s - loss: 1.6629 - acc: 0.3200 epochs : 433 Epoch 1/1 - 0s - loss: 1.5960 - acc: 0.4200 epochs : 434 Epoch 1/1 - 0s - loss: 1.5736 - acc: 0.3400 epochs : 435 Epoch 1/1 - 0s - loss: 1.5355 - acc: 0.3600 epochs : 436 Epoch 1/1 - 0s - loss: 1.4713 - acc: 0.4400 epochs : 437 Epoch 1/1 - 0s - loss: 1.3975 - acc: 0.5000 epochs : 438 Epoch 1/1 - 0s - loss: 1.4380 - acc: 0.4200 epochs : 439 Epoch 1/1 - 0s - loss: 1.4549 - acc: 0.3800 epochs : 440 Epoch 1/1 - 0s - loss: 1.7500 - acc: 0.3800 epochs : 441 Epoch 1/1 - 0s - loss: 1.5612 - acc: 0.3800 epochs : 442 Epoch 1/1 - 0s - loss: 1.3467 - acc: 0.5400 epochs : 443 Epoch 1/1 - 0s - loss: 1.1793 - acc: 0.5400 epochs : 444 Epoch 1/1 - 0s - loss: 1.4907 - acc: 0.5000 epochs : 445 Epoch 1/1 - 0s - loss: 1.0692 - acc: 0.5800 epochs : 446 Epoch 1/1 - 0s - loss: 1.0942 - acc: 0.5800 epochs : 447 Epoch 1/1 - 0s - loss: 0.8714 - acc: 0.6400 epochs : 448 Epoch 1/1 - 0s - loss: 1.2277 - acc: 0.5400 epochs : 449 Epoch 1/1 - 0s - loss: 0.9663 - acc: 0.5600 epochs : 450 Epoch 1/1 - 0s - loss: 1.0353 - acc: 0.6400 epochs : 451 Epoch 1/1 - 0s - loss: 1.1008 - acc: 0.6000 epochs : 452 Epoch 1/1 - 0s - loss: 0.9227 - acc: 0.7200 epochs : 453 Epoch 1/1 - 0s - loss: 0.8873 - acc: 0.7000 epochs : 454 Epoch 1/1 - 0s - loss: 1.1804 - acc: 0.6200 epochs : 455 Epoch 1/1 - 0s - loss: 0.9780 - acc: 0.6600 epochs : 456 Epoch 1/1 - 0s - loss: 0.9881 - acc: 0.6200 epochs : 457 Epoch 1/1 - 0s - loss: 0.8201 - acc: 0.7400 epochs : 458 Epoch 1/1 - 0s - loss: 0.7603 - acc: 0.7200 epochs : 459 Epoch 1/1 - 0s - loss: 0.8996 - acc: 0.6200 epochs : 460 Epoch 1/1 - 0s - loss: 1.4974 - acc: 0.5000 epochs : 461 Epoch 1/1 - 0s - loss: 1.1510 - acc: 0.5200 epochs : 462 Epoch 1/1 - 0s - loss: 1.3679 - acc: 0.4400 epochs : 463 Epoch 1/1 - 0s - loss: 0.9915 - acc: 0.6800 epochs : 464 Epoch 1/1 - 0s - loss: 0.9449 - acc: 0.6600 epochs : 465 Epoch 1/1 - 0s - loss: 0.8055 - acc: 0.7200 epochs : 466 Epoch 1/1 - 0s - loss: 0.6552 - acc: 0.7800 epochs : 467 Epoch 1/1 - 0s - loss: 0.7345 - acc: 0.7200 epochs : 468 Epoch 1/1 - 0s - loss: 0.8495 - acc: 0.7400 epochs : 469 Epoch 1/1 - 0s - loss: 0.5614 - acc: 0.8000 epochs : 470 Epoch 1/1 - 0s - loss: 0.5090 - acc: 0.8600 epochs : 471 Epoch 1/1 - 0s - loss: 0.6284 - acc: 0.7800 epochs : 472 Epoch 1/1 - 0s - loss: 0.4316 - acc: 0.8600 epochs : 473 Epoch 1/1 - 0s - loss: 0.3979 - acc: 0.8800 epochs : 474 Epoch 1/1 - 0s - loss: 0.3925 - acc: 0.8800 epochs : 475 Epoch 1/1 - 0s - loss: 1.2321 - acc: 0.6800 epochs : 476 Epoch 1/1 - 0s - loss: 0.7542 - acc: 0.7400 epochs : 477 Epoch 1/1 - 0s - loss: 0.5271 - acc: 0.8600 epochs : 478 Epoch 1/1 - 0s - loss: 1.0258 - acc: 0.5600 epochs : 479 Epoch 1/1 - 0s - loss: 0.6562 - acc: 0.7800 epochs : 480 Epoch 1/1 - 0s - loss: 0.6263 - acc: 0.7600 epochs : 481 Epoch 1/1 - 0s - loss: 0.5496 - acc: 0.8000 epochs : 482 Epoch 1/1 - 0s - loss: 0.4752 - acc: 0.8400 epochs : 483 Epoch 1/1 - 0s - loss: 0.3173 - acc: 0.9000 epochs : 484 Epoch 1/1 - 0s - loss: 0.3409 - acc: 0.9200 epochs : 485 Epoch 1/1 - 0s - loss: 1.0874 - acc: 0.6800 epochs : 486 Epoch 1/1 - 0s - loss: 1.2433 - acc: 0.5600 epochs : 487 Epoch 1/1 - 0s - loss: 0.7972 - acc: 0.6800 epochs : 488 Epoch 1/1 - 0s - loss: 0.7351 - acc: 0.7000 epochs : 489 Epoch 1/1 - 0s - loss: 0.7023 - acc: 0.7800 epochs : 490 Epoch 1/1 - 0s - loss: 0.5076 - acc: 0.8400 epochs : 491 Epoch 1/1 - 0s - loss: 0.6348 - acc: 0.7800 epochs : 492 Epoch 1/1 - 0s - loss: 0.6839 - acc: 0.7000 
epochs : 493 Epoch 1/1 - 0s - loss: 0.7709 - acc: 0.7400 epochs : 494 Epoch 1/1 - 0s - loss: 0.5506 - acc: 0.7800 epochs : 495 Epoch 1/1 - 0s - loss: 0.6067 - acc: 0.7800 epochs : 496 Epoch 1/1 - 0s - loss: 0.5885 - acc: 0.7600 epochs : 497 Epoch 1/1 - 0s - loss: 0.6952 - acc: 0.7600 epochs : 498 Epoch 1/1 - 0s - loss: 0.9445 - acc: 0.6400 epochs : 499 Epoch 1/1 - 0s - loss: 0.6534 - acc: 0.7200 epochs : 500 Epoch 1/1 - 0s - loss: 0.9857 - acc: 0.7200 epochs : 501 Epoch 1/1 - 0s - loss: 1.0611 - acc: 0.5800 epochs : 502 Epoch 1/1 - 0s - loss: 0.8469 - acc: 0.7000 epochs : 503 Epoch 1/1 - 0s - loss: 0.7060 - acc: 0.7600 epochs : 504 Epoch 1/1 - 0s - loss: 1.0393 - acc: 0.5200 epochs : 505 Epoch 1/1 - 0s - loss: 0.5384 - acc: 0.7800 epochs : 506 Epoch 1/1 - 0s - loss: 0.5250 - acc: 0.8600 epochs : 507 Epoch 1/1 - 0s - loss: 0.7102 - acc: 0.7400 epochs : 508 Epoch 1/1 - 0s - loss: 1.2317 - acc: 0.5000 epochs : 509 Epoch 1/1 - 0s - loss: 0.7131 - acc: 0.7800 epochs : 510 Epoch 1/1 - 0s - loss: 0.7571 - acc: 0.7600 epochs : 511 Epoch 1/1 - 0s - loss: 0.2981 - acc: 0.9600 epochs : 512 Epoch 1/1 - 0s - loss: 0.2758 - acc: 0.9400 epochs : 513 Epoch 1/1 - 0s - loss: 0.2755 - acc: 0.9200 epochs : 514 Epoch 1/1 - 0s - loss: 0.2997 - acc: 0.9600 epochs : 515 Epoch 1/1 - 0s - loss: 0.2501 - acc: 0.9200 epochs : 516 Epoch 1/1 - 0s - loss: 0.1783 - acc: 0.9400 epochs : 517 Epoch 1/1 - 0s - loss: 0.3135 - acc: 0.8600 epochs : 518 Epoch 1/1 - 0s - loss: 0.9013 - acc: 0.7400 epochs : 519 Epoch 1/1 - 0s - loss: 0.6921 - acc: 0.7400 epochs : 520 Epoch 1/1 - 0s - loss: 0.3599 - acc: 0.8800 epochs : 521 Epoch 1/1 - 0s - loss: 0.2575 - acc: 0.9400 epochs : 522 Epoch 1/1 - 0s - loss: 0.2944 - acc: 0.9000 epochs : 523 Epoch 1/1 - 0s - loss: 0.2100 - acc: 0.9600 epochs : 524 Epoch 1/1 - 0s - loss: 0.1314 - acc: 0.9800 epochs : 525 Epoch 1/1 - 0s - loss: 0.3397 - acc: 0.8800 epochs : 526 Epoch 1/1 - 0s - loss: 0.9140 - acc: 0.6400 epochs : 527 Epoch 1/1 - 0s - loss: 0.5247 - acc: 0.8600 epochs : 528 Epoch 1/1 - 0s - loss: 1.2727 - acc: 0.6600 epochs : 529 Epoch 1/1 - 0s - loss: 0.5513 - acc: 0.7600 epochs : 530 Epoch 1/1 - 0s - loss: 0.4036 - acc: 0.8600 epochs : 531 Epoch 1/1 - 0s - loss: 0.4344 - acc: 0.8800 epochs : 532 Epoch 1/1 - 0s - loss: 0.2186 - acc: 0.9200 epochs : 533 Epoch 1/1 - 0s - loss: 0.2608 - acc: 0.9400 epochs : 534 Epoch 1/1 - 0s - loss: 0.1462 - acc: 0.9400 epochs : 535 Epoch 1/1 - 0s - loss: 0.1477 - acc: 0.9600 epochs : 536 Epoch 1/1 - 0s - loss: 0.1001 - acc: 0.9800 epochs : 537 Epoch 1/1 - 0s - loss: 0.0620 - acc: 1.0000 epochs : 538 Epoch 1/1 - 0s - loss: 0.0873 - acc: 0.9600 epochs : 539 Epoch 1/1 - 0s - loss: 0.0474 - acc: 1.0000 epochs : 540 Epoch 1/1 - 0s - loss: 0.0553 - acc: 1.0000 epochs : 541 Epoch 1/1 - 0s - loss: 0.0406 - acc: 1.0000 epochs : 542 Epoch 1/1 - 0s - loss: 0.0427 - acc: 1.0000 epochs : 543 Epoch 1/1 - 0s - loss: 0.0239 - acc: 1.0000 epochs : 544 Epoch 1/1 - 0s - loss: 0.0179 - acc: 1.0000 epochs : 545 Epoch 1/1 - 0s - loss: 0.0157 - acc: 1.0000 epochs : 546 Epoch 1/1 - 0s - loss: 0.0145 - acc: 1.0000 epochs : 547 Epoch 1/1 - 0s - loss: 0.0130 - acc: 1.0000 epochs : 548 Epoch 1/1 - 0s - loss: 0.0123 - acc: 1.0000 epochs : 549 Epoch 1/1 - 0s - loss: 0.0114 - acc: 1.0000 epochs : 550 Epoch 1/1 - 0s - loss: 0.0109 - acc: 1.0000 epochs : 551 Epoch 1/1 - 0s - loss: 0.0103 - acc: 1.0000 epochs : 552 Epoch 1/1 - 0s - loss: 0.0097 - acc: 1.0000 epochs : 553 Epoch 1/1 - 0s - loss: 0.0091 - acc: 1.0000 epochs : 554 Epoch 1/1 - 0s - loss: 0.0090 - acc: 1.0000 epochs : 555 Epoch 
1/1 - 0s - loss: 0.0080 - acc: 1.0000 epochs : 556 Epoch 1/1 - 0s - loss: 0.0096 - acc: 1.0000 epochs : 557 Epoch 1/1 - 0s - loss: 0.0070 - acc: 1.0000 epochs : 558 Epoch 1/1 - 0s - loss: 0.0081 - acc: 1.0000 epochs : 559 Epoch 1/1 - 0s - loss: 0.0064 - acc: 1.0000 epochs : 560 Epoch 1/1 - 0s - loss: 0.0078 - acc: 1.0000 epochs : 561 Epoch 1/1 - 0s - loss: 0.0059 - acc: 1.0000 epochs : 562 Epoch 1/1 - 0s - loss: 0.0098 - acc: 1.0000 epochs : 563 Epoch 1/1 - 0s - loss: 0.0087 - acc: 1.0000 epochs : 564 Epoch 1/1 - 0s - loss: 0.1148 - acc: 0.9600 epochs : 565 Epoch 1/1 - 0s - loss: 0.4108 - acc: 0.8800 epochs : 566 Epoch 1/1 - 0s - loss: 1.3461 - acc: 0.6600 epochs : 567 Epoch 1/1 - 0s - loss: 1.9688 - acc: 0.4000 epochs : 568 Epoch 1/1 - 0s - loss: 1.2400 - acc: 0.5400 epochs : 569 Epoch 1/1 - 0s - loss: 0.8472 - acc: 0.6600 epochs : 570 Epoch 1/1 - 0s - loss: 0.7247 - acc: 0.7400 epochs : 571 Epoch 1/1 - 0s - loss: 0.6337 - acc: 0.7800 epochs : 572 Epoch 1/1 - 0s - loss: 0.8938 - acc: 0.6200 epochs : 573 Epoch 1/1 - 0s - loss: 1.0957 - acc: 0.5200 epochs : 574 Epoch 1/1 - 0s - loss: 0.7940 - acc: 0.7000 epochs : 575 Epoch 1/1 - 0s - loss: 0.7904 - acc: 0.7200 epochs : 576 Epoch 1/1 - 0s - loss: 0.5902 - acc: 0.7400 epochs : 577 Epoch 1/1 - 0s - loss: 0.6208 - acc: 0.7600 epochs : 578 Epoch 1/1 - 0s - loss: 0.8504 - acc: 0.6600 epochs : 579 Epoch 1/1 - 0s - loss: 0.4454 - acc: 0.8400 epochs : 580 Epoch 1/1 - 0s - loss: 0.6753 - acc: 0.7800 epochs : 581 Epoch 1/1 - 0s - loss: 0.6193 - acc: 0.7800 epochs : 582 Epoch 1/1 - 0s - loss: 0.4976 - acc: 0.8400 epochs : 583 Epoch 1/1 - 0s - loss: 0.4955 - acc: 0.8600 epochs : 584 Epoch 1/1 - 0s - loss: 0.4958 - acc: 0.8200 epochs : 585 Epoch 1/1 - 0s - loss: 0.6192 - acc: 0.7600 epochs : 586 Epoch 1/1 - 0s - loss: 0.4128 - acc: 0.8400 epochs : 587 Epoch 1/1 - 0s - loss: 0.2581 - acc: 0.9400 epochs : 588 Epoch 1/1 - 0s - loss: 0.1943 - acc: 0.9600 epochs : 589 Epoch 1/1 - 0s - loss: 0.1814 - acc: 0.9200 epochs : 590 Epoch 1/1 - 0s - loss: 0.2176 - acc: 0.9200 epochs : 591 Epoch 1/1 - 0s - loss: 0.7526 - acc: 0.6800 epochs : 592 Epoch 1/1 - 0s - loss: 1.3307 - acc: 0.5800 epochs : 593 Epoch 1/1 - 0s - loss: 0.5509 - acc: 0.7200 epochs : 594 Epoch 1/1 - 0s - loss: 0.5409 - acc: 0.7800 epochs : 595 Epoch 1/1 - 0s - loss: 0.7550 - acc: 0.6600 epochs : 596 Epoch 1/1 - 0s - loss: 1.1012 - acc: 0.6400 epochs : 597 Epoch 1/1 - 0s - loss: 0.7047 - acc: 0.7800 epochs : 598 Epoch 1/1 - 0s - loss: 0.4978 - acc: 0.8200 epochs : 599 Epoch 1/1 - 0s - loss: 0.6648 - acc: 0.8000 epochs : 600 Epoch 1/1 - 0s - loss: 0.4014 - acc: 0.9200 epochs : 601 Epoch 1/1 - 0s - loss: 0.4041 - acc: 0.8600 epochs : 602 Epoch 1/1 - 0s - loss: 0.3096 - acc: 0.9000 epochs : 603 Epoch 1/1 - 0s - loss: 0.3159 - acc: 0.8800 epochs : 604 Epoch 1/1 - 0s - loss: 0.2569 - acc: 0.9400 epochs : 605 Epoch 1/1 - 0s - loss: 0.1294 - acc: 0.9800 epochs : 606 Epoch 1/1 - 0s - loss: 0.0832 - acc: 1.0000 epochs : 607 Epoch 1/1 - 0s - loss: 0.0708 - acc: 1.0000 epochs : 608 Epoch 1/1 - 0s - loss: 0.0737 - acc: 1.0000 epochs : 609 Epoch 1/1 - 0s - loss: 0.0494 - acc: 1.0000 epochs : 610 Epoch 1/1 - 0s - loss: 0.0448 - acc: 1.0000 epochs : 611 Epoch 1/1 - 0s - loss: 0.0319 - acc: 1.0000 epochs : 612 Epoch 1/1 - 0s - loss: 0.0261 - acc: 1.0000 epochs : 613 Epoch 1/1 - 0s - loss: 0.0224 - acc: 1.0000 epochs : 614 Epoch 1/1 - 0s - loss: 0.0195 - acc: 1.0000 epochs : 615 Epoch 1/1 - 0s - loss: 0.0173 - acc: 1.0000 epochs : 616 Epoch 1/1 - 0s - loss: 0.0157 - acc: 1.0000 epochs : 617 Epoch 1/1 - 0s - loss: 
0.0144 - acc: 1.0000 epochs : 618 Epoch 1/1 - 0s - loss: 0.0133 - acc: 1.0000 epochs : 619 Epoch 1/1 - 0s - loss: 0.0124 - acc: 1.0000 epochs : 620 Epoch 1/1 - 0s - loss: 0.0116 - acc: 1.0000 epochs : 621 Epoch 1/1 - 0s - loss: 0.0108 - acc: 1.0000 epochs : 622 Epoch 1/1 - 0s - loss: 0.0101 - acc: 1.0000 epochs : 623 Epoch 1/1 - 0s - loss: 0.0096 - acc: 1.0000 epochs : 624 Epoch 1/1 - 0s - loss: 0.0091 - acc: 1.0000 epochs : 625 Epoch 1/1 - 0s - loss: 0.0087 - acc: 1.0000 epochs : 626 Epoch 1/1 - 0s - loss: 0.0084 - acc: 1.0000 epochs : 627 Epoch 1/1 - 0s - loss: 0.0081 - acc: 1.0000 epochs : 628 Epoch 1/1 - 0s - loss: 0.0076 - acc: 1.0000 epochs : 629 Epoch 1/1 - 0s - loss: 0.0072 - acc: 1.0000 epochs : 630 Epoch 1/1 - 0s - loss: 0.0067 - acc: 1.0000 epochs : 631 Epoch 1/1 - 0s - loss: 0.0063 - acc: 1.0000 epochs : 632 Epoch 1/1 - 0s - loss: 0.0062 - acc: 1.0000 epochs : 633 Epoch 1/1 - 0s - loss: 0.0060 - acc: 1.0000 epochs : 634 Epoch 1/1 - 0s - loss: 0.0061 - acc: 1.0000 epochs : 635 Epoch 1/1 - 0s - loss: 0.0060 - acc: 1.0000 epochs : 636 Epoch 1/1 - 0s - loss: 0.0120 - acc: 1.0000 epochs : 637 Epoch 1/1 - 0s - loss: 0.3687 - acc: 0.9200 epochs : 638 Epoch 1/1 - 0s - loss: 1.2781 - acc: 0.6200 epochs : 639 Epoch 1/1 - 0s - loss: 1.5236 - acc: 0.5200 epochs : 640 Epoch 1/1 - 0s - loss: 1.4114 - acc: 0.4600 epochs : 641 Epoch 1/1 - 0s - loss: 0.5934 - acc: 0.7600 epochs : 642 Epoch 1/1 - 0s - loss: 0.8176 - acc: 0.6600 epochs : 643 Epoch 1/1 - 0s - loss: 0.9879 - acc: 0.6000 epochs : 644 Epoch 1/1 - 0s - loss: 0.5935 - acc: 0.7400 epochs : 645 Epoch 1/1 - 0s - loss: 0.3836 - acc: 0.8800 epochs : 646 Epoch 1/1 - 0s - loss: 0.3763 - acc: 0.8600 epochs : 647 Epoch 1/1 - 0s - loss: 0.6614 - acc: 0.7000 epochs : 648 Epoch 1/1 - 0s - loss: 0.8816 - acc: 0.6800 epochs : 649 Epoch 1/1 - 0s - loss: 0.5274 - acc: 0.8200 epochs : 650 Epoch 1/1 - 0s - loss: 0.4703 - acc: 0.8200 epochs : 651 Epoch 1/1 - 0s - loss: 0.6095 - acc: 0.8200 epochs : 652 Epoch 1/1 - 0s - loss: 0.1888 - acc: 0.9800 epochs : 653 Epoch 1/1 - 0s - loss: 0.1744 - acc: 0.9600 epochs : 654 Epoch 1/1 - 0s - loss: 0.1046 - acc: 1.0000 epochs : 655 Epoch 1/1 - 0s - loss: 0.1338 - acc: 0.9600 epochs : 656 Epoch 1/1 - 0s - loss: 0.1084 - acc: 0.9800 epochs : 657 Epoch 1/1 - 0s - loss: 0.2525 - acc: 0.9000 epochs : 658 Epoch 1/1 - 0s - loss: 0.1563 - acc: 0.9600 epochs : 659 Epoch 1/1 - 0s - loss: 0.2258 - acc: 0.9400 epochs : 660 Epoch 1/1 - 0s - loss: 0.2005 - acc: 0.9000 epochs : 661 Epoch 1/1 - 0s - loss: 0.9073 - acc: 0.6400 epochs : 662 Epoch 1/1 - 0s - loss: 1.4935 - acc: 0.5400 epochs : 663 Epoch 1/1 - 0s - loss: 0.7582 - acc: 0.7200 epochs : 664 Epoch 1/1 - 0s - loss: 0.3473 - acc: 0.8400 epochs : 665 Epoch 1/1 - 0s - loss: 0.1956 - acc: 0.9600 epochs : 666 Epoch 1/1 - 0s - loss: 0.0668 - acc: 1.0000 epochs : 667 Epoch 1/1 - 0s - loss: 0.0586 - acc: 1.0000 epochs : 668 Epoch 1/1 - 0s - loss: 0.0351 - acc: 1.0000 epochs : 669 Epoch 1/1 - 0s - loss: 0.0277 - acc: 1.0000 epochs : 670 Epoch 1/1 - 0s - loss: 0.0229 - acc: 1.0000 epochs : 671 Epoch 1/1 - 0s - loss: 0.0198 - acc: 1.0000 epochs : 672 Epoch 1/1 - 0s - loss: 0.0173 - acc: 1.0000 epochs : 673 Epoch 1/1 - 0s - loss: 0.0153 - acc: 1.0000 epochs : 674 Epoch 1/1 - 0s - loss: 0.0138 - acc: 1.0000 epochs : 675 Epoch 1/1 - 0s - loss: 0.0126 - acc: 1.0000 epochs : 676 Epoch 1/1 - 0s - loss: 0.0115 - acc: 1.0000 epochs : 677 Epoch 1/1 - 0s - loss: 0.0105 - acc: 1.0000 epochs : 678 Epoch 1/1 - 0s - loss: 0.0098 - acc: 1.0000 epochs : 679 Epoch 1/1 - 0s - loss: 0.0091 - acc: 1.0000 
epochs : 680 Epoch 1/1 - 0s - loss: 0.0085 - acc: 1.0000 epochs : 681 Epoch 1/1 - 0s - loss: 0.0079 - acc: 1.0000 epochs : 682 Epoch 1/1 - 0s - loss: 0.0075 - acc: 1.0000 epochs : 683 Epoch 1/1 - 0s - loss: 0.0071 - acc: 1.0000 epochs : 684 Epoch 1/1 - 0s - loss: 0.0067 - acc: 1.0000 epochs : 685 Epoch 1/1 - 0s - loss: 0.0064 - acc: 1.0000 epochs : 686 Epoch 1/1 - 0s - loss: 0.0061 - acc: 1.0000 epochs : 687 Epoch 1/1 - 0s - loss: 0.0058 - acc: 1.0000 epochs : 688 Epoch 1/1 - 0s - loss: 0.0056 - acc: 1.0000 epochs : 689 Epoch 1/1 - 0s - loss: 0.0054 - acc: 1.0000 epochs : 690 Epoch 1/1 - 0s - loss: 0.0052 - acc: 1.0000 epochs : 691 Epoch 1/1 - 0s - loss: 0.0050 - acc: 1.0000 epochs : 692 Epoch 1/1 - 0s - loss: 0.0048 - acc: 1.0000 epochs : 693 Epoch 1/1 - 0s - loss: 0.0046 - acc: 1.0000 epochs : 694 Epoch 1/1 - 0s - loss: 0.0044 - acc: 1.0000 epochs : 695 Epoch 1/1 - 0s - loss: 0.0042 - acc: 1.0000 epochs : 696 Epoch 1/1 - 0s - loss: 0.0040 - acc: 1.0000 epochs : 697 Epoch 1/1 - 0s - loss: 0.0039 - acc: 1.0000 epochs : 698 Epoch 1/1 - 0s - loss: 0.0037 - acc: 1.0000 epochs : 699 Epoch 1/1 - 0s - loss: 0.0036 - acc: 1.0000 epochs : 700 Epoch 1/1 - 0s - loss: 0.0034 - acc: 1.0000 epochs : 701 Epoch 1/1 - 0s - loss: 0.0033 - acc: 1.0000 epochs : 702 Epoch 1/1 - 0s - loss: 0.0032 - acc: 1.0000 epochs : 703 Epoch 1/1 - 0s - loss: 0.0030 - acc: 1.0000 epochs : 704 Epoch 1/1 - 0s - loss: 0.0029 - acc: 1.0000 epochs : 705 Epoch 1/1 - 0s - loss: 0.0028 - acc: 1.0000 epochs : 706 Epoch 1/1 - 0s - loss: 0.0027 - acc: 1.0000 epochs : 707 Epoch 1/1 - 0s - loss: 0.0026 - acc: 1.0000 epochs : 708 Epoch 1/1 - 0s - loss: 0.0026 - acc: 1.0000 epochs : 709 Epoch 1/1 - 0s - loss: 0.0025 - acc: 1.0000 epochs : 710 Epoch 1/1 - 0s - loss: 0.0024 - acc: 1.0000 epochs : 711 Epoch 1/1 - 0s - loss: 0.0023 - acc: 1.0000 epochs : 712 Epoch 1/1 - 0s - loss: 0.0022 - acc: 1.0000 epochs : 713 Epoch 1/1 - 0s - loss: 0.0022 - acc: 1.0000 epochs : 714 Epoch 1/1 - 0s - loss: 0.0021 - acc: 1.0000 epochs : 715 Epoch 1/1 - 0s - loss: 0.0021 - acc: 1.0000 epochs : 716 Epoch 1/1 - 0s - loss: 0.0020 - acc: 1.0000 epochs : 717 Epoch 1/1 - 0s - loss: 0.0019 - acc: 1.0000 epochs : 718 Epoch 1/1 - 0s - loss: 0.0018 - acc: 1.0000 epochs : 719 Epoch 1/1 - 0s - loss: 0.0018 - acc: 1.0000 epochs : 720 Epoch 1/1 - 0s - loss: 0.0017 - acc: 1.0000 epochs : 721 Epoch 1/1 - 0s - loss: 0.0016 - acc: 1.0000 epochs : 722 Epoch 1/1 - 0s - loss: 0.0016 - acc: 1.0000 epochs : 723 Epoch 1/1 - 0s - loss: 0.0015 - acc: 1.0000 epochs : 724 Epoch 1/1 - 0s - loss: 0.0015 - acc: 1.0000 epochs : 725 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 726 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 727 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 728 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 729 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 730 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 731 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 732 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 733 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 734 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 735 Epoch 1/1 - 0s - loss: 0.0010 - acc: 1.0000 epochs : 736 Epoch 1/1 - 0s - loss: 9.8949e-04 - acc: 1.0000 epochs : 737 Epoch 1/1 - 0s - loss: 9.6045e-04 - acc: 1.0000 epochs : 738 Epoch 1/1 - 0s - loss: 9.3275e-04 - acc: 1.0000 epochs : 739 Epoch 1/1 - 0s - loss: 9.0728e-04 - acc: 1.0000 epochs : 740 Epoch 1/1 - 0s - loss: 8.8372e-04 - acc: 1.0000 epochs : 741 Epoch 1/1 - 0s - loss: 8.6075e-04 - acc: 
1.0000 epochs : 742 Epoch 1/1 - 0s - loss: 8.4015e-04 - acc: 1.0000 epochs : 743 Epoch 1/1 - 0s - loss: 8.1884e-04 - acc: 1.0000 epochs : 744 Epoch 1/1 - 0s - loss: 7.9906e-04 - acc: 1.0000 epochs : 745 Epoch 1/1 - 0s - loss: 7.8063e-04 - acc: 1.0000 epochs : 746 Epoch 1/1 - 0s - loss: 7.6466e-04 - acc: 1.0000 epochs : 747 Epoch 1/1 - 0s - loss: 7.5057e-04 - acc: 1.0000 epochs : 748 Epoch 1/1 - 0s - loss: 7.3797e-04 - acc: 1.0000 epochs : 749 Epoch 1/1 - 0s - loss: 7.2760e-04 - acc: 1.0000 epochs : 750 Epoch 1/1 - 0s - loss: 7.1890e-04 - acc: 1.0000 epochs : 751 Epoch 1/1 - 0s - loss: 7.0948e-04 - acc: 1.0000 epochs : 752 Epoch 1/1 - 0s - loss: 7.0524e-04 - acc: 1.0000 epochs : 753 Epoch 1/1 - 0s - loss: 7.0695e-04 - acc: 1.0000 epochs : 754 Epoch 1/1 - 0s - loss: 7.1290e-04 - acc: 1.0000 epochs : 755 Epoch 1/1 - 0s - loss: 7.2715e-04 - acc: 1.0000 epochs : 756 Epoch 1/1 - 0s - loss: 9.2435e-04 - acc: 1.0000 epochs : 757 Epoch 1/1 - 0s - loss: 7.2873e-04 - acc: 1.0000 epochs : 758 Epoch 1/1 - 0s - loss: 6.5710e-04 - acc: 1.0000 epochs : 759 Epoch 1/1 - 0s - loss: 0.0010 - acc: 1.0000 epochs : 760 Epoch 1/1 - 0s - loss: 0.1713 - acc: 0.9200 epochs : 761 Epoch 1/1 - 0s - loss: 1.8027 - acc: 0.4400 epochs : 762 Epoch 1/1 - 0s - loss: 0.7561 - acc: 0.7800 epochs : 763 Epoch 1/1 - 0s - loss: 1.2780 - acc: 0.5000 epochs : 764 Epoch 1/1 - 0s - loss: 0.8889 - acc: 0.6000 epochs : 765 Epoch 1/1 - 0s - loss: 0.6509 - acc: 0.7200 epochs : 766 Epoch 1/1 - 0s - loss: 0.5431 - acc: 0.8200 epochs : 767 Epoch 1/1 - 0s - loss: 0.5183 - acc: 0.8000 epochs : 768 Epoch 1/1 - 0s - loss: 0.3971 - acc: 0.8400 epochs : 769 Epoch 1/1 - 0s - loss: 0.2758 - acc: 0.9000 epochs : 770 Epoch 1/1 - 0s - loss: 0.2264 - acc: 0.9400 epochs : 771 Epoch 1/1 - 0s - loss: 0.2424 - acc: 0.9400 epochs : 772 Epoch 1/1 - 0s - loss: 0.1893 - acc: 0.9800 epochs : 773 Epoch 1/1 - 0s - loss: 0.1397 - acc: 0.9600 epochs : 774 Epoch 1/1 - 0s - loss: 0.1602 - acc: 0.9800 epochs : 775 Epoch 1/1 - 0s - loss: 0.2380 - acc: 0.9000 epochs : 776 Epoch 1/1 - 0s - loss: 1.5112 - acc: 0.5800 epochs : 777 Epoch 1/1 - 0s - loss: 0.7373 - acc: 0.7000 epochs : 778 Epoch 1/1 - 0s - loss: 0.3428 - acc: 0.9000 epochs : 779 Epoch 1/1 - 0s - loss: 0.2212 - acc: 0.9400 epochs : 780 Epoch 1/1 - 0s - loss: 0.2492 - acc: 0.9600 epochs : 781 Epoch 1/1 - 0s - loss: 0.1983 - acc: 0.9400 epochs : 782 Epoch 1/1 - 0s - loss: 0.3833 - acc: 0.8800 epochs : 783 Epoch 1/1 - 0s - loss: 0.8452 - acc: 0.7200 epochs : 784 Epoch 1/1 - 0s - loss: 0.6173 - acc: 0.7400 epochs : 785 Epoch 1/1 - 0s - loss: 0.7302 - acc: 0.7600 epochs : 786 Epoch 1/1 - 0s - loss: 0.3281 - acc: 0.8800 epochs : 787 Epoch 1/1 - 0s - loss: 0.1695 - acc: 0.9800 epochs : 788 Epoch 1/1 - 0s - loss: 0.1070 - acc: 0.9800 epochs : 789 Epoch 1/1 - 0s - loss: 0.4332 - acc: 0.9000 epochs : 790 Epoch 1/1 - 0s - loss: 0.1485 - acc: 1.0000 epochs : 791 Epoch 1/1 - 0s - loss: 0.2170 - acc: 0.9200 epochs : 792 Epoch 1/1 - 0s - loss: 0.1277 - acc: 0.9800 epochs : 793 Epoch 1/1 - 0s - loss: 0.1202 - acc: 0.9600 epochs : 794 Epoch 1/1 - 0s - loss: 0.0721 - acc: 1.0000 epochs : 795 Epoch 1/1 - 0s - loss: 0.0525 - acc: 1.0000 epochs : 796 Epoch 1/1 - 0s - loss: 0.0936 - acc: 0.9800 epochs : 797 Epoch 1/1 - 0s - loss: 0.1794 - acc: 0.9200 epochs : 798 Epoch 1/1 - 0s - loss: 0.1648 - acc: 0.9200 epochs : 799 Epoch 1/1 - 0s - loss: 0.1600 - acc: 0.9600 epochs : 800 Epoch 1/1 - 0s - loss: 0.2632 - acc: 0.9000 epochs : 801 Epoch 1/1 - 0s - loss: 0.4832 - acc: 0.8000 epochs : 802 Epoch 1/1 - 0s - loss: 0.9920 - acc: 0.7000 
epochs : 803 Epoch 1/1 - 0s - loss: 0.3116 - acc: 0.9200 epochs : 804 Epoch 1/1 - 0s - loss: 0.2119 - acc: 0.9600 epochs : 805 Epoch 1/1 - 0s - loss: 0.1559 - acc: 0.9400 epochs : 806 Epoch 1/1 - 0s - loss: 0.1002 - acc: 0.9800 epochs : 807 Epoch 1/1 - 0s - loss: 0.1088 - acc: 0.9600 epochs : 808 Epoch 1/1 - 0s - loss: 0.1032 - acc: 0.9800 epochs : 809 Epoch 1/1 - 0s - loss: 0.0450 - acc: 1.0000 epochs : 810 Epoch 1/1 - 0s - loss: 0.0303 - acc: 1.0000 epochs : 811 Epoch 1/1 - 0s - loss: 0.0245 - acc: 1.0000 epochs : 812 Epoch 1/1 - 0s - loss: 0.0249 - acc: 1.0000 epochs : 813 Epoch 1/1 - 0s - loss: 0.0167 - acc: 1.0000 epochs : 814 Epoch 1/1 - 0s - loss: 0.0134 - acc: 1.0000 epochs : 815 Epoch 1/1 - 0s - loss: 0.0118 - acc: 1.0000 epochs : 816 Epoch 1/1 - 0s - loss: 0.0104 - acc: 1.0000 epochs : 817 Epoch 1/1 - 0s - loss: 0.0102 - acc: 1.0000 epochs : 818 Epoch 1/1 - 0s - loss: 0.0106 - acc: 1.0000 epochs : 819 Epoch 1/1 - 0s - loss: 0.0082 - acc: 1.0000 epochs : 820 Epoch 1/1 - 0s - loss: 0.0074 - acc: 1.0000 epochs : 821 Epoch 1/1 - 0s - loss: 0.0082 - acc: 1.0000 epochs : 822 Epoch 1/1 - 0s - loss: 0.0074 - acc: 1.0000 epochs : 823 Epoch 1/1 - 0s - loss: 0.0071 - acc: 1.0000 epochs : 824 Epoch 1/1 - 0s - loss: 0.0066 - acc: 1.0000 epochs : 825 Epoch 1/1 - 0s - loss: 0.0060 - acc: 1.0000 epochs : 826 Epoch 1/1 - 0s - loss: 0.0091 - acc: 1.0000 epochs : 827 Epoch 1/1 - 0s - loss: 0.0079 - acc: 1.0000 epochs : 828 Epoch 1/1 - 0s - loss: 0.0101 - acc: 1.0000 epochs : 829 Epoch 1/1 - 0s - loss: 0.0069 - acc: 1.0000 epochs : 830 Epoch 1/1 - 0s - loss: 0.0088 - acc: 1.0000 epochs : 831 Epoch 1/1 - 0s - loss: 0.0899 - acc: 0.9800 epochs : 832 Epoch 1/1 - 0s - loss: 1.2267 - acc: 0.6200 epochs : 833 Epoch 1/1 - 0s - loss: 1.3149 - acc: 0.6000 epochs : 834 Epoch 1/1 - 0s - loss: 0.9482 - acc: 0.5800 epochs : 835 Epoch 1/1 - 0s - loss: 0.7932 - acc: 0.7000 epochs : 836 Epoch 1/1 - 0s - loss: 0.7569 - acc: 0.6800 epochs : 837 Epoch 1/1 - 0s - loss: 0.5271 - acc: 0.7600 epochs : 838 Epoch 1/1 - 0s - loss: 0.3753 - acc: 0.8800 epochs : 839 Epoch 1/1 - 0s - loss: 0.5031 - acc: 0.8400 epochs : 840 Epoch 1/1 - 0s - loss: 0.6714 - acc: 0.7800 epochs : 841 Epoch 1/1 - 0s - loss: 0.3604 - acc: 0.9200 epochs : 842 Epoch 1/1 - 0s - loss: 0.3276 - acc: 0.8600 epochs : 843 Epoch 1/1 - 0s - loss: 0.2254 - acc: 0.9200 epochs : 844 Epoch 1/1 - 0s - loss: 0.1347 - acc: 0.9600 epochs : 845 Epoch 1/1 - 0s - loss: 0.6357 - acc: 0.7800 epochs : 846 Epoch 1/1 - 0s - loss: 0.3377 - acc: 0.8800 epochs : 847 Epoch 1/1 - 0s - loss: 0.4831 - acc: 0.8000 epochs : 848 Epoch 1/1 - 0s - loss: 0.3953 - acc: 0.8600 epochs : 849 Epoch 1/1 - 0s - loss: 0.2241 - acc: 0.9600 epochs : 850 Epoch 1/1 - 0s - loss: 0.2195 - acc: 0.9200 epochs : 851 Epoch 1/1 - 0s - loss: 0.1723 - acc: 0.9400 epochs : 852 Epoch 1/1 - 0s - loss: 0.1067 - acc: 0.9800 epochs : 853 Epoch 1/1 - 0s - loss: 0.1828 - acc: 0.9400 epochs : 854 Epoch 1/1 - 0s - loss: 0.1144 - acc: 0.9800 epochs : 855 Epoch 1/1 - 0s - loss: 0.0856 - acc: 0.9600 epochs : 856 Epoch 1/1 - 0s - loss: 0.1228 - acc: 0.9600 epochs : 857 Epoch 1/1 - 0s - loss: 0.3354 - acc: 0.8600 epochs : 858 Epoch 1/1 - 0s - loss: 0.1789 - acc: 0.9200 epochs : 859 Epoch 1/1 - 0s - loss: 0.0781 - acc: 1.0000 epochs : 860 Epoch 1/1 - 0s - loss: 0.0543 - acc: 1.0000 epochs : 861 Epoch 1/1 - 0s - loss: 0.0246 - acc: 1.0000 epochs : 862 Epoch 1/1 - 0s - loss: 0.0161 - acc: 1.0000 epochs : 863 Epoch 1/1 - 0s - loss: 0.0128 - acc: 1.0000 epochs : 864 Epoch 1/1 - 0s - loss: 0.0112 - acc: 1.0000 epochs : 865 Epoch 
1/1 - 0s - loss: 0.0101 - acc: 1.0000 epochs : 866 Epoch 1/1 - 0s - loss: 0.0094 - acc: 1.0000 epochs : 867 Epoch 1/1 - 0s - loss: 0.0088 - acc: 1.0000 epochs : 868 Epoch 1/1 - 0s - loss: 0.0079 - acc: 1.0000 epochs : 869 Epoch 1/1 - 0s - loss: 0.0071 - acc: 1.0000 epochs : 870 Epoch 1/1 - 0s - loss: 0.0065 - acc: 1.0000 epochs : 871 Epoch 1/1 - 0s - loss: 0.0059 - acc: 1.0000 epochs : 872 Epoch 1/1 - 0s - loss: 0.0053 - acc: 1.0000 epochs : 873 Epoch 1/1 - 0s - loss: 0.0052 - acc: 1.0000 epochs : 874 Epoch 1/1 - 0s - loss: 0.0047 - acc: 1.0000 epochs : 875 Epoch 1/1 - 0s - loss: 0.0044 - acc: 1.0000 epochs : 876 Epoch 1/1 - 0s - loss: 0.0042 - acc: 1.0000 epochs : 877 Epoch 1/1 - 0s - loss: 0.0040 - acc: 1.0000 epochs : 878 Epoch 1/1 - 0s - loss: 0.0038 - acc: 1.0000 epochs : 879 Epoch 1/1 - 0s - loss: 0.0037 - acc: 1.0000 epochs : 880 Epoch 1/1 - 0s - loss: 0.0035 - acc: 1.0000 epochs : 881 Epoch 1/1 - 0s - loss: 0.0033 - acc: 1.0000 epochs : 882 Epoch 1/1 - 0s - loss: 0.0031 - acc: 1.0000 epochs : 883 Epoch 1/1 - 0s - loss: 0.0029 - acc: 1.0000 epochs : 884 Epoch 1/1 - 0s - loss: 0.0028 - acc: 1.0000 epochs : 885 Epoch 1/1 - 0s - loss: 0.0027 - acc: 1.0000 epochs : 886 Epoch 1/1 - 0s - loss: 0.0025 - acc: 1.0000 epochs : 887 Epoch 1/1 - 0s - loss: 0.0024 - acc: 1.0000 epochs : 888 Epoch 1/1 - 0s - loss: 0.0023 - acc: 1.0000 epochs : 889 Epoch 1/1 - 0s - loss: 0.0023 - acc: 1.0000 epochs : 890 Epoch 1/1 - 0s - loss: 0.0022 - acc: 1.0000 epochs : 891 Epoch 1/1 - 0s - loss: 0.0021 - acc: 1.0000 epochs : 892 Epoch 1/1 - 0s - loss: 0.0021 - acc: 1.0000 epochs : 893 Epoch 1/1 - 0s - loss: 0.0020 - acc: 1.0000 epochs : 894 Epoch 1/1 - 0s - loss: 0.0019 - acc: 1.0000 epochs : 895 Epoch 1/1 - 0s - loss: 0.0019 - acc: 1.0000 epochs : 896 Epoch 1/1 - 0s - loss: 0.0018 - acc: 1.0000 epochs : 897 Epoch 1/1 - 0s - loss: 0.0017 - acc: 1.0000 epochs : 898 Epoch 1/1 - 0s - loss: 0.0017 - acc: 1.0000 epochs : 899 Epoch 1/1 - 0s - loss: 0.0016 - acc: 1.0000 epochs : 900 Epoch 1/1 - 0s - loss: 0.0016 - acc: 1.0000 epochs : 901 Epoch 1/1 - 0s - loss: 0.0015 - acc: 1.0000 epochs : 902 Epoch 1/1 - 0s - loss: 0.0015 - acc: 1.0000 epochs : 903 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 904 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 905 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 906 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 907 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 908 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 909 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 910 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 911 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 912 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 913 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 914 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 915 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 916 Epoch 1/1 - 0s - loss: 0.0010 - acc: 1.0000 epochs : 917 Epoch 1/1 - 0s - loss: 9.8980e-04 - acc: 1.0000 epochs : 918 Epoch 1/1 - 0s - loss: 9.5783e-04 - acc: 1.0000 epochs : 919 Epoch 1/1 - 0s - loss: 9.1305e-04 - acc: 1.0000 epochs : 920 Epoch 1/1 - 0s - loss: 8.6899e-04 - acc: 1.0000 epochs : 921 Epoch 1/1 - 0s - loss: 8.3806e-04 - acc: 1.0000 epochs : 922 Epoch 1/1 - 0s - loss: 8.6227e-04 - acc: 1.0000 epochs : 923 Epoch 1/1 - 0s - loss: 8.8180e-04 - acc: 1.0000 epochs : 924 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 925 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 926 Epoch 1/1 - 0s - loss: 0.1989 - acc: 0.9600 epochs : 
927 Epoch 1/1 - 0s - loss: 0.8366 - acc: 0.7400 epochs : 928 Epoch 1/1 - 0s - loss: 1.4481 - acc: 0.6600 epochs : 929 Epoch 1/1 - 0s - loss: 1.2438 - acc: 0.6400 epochs : 930 Epoch 1/1 - 0s - loss: 0.9706 - acc: 0.6800 epochs : 931 Epoch 1/1 - 0s - loss: 0.6459 - acc: 0.7000 epochs : 932 Epoch 1/1 - 0s - loss: 0.6664 - acc: 0.7200 epochs : 933 Epoch 1/1 - 0s - loss: 0.4680 - acc: 0.8000 epochs : 934 Epoch 1/1 - 0s - loss: 0.4230 - acc: 0.8200 epochs : 935 Epoch 1/1 - 0s - loss: 0.3708 - acc: 0.8800 epochs : 936 Epoch 1/1 - 0s - loss: 0.2730 - acc: 0.8800 epochs : 937 Epoch 1/1 - 0s - loss: 0.3551 - acc: 0.8800 epochs : 938 Epoch 1/1 - 0s - loss: 0.4283 - acc: 0.8200 epochs : 939 Epoch 1/1 - 0s - loss: 0.2350 - acc: 0.9200 epochs : 940 Epoch 1/1 - 0s - loss: 0.1269 - acc: 1.0000 epochs : 941 Epoch 1/1 - 0s - loss: 0.3875 - acc: 0.8000 epochs : 942 Epoch 1/1 - 0s - loss: 0.1998 - acc: 0.9200 epochs : 943 Epoch 1/1 - 0s - loss: 0.1701 - acc: 0.9400 epochs : 944 Epoch 1/1 - 0s - loss: 0.6848 - acc: 0.7800 epochs : 945 Epoch 1/1 - 0s - loss: 0.2054 - acc: 0.9600 epochs : 946 Epoch 1/1 - 0s - loss: 0.2386 - acc: 0.8800 epochs : 947 Epoch 1/1 - 0s - loss: 0.4643 - acc: 0.8200 epochs : 948 Epoch 1/1 - 0s - loss: 0.1141 - acc: 0.9800 epochs : 949 Epoch 1/1 - 0s - loss: 0.0983 - acc: 0.9800 epochs : 950 Epoch 1/1 - 0s - loss: 0.0657 - acc: 1.0000 epochs : 951 Epoch 1/1 - 0s - loss: 0.0336 - acc: 1.0000 epochs : 952 Epoch 1/1 - 0s - loss: 0.0309 - acc: 1.0000 epochs : 953 Epoch 1/1 - 0s - loss: 0.0266 - acc: 1.0000 epochs : 954 Epoch 1/1 - 0s - loss: 0.0157 - acc: 1.0000 epochs : 955 Epoch 1/1 - 0s - loss: 0.0136 - acc: 1.0000 epochs : 956 Epoch 1/1 - 0s - loss: 0.0121 - acc: 1.0000 epochs : 957 Epoch 1/1 - 0s - loss: 0.0110 - acc: 1.0000 epochs : 958 Epoch 1/1 - 0s - loss: 0.0102 - acc: 1.0000 epochs : 959 Epoch 1/1 - 0s - loss: 0.0096 - acc: 1.0000 epochs : 960 Epoch 1/1 - 0s - loss: 0.0085 - acc: 1.0000 epochs : 961 Epoch 1/1 - 0s - loss: 0.0086 - acc: 1.0000 epochs : 962 Epoch 1/1 - 0s - loss: 0.0085 - acc: 1.0000 epochs : 963 Epoch 1/1 - 0s - loss: 0.0071 - acc: 1.0000 epochs : 964 Epoch 1/1 - 0s - loss: 0.0064 - acc: 1.0000 epochs : 965 Epoch 1/1 - 0s - loss: 0.0065 - acc: 1.0000 epochs : 966 Epoch 1/1 - 0s - loss: 0.0061 - acc: 1.0000 epochs : 967 Epoch 1/1 - 0s - loss: 0.0058 - acc: 1.0000 epochs : 968 Epoch 1/1 - 0s - loss: 0.0050 - acc: 1.0000 epochs : 969 Epoch 1/1 - 0s - loss: 0.0046 - acc: 1.0000 epochs : 970 Epoch 1/1 - 0s - loss: 0.0044 - acc: 1.0000 epochs : 971 Epoch 1/1 - 0s - loss: 0.0041 - acc: 1.0000 epochs : 972 Epoch 1/1 - 0s - loss: 0.0039 - acc: 1.0000 epochs : 973 Epoch 1/1 - 0s - loss: 0.0037 - acc: 1.0000 epochs : 974 Epoch 1/1 - 0s - loss: 0.0035 - acc: 1.0000 epochs : 975 Epoch 1/1 - 0s - loss: 0.0034 - acc: 1.0000 epochs : 976 Epoch 1/1 - 0s - loss: 0.0032 - acc: 1.0000 epochs : 977 Epoch 1/1 - 0s - loss: 0.0031 - acc: 1.0000 epochs : 978 Epoch 1/1 - 0s - loss: 0.0030 - acc: 1.0000 epochs : 979 Epoch 1/1 - 0s - loss: 0.0028 - acc: 1.0000 epochs : 980 Epoch 1/1 - 0s - loss: 0.0027 - acc: 1.0000 epochs : 981 Epoch 1/1 - 0s - loss: 0.0027 - acc: 1.0000 epochs : 982 Epoch 1/1 - 0s - loss: 0.0025 - acc: 1.0000 epochs : 983 Epoch 1/1 - 0s - loss: 0.0374 - acc: 0.9800 epochs : 984 Epoch 1/1 - 0s - loss: 1.6858 - acc: 0.5800 epochs : 985 Epoch 1/1 - 0s - loss: 1.6475 - acc: 0.5000 epochs : 986 Epoch 1/1 - 0s - loss: 1.2539 - acc: 0.5000 epochs : 987 Epoch 1/1 - 0s - loss: 0.7369 - acc: 0.8000 epochs : 988 Epoch 1/1 - 0s - loss: 0.5167 - acc: 0.8200 epochs : 989 Epoch 1/1 - 0s - 
loss: 1.0593 - acc: 0.7800 epochs : 990 Epoch 1/1 - 0s - loss: 1.4038 - acc: 0.5400 epochs : 991 Epoch 1/1 - 0s - loss: 0.6676 - acc: 0.7200 epochs : 992 Epoch 1/1 - 0s - loss: 0.4695 - acc: 0.8200 epochs : 993 Epoch 1/1 - 0s - loss: 0.4903 - acc: 0.8600 epochs : 994 Epoch 1/1 - 0s - loss: 0.2694 - acc: 0.9400 epochs : 995 Epoch 1/1 - 0s - loss: 0.2568 - acc: 0.9600 epochs : 996 Epoch 1/1 - 0s - loss: 0.2463 - acc: 0.9200 epochs : 997 Epoch 1/1 - 0s - loss: 0.1801 - acc: 0.9400 epochs : 998 Epoch 1/1 - 0s - loss: 0.1493 - acc: 0.9800 epochs : 999 Epoch 1/1 - 0s - loss: 0.1061 - acc: 0.9800 epochs : 1000 Epoch 1/1 - 0s - loss: 0.1656 - acc: 0.9800 epochs : 1001 Epoch 1/1 - 0s - loss: 0.1031 - acc: 0.9800 epochs : 1002 Epoch 1/1 - 0s - loss: 0.1367 - acc: 0.9600 epochs : 1003 Epoch 1/1 - 0s - loss: 0.2281 - acc: 0.9400 epochs : 1004 Epoch 1/1 - 0s - loss: 0.1905 - acc: 0.9400 epochs : 1005 Epoch 1/1 - 0s - loss: 0.0446 - acc: 1.0000 epochs : 1006 Epoch 1/1 - 0s - loss: 0.0313 - acc: 1.0000 epochs : 1007 Epoch 1/1 - 0s - loss: 0.0321 - acc: 1.0000 epochs : 1008 Epoch 1/1 - 0s - loss: 0.0208 - acc: 1.0000 epochs : 1009 Epoch 1/1 - 0s - loss: 0.0226 - acc: 1.0000 epochs : 1010 Epoch 1/1 - 0s - loss: 0.0204 - acc: 1.0000 epochs : 1011 Epoch 1/1 - 0s - loss: 0.0230 - acc: 1.0000 epochs : 1012 Epoch 1/1 - 0s - loss: 0.0275 - acc: 1.0000 epochs : 1013 Epoch 1/1 - 0s - loss: 0.0411 - acc: 0.9800 epochs : 1014 Epoch 1/1 - 0s - loss: 0.0179 - acc: 1.0000 epochs : 1015 Epoch 1/1 - 0s - loss: 0.0147 - acc: 1.0000 epochs : 1016 Epoch 1/1 - 0s - loss: 0.0168 - acc: 1.0000 epochs : 1017 Epoch 1/1 - 0s - loss: 0.0125 - acc: 1.0000 epochs : 1018 Epoch 1/1 - 0s - loss: 0.0132 - acc: 1.0000 epochs : 1019 Epoch 1/1 - 0s - loss: 0.0104 - acc: 1.0000 epochs : 1020 Epoch 1/1 - 0s - loss: 0.0081 - acc: 1.0000 epochs : 1021 Epoch 1/1 - 0s - loss: 0.0073 - acc: 1.0000 epochs : 1022 Epoch 1/1 - 0s - loss: 0.0064 - acc: 1.0000 epochs : 1023 Epoch 1/1 - 0s - loss: 0.0060 - acc: 1.0000 epochs : 1024 Epoch 1/1 - 0s - loss: 0.0054 - acc: 1.0000 epochs : 1025 Epoch 1/1 - 0s - loss: 0.0050 - acc: 1.0000 epochs : 1026 Epoch 1/1 - 0s - loss: 0.0047 - acc: 1.0000 epochs : 1027 Epoch 1/1 - 0s - loss: 0.0044 - acc: 1.0000 epochs : 1028 Epoch 1/1 - 0s - loss: 0.0042 - acc: 1.0000 epochs : 1029 Epoch 1/1 - 0s - loss: 0.0040 - acc: 1.0000 epochs : 1030 Epoch 1/1 - 0s - loss: 0.0038 - acc: 1.0000 epochs : 1031 Epoch 1/1 - 0s - loss: 0.0036 - acc: 1.0000 epochs : 1032 Epoch 1/1 - 0s - loss: 0.0035 - acc: 1.0000 epochs : 1033 Epoch 1/1 - 0s - loss: 0.0033 - acc: 1.0000 epochs : 1034 Epoch 1/1 - 0s - loss: 0.0031 - acc: 1.0000 epochs : 1035 Epoch 1/1 - 0s - loss: 0.0030 - acc: 1.0000 epochs : 1036 Epoch 1/1 - 0s - loss: 0.0029 - acc: 1.0000 epochs : 1037 Epoch 1/1 - 0s - loss: 0.0027 - acc: 1.0000 epochs : 1038 Epoch 1/1 - 0s - loss: 0.0026 - acc: 1.0000 epochs : 1039 Epoch 1/1 - 0s - loss: 0.0025 - acc: 1.0000 epochs : 1040 Epoch 1/1 - 0s - loss: 0.0024 - acc: 1.0000 epochs : 1041 Epoch 1/1 - 0s - loss: 0.0023 - acc: 1.0000 epochs : 1042 Epoch 1/1 - 0s - loss: 0.0022 - acc: 1.0000 epochs : 1043 Epoch 1/1 - 0s - loss: 0.0021 - acc: 1.0000 epochs : 1044 Epoch 1/1 - 0s - loss: 0.0020 - acc: 1.0000 epochs : 1045 Epoch 1/1 - 0s - loss: 0.0019 - acc: 1.0000 epochs : 1046 Epoch 1/1 - 0s - loss: 0.0018 - acc: 1.0000 epochs : 1047 Epoch 1/1 - 0s - loss: 0.0018 - acc: 1.0000 epochs : 1048 Epoch 1/1 - 0s - loss: 0.0017 - acc: 1.0000 epochs : 1049 Epoch 1/1 - 0s - loss: 0.0016 - acc: 1.0000 epochs : 1050 Epoch 1/1 - 0s - loss: 0.0016 - acc: 1.0000 
epochs : 1051 Epoch 1/1 - 0s - loss: 0.0015 - acc: 1.0000 epochs : 1052 Epoch 1/1 - 0s - loss: 0.0015 - acc: 1.0000 epochs : 1053 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 1054 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 1055 Epoch 1/1 - 0s - loss: 0.0014 - acc: 1.0000 epochs : 1056 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 1057 Epoch 1/1 - 0s - loss: 0.0013 - acc: 1.0000 epochs : 1058 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 1059 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 1060 Epoch 1/1 - 0s - loss: 0.0012 - acc: 1.0000 epochs : 1061 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 1062 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 1063 Epoch 1/1 - 0s - loss: 0.0011 - acc: 1.0000 epochs : 1064 Epoch 1/1 - 0s - loss: 0.0010 - acc: 1.0000 epochs : 1065 Epoch 1/1 - 0s - loss: 9.9535e-04 - acc: 1.0000 epochs : 1066 Epoch 1/1 - 0s - loss: 9.6417e-04 - acc: 1.0000 epochs : 1067 Epoch 1/1 - 0s - loss: 9.3561e-04 - acc: 1.0000 epochs : 1068 Epoch 1/1 - 0s - loss: 9.0942e-04 - acc: 1.0000 epochs : 1069 Epoch 1/1 - 0s - loss: 8.8690e-04 - acc: 1.0000 epochs : 1070 Epoch 1/1 - 0s - loss: 8.6595e-04 - acc: 1.0000 epochs : 1071 Epoch 1/1 - 0s - loss: 8.4577e-04 - acc: 1.0000 epochs : 1072 Epoch 1/1 - 0s - loss: 8.2860e-04 - acc: 1.0000 epochs : 1073 Epoch 1/1 - 0s - loss: 8.1110e-04 - acc: 1.0000 epochs : 1074 Epoch 1/1 - 0s - loss: 7.9616e-04 - acc: 1.0000 epochs : 1075 Epoch 1/1 - 0s - loss: 7.8113e-04 - acc: 1.0000 epochs : 1076 Epoch 1/1 - 0s - loss: 7.6504e-04 - acc: 1.0000 epochs : 1077 Epoch 1/1 - 0s - loss: 7.4776e-04 - acc: 1.0000 epochs : 1078 Epoch 1/1 - 0s - loss: 7.3060e-04 - acc: 1.0000 epochs : 1079 Epoch 1/1 - 0s - loss: 7.1025e-04 - acc: 1.0000 epochs : 1080 Epoch 1/1 - 0s - loss: 6.9184e-04 - acc: 1.0000 epochs : 1081 Epoch 1/1 - 0s - loss: 6.7384e-04 - acc: 1.0000 epochs : 1082 Epoch 1/1 - 0s - loss: 6.4998e-04 - acc: 1.0000 epochs : 1083 Epoch 1/1 - 0s - loss: 6.2037e-04 - acc: 1.0000 epochs : 1084 Epoch 1/1 - 0s - loss: 5.9854e-04 - acc: 1.0000 epochs : 1085 Epoch 1/1 - 0s - loss: 5.8242e-04 - acc: 1.0000 epochs : 1086 Epoch 1/1 - 0s - loss: 5.6781e-04 - acc: 1.0000 epochs : 1087 Epoch 1/1 - 0s - loss: 5.6083e-04 - acc: 1.0000 epochs : 1088 Epoch 1/1 - 0s - loss: 5.5858e-04 - acc: 1.0000 epochs : 1089 Epoch 1/1 - 0s - loss: 5.5889e-04 - acc: 1.0000 epochs : 1090 Epoch 1/1 - 0s - loss: 5.8317e-04 - acc: 1.0000 epochs : 1091 Epoch 1/1 - 0s - loss: 0.0096 - acc: 1.0000 epochs : 1092 Epoch 1/1 - 0s - loss: 1.5098 - acc: 0.6200 epochs : 1093 Epoch 1/1 - 0s - loss: 1.5549 - acc: 0.5000 epochs : 1094 Epoch 1/1 - 0s - loss: 2.0592 - acc: 0.4400 epochs : 1095 Epoch 1/1 - 0s - loss: 0.7314 - acc: 0.7200 epochs : 1096 Epoch 1/1 - 0s - loss: 0.4795 - acc: 0.8400 epochs : 1097 Epoch 1/1 - 0s - loss: 0.2935 - acc: 0.9000 epochs : 1098 Epoch 1/1 - 0s - loss: 0.2794 - acc: 0.9000 epochs : 1099 Epoch 1/1 - 0s - loss: 0.8932 - acc: 0.6800 epochs : 1100 Epoch 1/1 - 0s - loss: 1.5104 - acc: 0.5400 epochs : 1101 Epoch 1/1 - 0s - loss: 0.3902 - acc: 0.8600 epochs : 1102 Epoch 1/1 - 0s - loss: 0.2949 - acc: 0.9200 epochs : 1103 Epoch 1/1 - 0s - loss: 0.1890 - acc: 0.9800 epochs : 1104 Epoch 1/1 - 0s - loss: 0.1017 - acc: 1.0000 epochs : 1105 Epoch 1/1 - 0s - loss: 0.0952 - acc: 0.9800 epochs : 1106 Epoch 1/1 - 0s - loss: 0.0860 - acc: 1.0000 epochs : 1107 Epoch 1/1 - 0s - loss: 0.0471 - acc: 1.0000 epochs : 1108 Epoch 1/1 - 0s - loss: 0.0305 - acc: 1.0000 epochs : 1109 Epoch 1/1 - 0s - loss: 0.0251 - acc: 1.0000 epochs : 1110 Epoch 1/1 - 0s 
- loss: 0.0233 - acc: 1.0000 epochs : 1111 Epoch 1/1 - 0s - loss: 0.0237 - acc: 1.0000
... (training log for epochs 1112-1998 abridged: the loss repeatedly decays from ~1e-2 down to the 1e-4 level with acc: 1.0000, and intermittently spikes above 1.0 with accuracy dropping as low as 0.28 before re-converging)
epochs : 1999 Epoch 1/1 - 0s - loss: 0.0082 - acc: 1.0000
50/50 [==============================] - 0s 3ms/step
acc: 100.00%
one step prediction : ['g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'd8', 'e8', 'f8', 'g8', 'g8', 'g4', 'g8', 'e8', 'e8', 'e8', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4', 'd8', 'd8', 'd8', 'd8', 'd8', 'e8', 'f4', 'e8', 'e8', 'e8', 'e8', 'e8', 'f8', 'g4', 'g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4']
full song prediction : ['g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'd8', 'e8', 'f8', 'g8', 'g8', 'g4', 'g8', 'e8', 'e8', 'e8', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4', 'd8', 'd8', 'd8', 'd8', 'd8', 'e8', 'f4', 'e8', 'e8', 'e8', 'e8', 'e8', 'f8', 'g4', 'g8', 'e8', 'e4', 'f8', 'd8', 'd4', 'c8', 'e8', 'g8', 'g8', 'e8', 'e8', 'e4']
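To make the two outputs above concrete: one-step prediction always feeds the model the 4 true previous notes, while full-song prediction feeds the model's own previous predictions back in. A minimal sketch of the full-song loop, assuming the trained model above and one-hot inputs of shape (1, 4, 14); adapt the reshaping if the model was actually trained on scaled integer inputs instead:
# full-song generation sketch: feed predictions back in (assumes one-hot inputs)
seq_in = ['g8', 'e8', 'e4', 'f8']  # seed with the first 4 true notes
seq_out = list(seq_in)
seq_idx = [code2idx[c] for c in seq_in]
for _ in range(50):  # generate the remaining 50 notes
    x_oh = np_utils.to_categorical(seq_idx, num_classes=14)
    x_oh = np.reshape(x_oh, (1, 4, 14))  # (samples, time steps, features)
    idx = int(np.argmax(model.predict(x_oh), axis=-1)[0])
    seq_out.append(idx2code[idx])
    seq_idx = seq_idx[1:] + [idx]  # slide the window: drop the oldest, append the prediction
print("full song prediction :", seq_out)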
Peephole connections are a variant of the LSTM proposed in the 2000 paper "Recurrent Nets that Time and Count" (Gers & Schmidhuber).
In the standard LSTM, the sigmoid gate layers take only ht-1 and xt as input; with peepholes they also receive the cell state ct-1.
This lets the gates condition on more context.
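In equation form, the three gate equations with peepholes added (note that the output gate peeks at the freshly updated state ct rather than ct-1):
ft = σ(Wf · [ct-1, ht-1, xt] + bf)
it = σ(Wi · [ct-1, ht-1, xt] + bi)
ot = σ(Wo · [ct, ht-1, xt] + bo)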
import tensorflow as tf
import numpy as np
import pandas as pd
import datetime
import matplotlib.pyplot as plt
# Fix the random seed so runs are reproducible
# (useful when tuning hyperparameters: if results fluctuate, it is hard to tell what caused an improvement)
tf.set_random_seed(777)
# Standardization (zero mean, unit variance); defined for reference, this script uses min-max scaling below
def data_standardization(x):
    x_np = np.asarray(x)
    return (x_np - x_np.mean()) / x_np.std()
# Normalize so that extremely small or large values do not hinder training.
# Assuming x is non-negative, map it to the 0~1 range using its min and max.
# Min-Max scaling
def min_max_scaling(x):
    x_np = np.asarray(x)
    return (x_np - x_np.min()) / (x_np.max() - x_np.min() + 1e-7)  # 1e-7 guards against division by zero
# Undo the normalization back to the original scale.
# Given the pre-normalization data org_x and the values x to restore, returns the de-normalized values.
def reverse_min_max_scaling(org_x, x):
    org_x_np = np.asarray(org_x)
    x_np = np.asarray(x)
    return (x_np * (org_x_np.max() - org_x_np.min() + 1e-7)) + org_x_np.min()
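# Quick round-trip sanity check (hypothetical toy values, not part of the original script):
#   min_max_scaling([1.0, 2.0, 3.0])                    -> approximately [0.0, 0.5, 1.0]
#   reverse_min_max_scaling([1.0, 2.0, 3.0], [0,.5,1])  -> approximately [1.0, 2.0, 3.0]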
# Hyperparameters
input_data_column_cnt = 6  # number of input columns (number of variables)
output_data_column_cnt = 1  # number of output columns
seq_length = 28  # length of one sequence (number of time-series inputs)
rnn_cell_hidden_dim = 20  # (hidden) output size of each cell
forget_bias = 1.0  # forget bias (default 1.0)
num_stacked_layers = 1  # number of stacked LSTM layers
keep_prob = 1.0  # keep probability for dropout
epoch_num = 1000  # number of epochs (passes over the full training data)
learning_rate = 0.01  # learning rate
# Load the data.
stock_file_name = 'AMZN.csv'  # Amazon stock price data file
encoding = 'euc-kr'  # character encoding
names = ['Date','Open','High','Low','Close','Adj Close','Volume']
raw_dataframe = pd.read_csv(stock_file_name, names=names, encoding=encoding)  # load the CSV with pandas
raw_dataframe.info()  # print data info
# raw_dataframe.drop('Date', axis=1, inplace=True)  # drop the Date column without recreating the dataframe
del raw_dataframe['Date']  # same effect as the line above
stock_info = raw_dataframe.values[1:].astype(float)  # skip the header row and convert the price & volume strings to floats
print("stock_info.shape: ", stock_info.shape)
print("stock_info[0]: ", stock_info[0])
# Data preprocessing
# Prices and trading volume differ by orders of magnitude, so normalize them separately.
# Normalize the price-type columns.
# From ['Open','High','Low','Close','Adj Close','Volume'], take everything up to 'Adj Close',
# i.e. every column except the last one, 'Volume'.
price = stock_info[:,:-1]
norm_price = min_max_scaling(price)  # normalize the price-type data
print("price.shape: ", price.shape)
print("price[0]: ", price[0])
print("norm_price[0]: ", norm_price[0])
print("="*100)  # visual separator
# Normalize the volume-type data.
# From ['Open','High','Low','Close','Adj Close','Volume'], take only the last column, 'Volume'.
# Note [:,-1:] rather than [:,-1]: we need a 2-D column, not a 1-D vector, to merge it easily.
volume = stock_info[:,-1:]
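# e.g. stock_info[:, -1].shape == (5178,) (1-D), whereas stock_info[:, -1:].shape == (5178, 1);
# keeping it 2-D lets np.concatenate join it with norm_price along axis=1 below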
norm_volume = min_max_scaling(volume)  # normalize the volume-type data
print("volume.shape: ", volume.shape)
print("volume[0]: ", volume[0])
print("norm_volume[0]: ", norm_volume[0])
print("="*100) # 화면상 구분용
# Join the columns side by side (rows unchanged)
x = np.concatenate((norm_price, norm_volume), axis=1)  # axis=1: append the volume column to the right of the price columns
print("x.shape: ", x.shape)
print("x[0]: ", x[0])  # first row of x
print("x[-1]: ", x[-1])  # last row of x
print("="*100)  # visual separator
y = x[:, [-2]]  # the target is the (adjusted) closing price, i.e. the 'Adj Close' column
print("y[0]: ",y[0])  # first value of y
print("y[-1]: ",y[-1])  # last value of y
dataX = []  # sequences used as input
dataY = []  # values used as output (targets)
for i in range(0, len(y) - seq_length):
    _x = x[i : i+seq_length]
    _y = y[i + seq_length]  # the next price after the sequence (the answer)
    if i == 0:
        print(_x, "->", _y)  # print only the first window as a check
    dataX.append(_x)  # append to the dataX list
    dataY.append(_y)  # append to the dataY list
# Build the training/test sets
# 70% of the data is used for training
train_size = int(len(dataY) * 0.7)
# the remaining 30% is used for testing
test_size = len(dataY) - train_size
# slice out the training data
trainX = np.array(dataX[0:train_size])
trainY = np.array(dataY[0:train_size])
# slice out the test data
testX = np.array(dataX[train_size:len(dataX)])
testY = np.array(dataY[train_size:len(dataY)])
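# With 5178 rows and seq_length=28: len(dataY) == 5150 and train_size == 3605,
# so trainX.shape == (3605, 28, 6) and testX.shape == (1545, 28, 6)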
# TensorFlow placeholders for the input X and output Y
X = tf.placeholder(tf.float32, [None, seq_length, input_data_column_cnt])
print("X: ", X)
Y = tf.placeholder(tf.float32, [None, 1])
print("Y: ", Y)
# placeholders targets/predictions, used to compute the validation metric (RMSE)
targets = tf.placeholder(tf.float32, [None, 1])
print("targets: ", targets)
predictions = tf.placeholder(tf.float32, [None, 1])
print("predictions: ", predictions)
# Build the model (LSTM network)
def lstm_cell():
    # Create one LSTM cell
    # num_units: output size of each cell
    # forget_bias: added to the biases of the forget gate
    #   (default 1.0) to reduce the scale of forgetting at the beginning of training
    # state_is_tuple=True: accepted and returned states are 2-tuples of the c_state and m_state
    #   (False: they are concatenated along the column axis)
    # cell = tf.contrib.rnn.BasicLSTMCell(num_units=rnn_cell_hidden_dim,
    #     forget_bias=forget_bias, state_is_tuple=True, activation=tf.nn.softsign)
    cell = tf.contrib.rnn.LSTMCell(num_units=rnn_cell_hidden_dim,
        forget_bias=forget_bias, state_is_tuple=True, activation=tf.nn.softsign,
        use_peepholes=True)  # use_peepholes=True enables the peephole connections described above
    if keep_prob < 1.0:
        cell = tf.contrib.rnn.DropoutWrapper(cell, output_keep_prob=keep_prob)
    return cell
# Build a stacked RNN with num_stacked_layers layers
stackedRNNs = [lstm_cell() for _ in range(num_stacked_layers)]
multi_cells = tf.contrib.rnn.MultiRNNCell(stackedRNNs, state_is_tuple=True) if num_stacked_layers > 1 else lstm_cell()
# Wire the RNN cells (LSTM cells here) together over time
hypothesis, _states = tf.nn.dynamic_rnn(multi_cells, X, dtype=tf.float32)
print("hypothesis: ", hypothesis)
# Note the [:, -1]: only the last (hidden) output of the LSTM is used.
# We predict a single next-day price from several past trading days, i.e. a MANY-TO-ONE setup.
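# Shapes: the dynamic_rnn output is (batch, seq_length, rnn_cell_hidden_dim);
# hypothesis[:, -1] keeps only the last time step -> (batch, rnn_cell_hidden_dim);
# fully_connected then maps it to (batch, output_data_column_cnt) == (batch, 1)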
hypothesis = tf.contrib.layers.fully_connected(hypothesis[:, -1], output_data_column_cnt, activation_fn=tf.identity)
# Loss: sum of squared errors (minimizing it is equivalent to minimizing the mean squared error)
loss = tf.reduce_sum(tf.square(hypothesis - Y))
# Optimizer: Adam
optimizer = tf.train.AdamOptimizer(learning_rate)
# optimizer = tf.train.RMSPropOptimizer(learning_rate)  # tends to pair poorly with LSTM here
train = optimizer.minimize(loss)
# RMSE (Root Mean Square Error)
# take the mean of the squared errors, then the square root, to get an average error
# rmse = tf.sqrt(tf.reduce_mean(tf.square(targets-predictions)))  # identical to the line below
rmse = tf.sqrt(tf.reduce_mean(tf.squared_difference(targets, predictions)))
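# i.e. RMSE = sqrt( (1/N) * sum_i (targets_i - predictions_i)^2 )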
train_error_summary = []  # periodically records the error on the training data
test_error_summary = []  # periodically records the error on the test data
test_predict = ''  # predictions made on the test data
sess = tf.Session()
sess.run(tf.global_variables_initializer())
# Train
start_time = datetime.datetime.now()  # record the start time
print('Starting training...')
for epoch in range(epoch_num):
    _, _loss = sess.run([train, loss], feed_dict={X: trainX, Y: trainY})
    if ((epoch+1) % 100 == 0) or (epoch == epoch_num-1):  # every 100th epoch, and the last one
        # RMSE on the training data
        train_predict = sess.run(hypothesis, feed_dict={X: trainX})
        train_error = sess.run(rmse, feed_dict={targets: trainY, predictions: train_predict})
        train_error_summary.append(train_error)
        # RMSE on the test data
        test_predict = sess.run(hypothesis, feed_dict={X: testX})
        test_error = sess.run(rmse, feed_dict={targets: testY, predictions: test_predict})
        test_error_summary.append(test_error)
        # print the current errors
        print("epoch: {}, train_error(A): {}, test_error(B): {}, B-A: {}".format(epoch+1, train_error, test_error, test_error-train_error))
end_time = datetime.datetime.now()  # record the end time
elapsed_time = end_time - start_time  # compute the elapsed time
print('elapsed_time:',elapsed_time)
print('elapsed_time per epoch:',elapsed_time/epoch_num)
# Print the hyperparameters
print('input_data_column_cnt:', input_data_column_cnt, end='')
print(',output_data_column_cnt:', output_data_column_cnt, end='')
print(',seq_length:', seq_length, end='')
print(',rnn_cell_hidden_dim:', rnn_cell_hidden_dim, end='')
print(',forget_bias:', forget_bias, end='')
print(',num_stacked_layers:', num_stacked_layers, end='')
print(',keep_prob:', keep_prob, end='')
print(',epoch_num:', epoch_num, end='')
print(',learning_rate:', learning_rate, end='')
print(',train_error:', train_error_summary[-1], end='')
print(',test_error:', test_error_summary[-1], end='')
print(',min_test_error:', np.min(test_error_summary))
# Plot the results
plt.figure(1)
plt.plot(train_error_summary, 'gold')
plt.plot(test_error_summary, 'b')
plt.xlabel('Epoch(x100)')
plt.ylabel('Root Mean Square Error')
plt.figure(2)
plt.plot(testY, 'r')
plt.plot(test_predict, 'b')
plt.xlabel('Time Period')
plt.ylabel('Stock Price')
plt.show()
# Slice off the most recent seq_length rows of the data
recent_data = np.array([x[len(x)-seq_length : ]])
print("recent_data.shape:", recent_data.shape)
print("recent_data:", recent_data)
# Predict tomorrow's closing price
test_predict = sess.run(hypothesis, feed_dict={X: recent_data})
print("test_predict", test_predict[0])
test_predict = reverse_min_max_scaling(price, test_predict)  # de-normalize back to price units
print("Tomorrow's stock price", test_predict[0])  # print the predicted price
# Source: "Predicting Amazon stock prices with an LSTM RNN" | author: 똑똑이
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 5179 entries, 0 to 5178
Data columns (total 7 columns):
Date         5179 non-null object
Open         5179 non-null object
High         5179 non-null object
Low          5179 non-null object
Close        5179 non-null object
Adj Close    5179 non-null object
Volume       5179 non-null object
dtypes: object(7)
memory usage: 283.3+ KB
stock_info.shape: (5178, 6)
stock_info[0]: [2.437500e+00 2.500000e+00 1.927083e+00 1.958333e+00 1.958333e+00 7.215600e+07]
price.shape: (5178, 5)
price[0]: [2.4375 2.5 1.927083 1.958333 1.958333]
norm_price[0]: [0.00092814 0.00097971 0.00050704 0.00053282 0.00053282]
volume.shape: (5178, 1)
volume[0]: [72156000.]
norm_volume[0]: [0.69017161]
x.shape: (5178, 6)
x[0]:  [9.28143131e-04 9.79706638e-04 5.07040880e-04 5.32822633e-04 5.32822633e-04 6.90171607e-01]
x[-1]: [0.96451605 0.96645482 0.95354329 0.95758589 0.95758589 0.02142033]
y[0]:  [0.00053282]
y[-1]: [0.95758589]
(one 28-step training window of normalized rows, omitted here for brevity) -> [0.00016328]
X: Tensor("Placeholder_2:0", shape=(?, 28, 6), dtype=float32)
Y: Tensor("Placeholder_3:0", shape=(?, 1), dtype=float32)
targets: Tensor("Placeholder_4:0", shape=(?, 1), dtype=float32)
predictions: Tensor("Placeholder_5:0", shape=(?, 1), dtype=float32)
hypothesis: Tensor("rnn_1/transpose_1:0", shape=(?, 28, 20), dtype=float32)
Starting training...
epoch: 100, train_error(A): 0.003129080170765519, test_error(B): 0.10491427034139633, B-A: 0.10178519040346146
epoch: 200, train_error(A): 0.0028923184145241976, test_error(B): 0.08287963271141052, B-A: 0.07998731732368469
epoch: 300, train_error(A): 0.0026995243970304728, test_error(B): 0.054745256900787354, B-A: 0.052045732736587524
epoch: 400, train_error(A): 0.0025518666952848434, test_error(B): 0.030642904341220856, B-A: 0.028091037645936012
epoch: 500, train_error(A): 0.0024431541096419096, test_error(B): 0.01827004738152027, B-A: 0.015826893970370293
epoch: 600, train_error(A): 0.002361605642363429, test_error(B): 0.014293929561972618, B-A: 0.011932323686778545
epoch: 700, train_error(A): 0.0022956582251936197, test_error(B): 0.013497594743967056, B-A: 0.01120193675160408
epoch: 800, train_error(A): 0.0022389725781977177, test_error(B): 0.013615524396300316, B-A: 0.011376552283763885
epoch: 900, train_error(A): 0.002188938669860363, test_error(B): 0.013762109912931919, B-A: 0.011573171243071556
epoch: 1000, train_error(A): 0.002144591184332967, test_error(B): 0.013722626492381096, B-A: 0.011578035540878773
elapsed_time: 0:00:19.466815
elapsed_time per epoch: 0:00:00.019467
input_data_column_cnt: 6,output_data_column_cnt: 1,seq_length: 28,rnn_cell_hidden_dim: 20,forget_bias: 1.0,num_stacked_layers: 1,keep_prob: 1.0,epoch_num: 1000,learning_rate: 0.01,train_error: 0.0021445912,test_error: 0.0137226265,min_test_error: 0.013497595
recent_data.shape: (1, 28, 6)
recent_data: (the 28 most recent normalized rows; full matrix omitted for brevity)
test_predict [0.9115674]
Tomorrow's stock price [1106.2211]
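The stock example above used a hand-rolled TensorFlow graph. The next example switches to Keras and a much larger dataset: sequences of polarimetric radar measurements used to predict hourly rainfall. Judging by the column names below (Ref, RhoHV, Zdr, Kdp, Expected), the files appear to come from Kaggle's "How Much Did It Rain? II" competition, though that is an assumption since the source is not named.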
import tensorflow as tf
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
np.random.seed(42)
rain2 = pd.read_csv('train.csv')
rain2.head(n=30)
Id | minutes_past | radardist_km | Ref | Ref_5x5_10th | Ref_5x5_50th | Ref_5x5_90th | RefComposite | RefComposite_5x5_10th | RefComposite_5x5_50th | ... | RhoHV_5x5_90th | Zdr | Zdr_5x5_10th | Zdr_5x5_50th | Zdr_5x5_90th | Kdp | Kdp_5x5_10th | Kdp_5x5_50th | Kdp_5x5_90th | Expected | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | 3 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 0.254000 |
1 | 1 | 16 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 0.254000 |
2 | 1 | 25 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 0.254000 |
3 | 1 | 35 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 0.254000 |
4 | 1 | 45 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 0.254000 |
5 | 1 | 55 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 0.254000 |
6 | 2 | 1 | 2.0 | 9.0 | 5.0 | 7.5 | 10.5 | 15.0 | 10.5 | 16.5 | ... | 0.998333 | 0.3750 | -0.1250 | 0.3125 | 0.8750 | 1.059998 | -1.410004 | -0.350006 | 1.059998 | 1.016000 |
7 | 2 | 6 | 2.0 | 26.5 | 22.5 | 25.5 | 31.5 | 26.5 | 26.5 | 28.5 | ... | 1.005000 | 0.0625 | -0.1875 | 0.2500 | 0.6875 | NaN | NaN | NaN | 1.409988 | 1.016000 |
8 | 2 | 11 | 2.0 | 21.5 | 15.5 | 20.5 | 25.0 | 26.5 | 23.5 | 25.0 | ... | 1.001667 | 0.3125 | -0.0625 | 0.3125 | 0.6250 | 0.349991 | NaN | -0.350006 | 1.759994 | 1.016000 |
9 | 2 | 16 | 2.0 | 18.0 | 14.0 | 17.5 | 21.0 | 20.5 | 18.0 | 20.5 | ... | 1.001667 | 0.2500 | 0.1250 | 0.3750 | 0.6875 | 0.349991 | -1.059998 | 0.000000 | 1.059998 | 1.016000 |
10 | 2 | 21 | 2.0 | 24.5 | 16.5 | 21.0 | 24.5 | 24.5 | 21.0 | 24.0 | ... | 0.998333 | 0.2500 | 0.0625 | 0.1875 | 0.5625 | -0.350006 | -1.059998 | -0.350006 | 1.759994 | 1.016000 |
11 | 2 | 26 | 2.0 | 12.0 | 12.0 | 16.0 | 20.0 | 16.5 | 17.0 | 19.0 | ... | 0.998333 | 0.5625 | 0.2500 | 0.4375 | 0.6875 | -1.760010 | -1.760010 | -0.350006 | 0.709991 | 1.016000 |
12 | 2 | 31 | 2.0 | 22.5 | 19.0 | 22.0 | 25.0 | 26.0 | 23.5 | 25.5 | ... | 1.001667 | 0.0000 | -0.1875 | 0.2500 | 0.6250 | -1.059998 | -2.120010 | -0.710007 | 0.349991 | 1.016000 |
13 | 2 | 37 | 2.0 | 14.0 | 14.0 | 18.5 | 21.0 | 19.5 | 20.0 | 21.0 | ... | 0.998333 | 0.5000 | 0.1875 | 0.4375 | 0.8125 | 0.000000 | -1.760010 | -0.350006 | 1.059998 | 1.016000 |
14 | 2 | 42 | 2.0 | 12.0 | 11.0 | 12.5 | 17.0 | 19.5 | 18.0 | 21.0 | ... | 0.998333 | 0.6250 | 0.3750 | 0.6250 | 0.8750 | -0.350006 | -0.350006 | 0.000000 | 0.349991 | 1.016000 |
15 | 2 | 47 | 2.0 | 1.5 | 3.5 | 7.0 | 10.5 | 18.0 | 16.5 | 18.5 | ... | 0.998333 | 0.3750 | 0.1875 | 0.5000 | 0.6875 | 0.349991 | -2.110001 | -0.350006 | 1.059998 | 1.016000 |
16 | 2 | 53 | 2.0 | 16.0 | 14.5 | 18.0 | 23.5 | 28.0 | 23.5 | 26.5 | ... | 0.998333 | 0.8750 | 0.6250 | 0.9375 | 1.3750 | -0.350006 | -1.410004 | -0.350006 | 2.119995 | 1.016000 |
17 | 2 | 58 | 2.0 | 22.0 | 16.5 | 22.5 | 26.5 | 31.5 | 26.5 | 29.0 | ... | 1.001667 | 0.3750 | 0.1875 | 0.3750 | 0.8750 | -1.410004 | NaN | -0.350006 | 0.699997 | 1.016000 |
18 | 3 | 4 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
19 | 3 | 9 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
20 | 3 | 14 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
21 | 3 | 18 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
22 | 3 | 23 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
23 | 3 | 28 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
24 | 3 | 33 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
25 | 3 | 38 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
26 | 3 | 43 | 10.0 | NaN | NaN | NaN | 8.5 | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
27 | 3 | 48 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
28 | 3 | 53 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | 0.801667 | NaN | NaN | NaN | 2.0625 | NaN | NaN | NaN | NaN | 26.162014 |
29 | 3 | 58 | 10.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 26.162014 |
30 rows × 24 columns
train_df = rain2.dropna(subset=['Ref'])
train_df.describe()
Id | minutes_past | radardist_km | Ref | Ref_5x5_10th | Ref_5x5_50th | Ref_5x5_90th | RefComposite | RefComposite_5x5_10th | RefComposite_5x5_50th | ... | RhoHV_5x5_90th | Zdr | Zdr_5x5_10th | Zdr_5x5_50th | Zdr_5x5_90th | Kdp | Kdp_5x5_10th | Kdp_5x5_50th | Kdp_5x5_90th | Expected | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
count | 6.349375e+06 | 6.349375e+06 | 6.349375e+06 | 6.349375e+06 | 5.269105e+06 | 6.156846e+06 | 6.349288e+06 | 6.349375e+06 | 5.550921e+06 | 6.207490e+06 | ... | 4.600898e+06 | 4.069799e+06 | 3.577674e+06 | 4.095586e+06 | 4.600898e+06 | 3.494665e+06 | 2.978691e+06 | 3.511354e+06 | 4.062702e+06 | 6.349375e+06 |
mean | 5.894813e+05 | 2.927051e+01 | 9.286182e+00 | 2.292666e+01 | 1.998385e+01 | 2.302103e+01 | 2.812518e+01 | 2.540071e+01 | 2.256900e+01 | 2.541082e+01 | ... | 1.013113e+00 | 5.295568e-01 | -6.975932e-01 | 3.815768e-01 | 2.057199e+00 | 3.230033e-02 | -3.409001e+00 | -3.781228e-01 | 4.038652e+00 | 1.714876e+01 |
std | 3.406124e+05 | 1.715383e+01 | 4.068784e+00 | 1.035516e+01 | 9.195141e+00 | 9.882618e+00 | 1.028228e+01 | 1.044013e+01 | 9.547383e+00 | 1.003713e+01 | ... | 4.179456e-02 | 1.476643e+00 | 1.017368e+00 | 9.260191e-01 | 1.617217e+00 | 3.699795e+00 | 2.719573e+00 | 2.087361e+00 | 3.902270e+00 | 2.025279e+02 |
min | 2.000000e+00 | 0.000000e+00 | 0.000000e+00 | -3.100000e+01 | -3.200000e+01 | -3.200000e+01 | -2.850000e+01 | -2.800000e+01 | -3.050000e+01 | -2.750000e+01 | ... | 2.083333e-01 | -7.875000e+00 | -7.875000e+00 | -7.875000e+00 | -7.875000e+00 | -9.604000e+01 | -8.079000e+01 | -7.168000e+01 | -1.002000e+02 | 1.000000e-02 |
25% | 2.923290e+05 | 1.400000e+01 | 6.000000e+00 | 1.600000e+01 | 1.450000e+01 | 1.650000e+01 | 2.150000e+01 | 1.850000e+01 | 1.650000e+01 | 1.850000e+01 | ... | 9.983333e-01 | -1.875000e-01 | -1.062500e+00 | 0.000000e+00 | 1.062500e+00 | -1.410004e+00 | -4.540008e+00 | -7.100067e-01 | 2.059998e+00 | 5.080003e-01 |
50% | 5.906240e+05 | 2.900000e+01 | 1.000000e+01 | 2.250000e+01 | 2.000000e+01 | 2.300000e+01 | 2.750000e+01 | 2.500000e+01 | 2.250000e+01 | 2.500000e+01 | ... | 1.005000e+00 | 3.750000e-01 | -6.250000e-01 | 3.125000e-01 | 1.687500e+00 | 0.000000e+00 | -2.820007e+00 | 0.000000e+00 | 3.509994e+00 | 1.425001e+00 |
75% | 8.883170e+05 | 4.400000e+01 | 1.200000e+01 | 2.950000e+01 | 2.600000e+01 | 2.950000e+01 | 3.500000e+01 | 3.200000e+01 | 2.900000e+01 | 3.200000e+01 | ... | 1.051667e+00 | 1.062500e+00 | -1.875000e-01 | 6.875000e-01 | 2.562500e+00 | 1.409988e+00 | -1.740006e+00 | 3.499908e-01 | 5.629990e+00 | 4.064002e+00 |
max | 1.180945e+06 | 5.900000e+01 | 2.100000e+01 | 7.100000e+01 | 6.250000e+01 | 6.900000e+01 | 7.250000e+01 | 9.250000e+01 | 6.600000e+01 | 7.100000e+01 | ... | 1.051667e+00 | 7.937500e+00 | 7.937500e+00 | 7.937500e+00 | 7.937500e+00 | 1.797500e+02 | 3.169998e+00 | 1.280000e+01 | 1.446000e+02 | 3.301773e+04 |
8 rows × 24 columns
train_df.isna().sum()
Id                          0
minutes_past                0
radardist_km                0
Ref                         0
Ref_5x5_10th          1080270
Ref_5x5_50th           192529
Ref_5x5_90th               87
RefComposite                0
RefComposite_5x5_10th  798454
RefComposite_5x5_50th  141885
RefComposite_5x5_90th      67
RhoHV                 2279576
RhoHV_5x5_10th        2771701
RhoHV_5x5_50th        2253789
RhoHV_5x5_90th        1748477
Zdr                   2279576
Zdr_5x5_10th          2771701
Zdr_5x5_50th          2253789
Zdr_5x5_90th          1748477
Kdp                   2854710
Kdp_5x5_10th          3370684
Kdp_5x5_50th          2838021
Kdp_5x5_90th          2286673
Expected                    0
dtype: int64
train_df = train_df.fillna(0)
train_df.isna().sum()
(after fillna(0), all 24 columns report 0 missing values)
dtype: int64
train_seq = train_df.groupby(['Id'])  # group the rows by gauge Id using the groupby method
# number of Ids, and the longest per-Id sequence length
train_seq_size = train_seq.size()
train_seq_size.count(), train_seq_size.max()
(731556, 19)
X = np.zeros((731556, 19, 22))
y = np.zeros((731556, 1))
i = 0
for name, group in train_seq:
    # d.shape is (seq_length, 24)
    d = group.values
    # columns 1~22 are features;
    # column 0 is Id and column 23 is the target.
    # copy the 22 features into indices 0~21 of the dataset, up to d.shape[0] timesteps
    # (shorter sequences stay zero-padded out to length 19)
    X[i, :d.shape[0], 0:22] = d[:, 1:23]
    y[i, 0] = d[0, 23]
    i += 1
print(i)
731556
def feed_gen(X, y, batch_size=1024):
    shuffled_index = np.random.permutation(len(X))  # shuffle the indices
    start = 0
    while True:
        end = start + batch_size
        if end > len(X):  # cannot exceed X's length
            end = len(X)
        yield X[shuffled_index[start:end]], y[shuffled_index[start:end]]
        start = end
        if end >= len(X):  # when we reach the end, shuffle again
            shuffled_index = np.random.permutation(len(X))
            start = 0
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=42)
batch_size = 1024
steps_per_epoch = np.ceil(X_train.shape[0] / batch_size)
validation_steps = np.ceil(X_test.shape[0] / batch_size)
steps_per_epoch, validation_steps
(643.0, 72.0)
train_gen = feed_gen(X_train, y_train, batch_size)
val_gen = feed_gen(X_test, y_test, batch_size)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM
model = Sequential()
model.add(LSTM(35, input_shape=(None, 22)))
model.add(Dense(1))
model.compile(loss='mae', optimizer='rmsprop')
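MAE is a reasonable choice of loss here: if this is indeed the "How Much Did It Rain? II" data, that competition was itself scored on mean absolute error (again an assumption, since the data source is not stated).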
history = model.fit_generator(train_gen,
steps_per_epoch=steps_per_epoch,
epochs=100,
validation_data=val_gen,
validation_steps=validation_steps)
Epoch 1/100   643/643 [==============================] - 18s 28ms/step - loss: 23.3781 - val_loss: 22.4762
Epoch 2/100   643/643 [==============================] - 17s 26ms/step - loss: 23.2803 - val_loss: 22.5541
Epoch 3/100   643/643 [==============================] - 17s 27ms/step - loss: 23.2525 - val_loss: 22.5051
(epochs 4-97 omitted: training loss creeps down from about 23.24 to 23.09 while val_loss fluctuates between roughly 21.2 and 23.4)
Epoch 98/100  643/643 [==============================] - 17s 26ms/step - loss: 23.0923 - val_loss: 22.4234
Epoch 99/100  643/643 [==============================] - 17s 26ms/step - loss: 23.0929 - val_loss: 22.5102
Epoch 100/100 643/643 [==============================] - 17s 26ms/step - loss: 23.0918 - val_loss: 21.5608
rain2_test = pd.read_csv('test.csv')
test_df = rain2_test.fillna(0)
test_df.head()
Id | minutes_past | radardist_km | Ref | Ref_5x5_10th | Ref_5x5_50th | Ref_5x5_90th | RefComposite | RefComposite_5x5_10th | RefComposite_5x5_50th | ... | RhoHV_5x5_50th | RhoHV_5x5_90th | Zdr | Zdr_5x5_10th | Zdr_5x5_50th | Zdr_5x5_90th | Kdp | Kdp_5x5_10th | Kdp_5x5_50th | Kdp_5x5_90th | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | 1 | 8.0 | 0.0 | 0.0 | 0.0 | 14.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
1 | 1 | 5 | 8.0 | 10.0 | 0.0 | 10.0 | 18.0 | 11.5 | 0.0 | 11.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
2 | 1 | 8 | 8.0 | 0.0 | 0.0 | 7.0 | 14.5 | 0.0 | 0.0 | 7.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3 | 1 | 12 | 8.0 | 14.0 | 0.0 | 9.0 | 16.0 | 14.0 | 0.0 | 9.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4 | 1 | 15 | 8.0 | 10.5 | 0.0 | 9.0 | 15.5 | 13.5 | 0.0 | 9.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
5 rows × 23 columns
test_seq = test_df.groupby(['Id'])
test_seq_size = test_seq.size()
test_seq_size.count(), test_seq_size.max()
(717625, 19)
X_test = np.zeros((717625, 19, 22))
i = 0
for name, group in test_seq:
    # d.shape is (seq_length, 23)
    d = group.values
    # columns 1~22 are features;
    # copy them into indices 0~21 of the dataset, up to d.shape[0] timesteps
    X_test[i, :d.shape[0], 0:22] = d[:, 1:23]
    i += 1
print(i)
717625
pred = model.predict(X_test)
pred_with_index = np.hstack((np.arange(1, pred.shape[0]+1).reshape(-1,1), pred))
np.savetxt("test_prediction.csv", pred_with_index, "%d,%f",
delimiter=",", header="Id,Expected", comments="")
The GRU cell was proposed in a 2014 paper by Kyunghyun Cho and colleagues.
[참조]
Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014).
Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
As the figure shows, the GRU cell is a simplified version of the LSTM cell and works in a similar way.
Changes relative to the LSTM
The LSTM's two state vectors c(t) and h(t) are merged into a single h(t)
The gate controllers f(t) and i(t) are merged into a single z(t), the update gate
In other words, whenever the memory from step (t-1) is kept, the new input at time step t is suppressed: a single gate plays both the forget and input roles.
There is no output gate -> the full state vector is output at every time step
A new gate controller r(t) controls which part of the previous state is exposed
Computing the GRU state vector
z(t) and r(t) denote the update and reset gates, respectively
The update and reset gates use the sigmoid as their activation function
Both gates are computed from the current input x(t) and the previous hidden state h(t-1)
Each W is a parameter (weight matrix) that linearly combines the input and the hidden-state values
Because the gate activations are sigmoids, their values lie between 0 and 1
How memory is handled
If r(t) is 0, all past information is forgotten; if it is 1, all past information is kept; the current input is reflected regardless of r(t)
The tanh activation above produces values between -1 and 1
How much of the candidate information g(t) is blended into the new state h(t) is decided by z(t), i.e., the update gate
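Putting this together, and matching the from-scratch TensorFlow implementation later in this section (where z(t) = 1 selects the candidate state), the GRU update can be written as:
z(t) = sigmoid(Wz·x(t) + Uz·h(t-1) + bz)
r(t) = sigmoid(Wr·x(t) + Ur·h(t-1) + br)
g(t) = tanh(Wh·x(t) + Uh·(r(t) ⊙ h(t-1)) + bh)
h(t) = (1 - z(t)) ⊙ h(t-1) + z(t) ⊙ g(t)
where ⊙ denotes element-wise multiplication. Note that some references swap the roles of z(t) and 1 - z(t); the form above is the one the GRU class below implements.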
n_neurons = 5
gru_cell = tf.contrib.rnn.GRUCell(num_units=n_neurons)
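As a minimal sketch of wiring this cell into a graph (X_seq, n_steps, and n_inputs are hypothetical names used only for this illustration; as in the LSTM example earlier, tf.nn.dynamic_rnn unrolls the cell along the time axis):
n_steps, n_inputs = 5, 2  # hypothetical sizes for illustration
X_seq = tf.placeholder(tf.float32, [None, n_steps, n_inputs])
# outputs: the hidden state at every time step, shape (batch, n_steps, n_neurons)
# states: the final hidden state, shape (batch, n_neurons); for a GRU, state and output coincide
outputs, states = tf.nn.dynamic_rnn(gru_cell, X_seq, dtype=tf.float32)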
GRU Code Example 1 -> learning to add
-> Because the GRU can remember information across time steps, it can be trained to add numbers encoded as reversed bitstrings.
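For instance, with 4-bit reversed bitstrings: a = 3 -> [1, 1, 0, 0], b = 2 -> [0, 1, 0, 0], and a + b = 5 -> [1, 0, 1, 0]. Adding position by position, the 1 + 1 at index 1 produces a carry that must be remembered when computing index 2; carrying is exactly the kind of short-term memory the recurrent state supplies.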
Import the required libraries
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
from IPython import display
%matplotlib inline
import random
import warnings
warnings.simplefilter(action='ignore', category=FutureWarning)  # suppress a harmless FutureWarning
def as_bytes(num, final_size):
    """Convert an integer to a reversed bitstring (least-significant bit first).

    Arguments
    ---------
    num: int
        The number to convert.
    final_size: int
        The length of the bitstring.

    Returns
    -------
    list

    Examples
    --------
    >>> as_bytes(3, 4)
    [1, 1, 0, 0]
    >>> as_bytes(3, 5)
    [1, 1, 0, 0, 0]
    """
    res = []
    for _ in range(final_size):
        res.append(num % 2)  # take the least-significant bit first
        num //= 2
    return res
def generate_example(num_bits):
    """Generate an example addition.

    Arguments
    ---------
    num_bits: int
        The number of bits to use.

    Returns
    -------
    a: list
        The first term (represented as a reversed bitstring) of the addition.
    b: list
        The second term (represented as a reversed bitstring) of the addition.
    c: list
        The sum (a + b) represented as a reversed bitstring.

    Examples
    --------
    >>> np.random.seed(4)
    >>> a, b, c = generate_example(3)
    >>> a
    [0, 1, 0]
    >>> b
    [0, 1, 0]
    >>> c
    [1, 0, 0]
    >>> # Notice that these numbers are represented as reversed bitstrings
    """
    a = random.randint(0, 2**(num_bits - 1) - 1)
    b = random.randint(0, 2**(num_bits - 1) - 1)
    res = a + b
    return (as_bytes(a, num_bits),
            as_bytes(b, num_bits),
            as_bytes(res, num_bits))
def generate_batch(num_bits, batch_size):
    """Generate instances of the addition problem.

    Arguments
    ---------
    num_bits: int
        The number of bits to use for each number.
    batch_size: int
        The number of examples to generate.

    Returns
    -------
    x: np.array
        Two numbers to be added, represented as bits (in reversed order).
        Shape: (i, b, n)
        Where:
            i is the example index in the batch.
            b is the bit index from the end.
            n is 0 or 1 for the first and second summand respectively.
    y: np.array
        The result of the addition.
        Shape: (i, b, n)
        Where n is always 0 since there is only one result.
    """
    x = np.empty((batch_size, num_bits, 2))
    y = np.empty((batch_size, num_bits, 1))
    for i in range(batch_size):
        a, b, r = generate_example(num_bits)
        x[i, :, 0] = a
        x[i, :, 1] = b
        y[i, :, 0] = r
    return x, y
batch_size = 100
time_size = 5
# Generate a training set and a test set, each of 100 additions over numbers representable in 5 bits
X_train, Y_train = generate_batch(time_size, batch_size)
X_test, Y_test = generate_batch(time_size, batch_size)
class GRU:
    """Implementation of a Gated Recurrent Unit (GRU) as described in [1].

    [1] Chung, J., Gulcehre, C., Cho, K., & Bengio, Y. (2014). Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv preprint arXiv:1412.3555.

    Arguments
    ---------
    input_dimensions: int
        The size of the input vectors (x_t).
    hidden_size: int
        The size of the hidden layer vectors (h_t).
    dtype: obj
        The datatype used for the variables and constants (optional).
    """

    def __init__(self, input_dimensions, hidden_size, dtype=tf.float64):
        self.input_dimensions = input_dimensions
        self.hidden_size = hidden_size
        # Weights for input vectors, of shape (input_dimensions, hidden_size)
        self.Wr = tf.Variable(tf.truncated_normal(
            dtype=dtype, shape=(self.input_dimensions, self.hidden_size), mean=0, stddev=0.01), name='Wr')
        self.Wz = tf.Variable(tf.truncated_normal(
            dtype=dtype, shape=(self.input_dimensions, self.hidden_size), mean=0, stddev=0.01), name='Wz')
        self.Wh = tf.Variable(tf.truncated_normal(
            dtype=dtype, shape=(self.input_dimensions, self.hidden_size), mean=0, stddev=0.01), name='Wh')
        # Weights for hidden vectors, of shape (hidden_size, hidden_size)
        self.Ur = tf.Variable(tf.truncated_normal(
            dtype=dtype, shape=(self.hidden_size, self.hidden_size), mean=0, stddev=0.01), name='Ur')
        self.Uz = tf.Variable(tf.truncated_normal(
            dtype=dtype, shape=(self.hidden_size, self.hidden_size), mean=0, stddev=0.01), name='Uz')
        self.Uh = tf.Variable(tf.truncated_normal(
            dtype=dtype, shape=(self.hidden_size, self.hidden_size), mean=0, stddev=0.01), name='Uh')
        # Biases for hidden vectors, of shape (hidden_size,)
        self.br = tf.Variable(tf.truncated_normal(dtype=dtype, shape=(self.hidden_size,), mean=0, stddev=0.01), name='br')
        self.bz = tf.Variable(tf.truncated_normal(dtype=dtype, shape=(self.hidden_size,), mean=0, stddev=0.01), name='bz')
        self.bh = tf.Variable(tf.truncated_normal(dtype=dtype, shape=(self.hidden_size,), mean=0, stddev=0.01), name='bh')
        # Define the input layer placeholder
        self.input_layer = tf.placeholder(dtype=tf.float64, shape=(None, None, input_dimensions), name='input')
        # Put the time dimension up front for the scan operator
        self.x_t = tf.transpose(self.input_layer, [1, 0, 2], name='x_t')
        # A trick to get an all-zero initial state with the right (dynamic) batch size
        self.h_0 = tf.matmul(self.x_t[0, :, :], tf.zeros(dtype=tf.float64, shape=(input_dimensions, hidden_size)), name='h_0')
        # Run the forward pass over all time steps with the scan operator
        self.h_t_transposed = tf.scan(self.forward_pass, self.x_t, initializer=self.h_0, name='h_t_transposed')
        # Transpose the result back to batch-major order
        self.h_t = tf.transpose(self.h_t_transposed, [1, 0, 2], name='h_t')
    def forward_pass(self, h_tm1, x_t):
        """Perform a single forward-pass step.

        Arguments
        ---------
        h_tm1: np.matrix
            The hidden state at the previous timestep (h_{t-1}).
        x_t: np.matrix
            The input vector at the current timestep.
        """
        # Definitions of z_t (update gate) and r_t (reset gate)
        z_t = tf.sigmoid(tf.matmul(x_t, self.Wz) + tf.matmul(h_tm1, self.Uz) + self.bz)
        r_t = tf.sigmoid(tf.matmul(x_t, self.Wr) + tf.matmul(h_tm1, self.Ur) + self.br)
        # Definition of the candidate state h~_t (the reset gate scales h_{t-1})
        h_proposal = tf.tanh(tf.matmul(x_t, self.Wh) + tf.matmul(tf.multiply(r_t, h_tm1), self.Uh) + self.bh)
        # Compute the next hidden state as a z_t-weighted blend of the old state and the candidate
        h_t = tf.multiply(1 - z_t, h_tm1) + tf.multiply(z_t, h_proposal)
        return h_t
#%% (3) Initialize and train the model.
# The input has 2 dimensions: dimension 0 is reserved for the first term and dimension 1 is reserved for the second term
input_dimensions = 2
# Arbitrary number for the size of the hidden state
hidden_size = 16
# Initialize a session
session = tf.Session()
# Create a new instance of the GRU model
gru = GRU(input_dimensions, hidden_size)
# Add an additional (linear) layer on top of each of the hidden state outputs
W_output = tf.Variable(tf.truncated_normal(dtype=tf.float64, shape=(hidden_size, 1), mean=0, stddev=0.01))
b_output = tf.Variable(tf.truncated_normal(dtype=tf.float64, shape=(1,), mean=0, stddev=0.01))
output = tf.map_fn(lambda h_t: tf.matmul(h_t, W_output) + b_output, gru.h_t)
# Create a placeholder for the expected output
expected_output = tf.placeholder(dtype=tf.float64, shape=(batch_size, time_size, 1), name='expected_output')
# Just use quadratic loss
loss = tf.reduce_sum(0.5 * tf.pow(output - expected_output, 2)) / float(batch_size)
# Use the Adam optimizer for training
train_step = tf.train.AdamOptimizer().minimize(loss)
# Initialize all the variables
init_variables = tf.global_variables_initializer()
session.run(init_variables)
# Initialize the loss histories
train_losses = []
validation_losses = []
# Seed values for the manual evaluation below; a new (a, b) pair is
# drawn only after the model predicts the current sum correctly
a = 1024
b = 16
y = 0
# Perform all the iterations
for epoch in range(10000):
    # Run a training step and compute the losses
    _, train_loss = session.run([train_step, loss], feed_dict={gru.input_layer: X_train, expected_output: Y_train})
    validation_loss = session.run(loss, feed_dict={gru.input_layer: X_test, expected_output: Y_test})
    # Log the losses
    train_losses += [train_loss]
    validation_losses += [validation_loss]
    # Display an update every 100 iterations
    if epoch % 100 == 0:
        plt.plot(train_losses, '-b', label='Train loss')
        plt.plot(validation_losses, '-r', label='Validation loss')
        plt.legend(loc=0)
        plt.title('Loss')
        plt.xlabel('Iteration')
        plt.ylabel('Loss')
        plt.show()
        print('Iteration: %d, train loss: %.4f, test loss: %.4f' % (epoch, train_loss, validation_loss))
        #%% (4) Manually evaluate the model.
        if y == a + b:
            # Once the model has the current sum right, define two new numbers a and b and let it compute a + b
            a = random.randrange(1, 1024)
            b = random.randrange(1, 256)
        # The model is independent of the sequence length! Now we can test it on even longer bitstrings
        bitstring_length = 20
        # Create the feature vectors
        X_custom_sample = np.vstack([as_bytes(a, bitstring_length), as_bytes(b, bitstring_length)]).T
        X_custom = np.zeros((1,) + X_custom_sample.shape)
        X_custom[0, :, :] = X_custom_sample
        # Make a prediction with the model
        y_predicted = session.run(output, feed_dict={gru.input_layer: X_custom})
        # Just use a linear class separator at 0.5
        y_bits = 1 * (y_predicted > 0.5)[0, :, 0]
        # Join and reverse the bitstring
        y_bitstr = ''.join([str(int(bit)) for bit in y_bits.tolist()])[::-1]
        # Convert the resulting bitstring back to a number
        y = int(y_bitstr, 2)
        print("a : " + str(a) + ", b : " + str(b) + ", y : " + str(y))
Iteration: 0, train loss: 1.2159, test loss: 1.2426 a : 1024, b : 16, y : 0
Iteration: 100, train loss: 0.6392, test loss: 0.6485 a : 1024, b : 16, y : 31744
Iteration: 200, train loss: 0.6236, test loss: 0.6294 a : 1024, b : 16, y : 1024
Iteration: 300, train loss: 0.6172, test loss: 0.6210 a : 1024, b : 16, y : 1024
Iteration: 400, train loss: 0.6119, test loss: 0.6142 a : 1024, b : 16, y : 1024
Iteration: 500, train loss: 0.6066, test loss: 0.6078 a : 1024, b : 16, y : 1024
Iteration: 600, train loss: 0.6005, test loss: 0.6010 a : 1024, b : 16, y : 1024
Iteration: 700, train loss: 0.5929, test loss: 0.5933 a : 1024, b : 16, y : 1024
Iteration: 800, train loss: 0.5833, test loss: 0.5840 a : 1024, b : 16, y : 1024
Iteration: 900, train loss: 0.5710, test loss: 0.5727 a : 1024, b : 16, y : 1024
Iteration: 1000, train loss: 0.5556, test loss: 0.5589 a : 1024, b : 16, y : 1024
Iteration: 1100, train loss: 0.5364, test loss: 0.5412 a : 1024, b : 16, y : 1024
Iteration: 1200, train loss: 0.5112, test loss: 0.5165 a : 1024, b : 16, y : 1040
Iteration: 1300, train loss: 0.4702, test loss: 0.4760 a : 821, b : 231, y : 1022
Iteration: 1400, train loss: 0.3790, test loss: 0.3888 a : 821, b : 231, y : 1016
Iteration: 1500, train loss: 0.2810, test loss: 0.2975 a : 821, b : 231, y : 1016
Iteration: 1600, train loss: 0.2500, test loss: 0.2705 a : 821, b : 231, y : 1016
Iteration: 1700, train loss: 0.2328, test loss: 0.2543 a : 821, b : 231, y : 2040
Iteration: 1800, train loss: 0.2035, test loss: 0.2283 a : 821, b : 231, y : 2008
Iteration: 1900, train loss: 0.1520, test loss: 0.1727 a : 821, b : 231, y : 1816
Iteration: 2000, train loss: 0.0990, test loss: 0.1080 a : 821, b : 231, y : 1048
Iteration: 2100, train loss: 0.0326, test loss: 0.0368 a : 821, b : 231, y : 1052
Iteration: 2200, train loss: 0.0060, test loss: 0.0096 a : 857, b : 78, y : 935
Iteration: 2300, train loss: 0.0027, test loss: 0.0055 a : 99, b : 235, y : 334
Iteration: 2400, train loss: 0.0016, test loss: 0.0041 a : 941, b : 71, y : 1012
Iteration: 2500, train loss: 0.0010, test loss: 0.0035 a : 227, b : 222, y : 449
Iteration: 2600, train loss: 0.0008, test loss: 0.0031 a : 510, b : 228, y : 738
Iteration: 2700, train loss: 0.0006, test loss: 0.0029 a : 186, b : 78, y : 264
Iteration: 2800, train loss: 0.0005, test loss: 0.0027 a : 279, b : 209, y : 488
Iteration: 2900, train loss: 0.0004, test loss: 0.0025 a : 847, b : 106, y : 953
Iteration: 3000, train loss: 0.0003, test loss: 0.0024 a : 765, b : 249, y : 1014
Iteration: 3100, train loss: 0.0003, test loss: 0.0022 a : 122, b : 156, y : 278
Iteration: 3200, train loss: 0.0002, test loss: 0.0021 a : 124, b : 113, y : 237
Iteration: 3300, train loss: 0.0002, test loss: 0.0020 a : 891, b : 203, y : 1094
Iteration: 3400, train loss: 0.0002, test loss: 0.0019 a : 654, b : 75, y : 729
Iteration: 3500, train loss: 0.0002, test loss: 0.0018 a : 371, b : 227, y : 598
Iteration: 3600, train loss: 0.0001, test loss: 0.0017 a : 92, b : 126, y : 218
Iteration: 3700, train loss: 0.0001, test loss: 0.0017 a : 273, b : 193, y : 466
Iteration: 3800, train loss: 0.0001, test loss: 0.0016 a : 197, b : 163, y : 360
Iteration: 3900, train loss: 0.0001, test loss: 0.0015 a : 222, b : 21, y : 243
Iteration: 4000, train loss: 0.0001, test loss: 0.0015 a : 943, b : 109, y : 1052
Iteration: 4100, train loss: 0.0001, test loss: 0.0014 a : 513, b : 211, y : 724
Iteration: 4200, train loss: 0.0001, test loss: 0.0014 a : 42, b : 239, y : 281
Iteration: 4300, train loss: 0.0001, test loss: 0.0013 a : 810, b : 69, y : 879
Iteration: 4400, train loss: 0.0001, test loss: 0.0013 a : 465, b : 52, y : 517
Iteration: 4500, train loss: 0.0001, test loss: 0.0012 a : 563, b : 96, y : 659
Iteration: 4600, train loss: 0.0001, test loss: 0.0012 a : 464, b : 165, y : 629
Iteration: 4700, train loss: 0.0000, test loss: 0.0011 a : 307, b : 179, y : 486
Iteration: 4800, train loss: 0.0000, test loss: 0.0011 a : 455, b : 139, y : 594
Iteration: 4900, train loss: 0.0000, test loss: 0.0010 a : 344, b : 23, y : 367
Iteration: 5000, train loss: 0.0000, test loss: 0.0010 a : 826, b : 79, y : 905
Iteration: 5100, train loss: 0.0000, test loss: 0.0010 a : 158, b : 30, y : 188
Iteration: 5200, train loss: 0.0000, test loss: 0.0009 a : 664, b : 81, y : 745
Iteration: 5300, train loss: 0.0000, test loss: 0.0009 a : 870, b : 181, y : 1051
Iteration: 5400, train loss: 0.0000, test loss: 0.0009 a : 826, b : 9, y : 835
Iteration: 5500, train loss: 0.0000, test loss: 0.0008 a : 652, b : 109, y : 761
Iteration: 5600, train loss: 0.0000, test loss: 0.0008 a : 103, b : 214, y : 317
Iteration: 5700, train loss: 0.0000, test loss: 0.0008 a : 262, b : 132, y : 394
Iteration: 5800, train loss: 0.0000, test loss: 0.0008 a : 743, b : 158, y : 901
Iteration: 5900, train loss: 0.0000, test loss: 0.0007 a : 608, b : 13, y : 621
Iteration: 6000, train loss: 0.0000, test loss: 0.0007 a : 588, b : 125, y : 713
Iteration: 6100, train loss: 0.0000, test loss: 0.0007 a : 625, b : 146, y : 771
Iteration: 6200, train loss: 0.0000, test loss: 0.0007 a : 742, b : 31, y : 773
Iteration: 6300, train loss: 0.0000, test loss: 0.0007 a : 806, b : 115, y : 921
Iteration: 6400, train loss: 0.0000, test loss: 0.0006 a : 897, b : 95, y : 992
Iteration: 6500, train loss: 0.0000, test loss: 0.0006 a : 318, b : 180, y : 498
Iteration: 6600, train loss: 0.0000, test loss: 0.0006 a : 915, b : 46, y : 961
Iteration: 6700, train loss: 0.0000, test loss: 0.0006 a : 75, b : 216, y : 291
Iteration: 6800, train loss: 0.0000, test loss: 0.0006 a : 620, b : 1, y : 621
Iteration: 6900, train loss: 0.0000, test loss: 0.0006 a : 806, b : 224, y : 1030
Iteration: 7000, train loss: 0.0000, test loss: 0.0006 a : 163, b : 34, y : 197
Iteration: 7100, train loss: 0.0000, test loss: 0.0006 a : 187, b : 157, y : 344
Iteration: 7200, train loss: 0.0000, test loss: 0.0006 a : 379, b : 55, y : 434
Iteration: 7300, train loss: 0.0000, test loss: 0.0006 a : 435, b : 123, y : 558
Iteration: 7400, train loss: 0.0000, test loss: 0.0005 a : 15, b : 220, y : 235
Iteration: 7500, train loss: 0.0000, test loss: 0.0005 a : 847, b : 60, y : 907
Iteration: 7600, train loss: 0.0000, test loss: 0.0005 a : 29, b : 168, y : 197
Iteration: 7700, train loss: 0.0000, test loss: 0.0005 a : 181, b : 77, y : 258
Iteration: 7800, train loss: 0.0000, test loss: 0.0005 a : 248, b : 244, y : 492
Iteration: 7900, train loss: 0.0000, test loss: 0.0005 a : 314, b : 100, y : 414
Iteration: 8000, train loss: 0.0000, test loss: 0.0005 a : 783, b : 147, y : 930
Iteration: 8100, train loss: 0.0000, test loss: 0.0005 a : 626, b : 26, y : 652
Iteration: 8200, train loss: 0.0000, test loss: 0.0005 a : 1003, b : 215, y : 1218
Iteration: 8300, train loss: 0.0000, test loss: 0.0005 a : 687, b : 240, y : 927
Iteration: 8400, train loss: 0.0000, test loss: 0.0005 a : 438, b : 117, y : 555
Iteration: 8500, train loss: 0.0000, test loss: 0.0005 a : 106, b : 146, y : 252
Iteration: 8600, train loss: 0.0000, test loss: 0.0005 a : 1013, b : 67, y : 1080
Iteration: 8700, train loss: 0.0000, test loss: 0.0005 a : 1011, b : 132, y : 1143
Iteration: 8800, train loss: 0.0000, test loss: 0.0005 a : 594, b : 10, y : 604
Iteration: 8900, train loss: 0.0000, test loss: 0.0005 a : 196, b : 9, y : 205
Iteration: 9000, train loss: 0.0000, test loss: 0.0005 a : 371, b : 217, y : 588
Iteration: 9100, train loss: 0.0000, test loss: 0.0005 a : 348, b : 156, y : 504
Iteration: 9200, train loss: 0.0000, test loss: 0.0005 a : 180, b : 235, y : 415
Iteration: 9300, train loss: 0.0000, test loss: 0.0005 a : 168, b : 165, y : 333
Iteration: 9400, train loss: 0.0000, test loss: 0.0005 a : 575, b : 26, y : 601
Iteration: 9500, train loss: 0.0000, test loss: 0.0005 a : 236, b : 152, y : 388
Iteration: 9600, train loss: 0.0000, test loss: 0.0005 a : 825, b : 234, y : 1059
Iteration: 9700, train loss: 0.0000, test loss: 0.0005 a : 208, b : 38, y : 246
Iteration: 9800, train loss: 0.0000, test loss: 0.0005 a : 171, b : 55, y : 226
Iteration: 9900, train loss: 0.0000, test loss: 0.0005 a : 565, b : 236, y : 801
Most recent NLP (Natural Language Processing) applications are built on RNNs; RNN-based NLP techniques are used for machine translation, automatic summarization, and more.
For machine translation in particular, TensorFlow's Word2Vec and Seq2Seq tutorials explain the approach well.
In older text analysis, the dominant method was Bag of Words, which assigns one integer index to each word.
The resulting word vector's size equals the number of distinct words in the dataset.
If a dataset contains the 11 words "I", "You", "He", "She", "am", "are", "is", "a", "an", "boy", "girl", integer indices are assigned as follows:
"I" : 0, "You" : 1, "He" : 2, "She" : 3, "am" : 4, "are" : 5, "is" : 6, "a" : 7, "an" : 8, "boy" : 9, "girl" : 10
Words indexed this way can be represented as sparse vectors, such as one-hot vectors, in which only the position matching the word's index holds a 1 and every other position holds a 0.
ex) "am" = [0,0,0,0,1,0,0,0,0,0,0]
If a dataset holds only a dozen or so distinct words, a sparse representation is worth considering; but such cases are rare, and most tasks involve tens of thousands of words or more, so one-hot vectors waste a great deal of space.
When there are N distinct words, representing each word with far fewer dimensions of real numbers instead of an N-dimensional vector yields a substantial saving in space, for example:
"I" : [0.1, 0.5], "You" :[0.2,0.4], "He" : [0.2,0.6], "She" : [0.2,0.3] "am" : [0.5,0.2], "are" : [0.5,0.3], "is" : [0.5,0.4], "a" : [0.7,0.1] "an" : [0.7,0.2] , "boy" : [0.9,0.1], "girl" : [0.9,0.2]
이러한 표현 방식을 밀집 벡터(Dense vector)라고 한다.
데이터셋에 존재하는 단어들을 밀집 벡터 형태로 표현하는 방법을 워드 임베딩(Word embedding)이라고 하고, 워드 임베딩의 결과물을 임베딩 벡터라고 한다.
The table below compares the characteristics of embedding vectors and one-hot vectors:
Aspect | One-hot vector | Embedding vector |
---|---|---|
Dimensionality | High (N, the number of distinct words) | Low (chosen at embedding time) |
How it is obtained | Manually specified by the user | Learned from the training data |
Value type | Integer, 1s and 0s | Real numbers |
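As a small illustrative sketch of the contrast (the vocabulary is the 11-word example above; the embedding values are random stand-ins for what training would actually learn):
import numpy as np

vocab = ["I", "You", "He", "She", "am", "are", "is", "a", "an", "boy", "girl"]
word2idx = {w: i for i, w in enumerate(vocab)}

# One-hot: an 11-dimensional sparse vector per word
one_hot_am = np.zeros(len(vocab))
one_hot_am[word2idx["am"]] = 1  # [0 0 0 0 1 0 0 0 0 0 0]

# Embedding: each word is one row of a (vocab_size, embedding_dim) matrix
embedding_dim = 2
embeddings = np.random.uniform(0, 1, size=(len(vocab), embedding_dim))
dense_am = embeddings[word2idx["am"]]  # a 2-dimensional dense vector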
As the figure below suggests, if words with similar meanings are embedded to similar vectors, vector addition and subtraction can be used to find related words, find words standing in the same relationship, and so on.
The most widely used word-embedding algorithm is word2vec, which trains on a word dataset so that similar words end up with similar vector values.
word2vec's basic idea starts from the observation that words with similar meanings tend to appear in similar positions within sentences.
You shall know a word by the company it keeps. - J.R. Firth (1957)
from six.moves import urllib
import tensorflow as tf
import numpy as np
import errno
import os
import zipfile

WORDS_PATH = "datasets/words"
WORDS_URL = 'http://mattmahoney.net/dc/text8.zip'

def fetch_words_data(words_url=WORDS_URL, words_path=WORDS_PATH):
    os.makedirs(words_path, exist_ok=True)
    zip_path = os.path.join(words_path, "words.zip")
    if not os.path.exists(zip_path):
        urllib.request.urlretrieve(words_url, zip_path)
    with zipfile.ZipFile(zip_path) as f:
        data = f.read(f.namelist()[0])
    return data.decode("ascii").split()
words = fetch_words_data()
words
['anarchism', 'originated', 'as', 'a', 'term', 'of', 'abuse', 'first', 'used', 'against', 'early', 'working', 'class', 'radicals', 'including', 'the', 'diggers', 'of', 'the', 'english', 'revolution', 'and', 'the', 'sans', 'culottes', 'of', 'the', 'french', 'revolution', 'whilst', 'the', 'term', 'is', 'still', 'used', 'in', 'a', 'pejorative', 'way', ...]
(the notebook output lists several hundred more words of the text8 corpus)
len(words)
17005207
from collections import Counter
vocabulary_size = 50000
# add words to the dictionary in order of frequency and match each word to an index number
vocabulary = [("UNK", None)] + Counter(words).most_common(vocabulary_size - 1)
vocabulary = np.array([word for word, _ in vocabulary])
dictionary = {word: code for code, word in enumerate(vocabulary)}
# index the words (the text data) against the dictionary's index values
data = np.array([dictionary.get(word, 0) for word in words])
" ".join(words[:9]), data[:9]
('anarchism originated as a term of abuse first used', array([5234, 3081, 12, 6, 195, 2, 3134, 46, 59]))
" ".join([vocabulary[word_index] for word_index in [5241, 3081, 12, 6, 195, 2, 3134, 46, 59]])
'cycles originated as a term of abuse first used'
from collections import deque
def generate_batch(batch_size, num_skips, skip_window):
    global data_index
    assert batch_size % num_skips == 0
    assert num_skips <= 2 * skip_window
    batch = np.ndarray(shape=[batch_size], dtype=np.int32)
    labels = np.ndarray(shape=[batch_size, 1], dtype=np.int32)
    span = 2 * skip_window + 1  # [ skip_window target skip_window ]
    buffer = deque(maxlen=span)
    for _ in range(span):
        buffer.append(data[data_index])
        data_index = (data_index + 1) % len(data)
    for i in range(batch_size // num_skips):
        target = skip_window  # target label at the center of the buffer
        targets_to_avoid = [skip_window]
        for j in range(num_skips):
            while target in targets_to_avoid:
                target = np.random.randint(0, span)
            targets_to_avoid.append(target)
            batch[i * num_skips + j] = buffer[skip_window]
            labels[i * num_skips + j, 0] = buffer[target]
        buffer.append(data[data_index])
        data_index = (data_index + 1) % len(data)
    return batch, labels
np.random.seed(42)
data_index = 0
batch, labels = generate_batch(8, 2, 1)
batch, [vocabulary[word] for word in batch]
(array([3081, 3081, 12, 12, 6, 6, 195, 195]), ['originated', 'originated', 'as', 'as', 'a', 'a', 'term', 'term'])
labels, [vocabulary[word] for word in labels[:, 0]]
(array([[ 12], [5234], [ 6], [3081], [ 12], [ 195], [ 2], [ 6]]), ['as', 'anarchism', 'a', 'originated', 'as', 'term', 'of', 'a'])
batch_size = 128
embedding_size = 128 # Dimension of the embedding vector.
skip_window = 1 # How many words to consider left and right.
num_skips = 2 # How many times to reuse an input to generate a label.
# We pick a random validation set to sample nearest neighbors. Here we limit the
# validation samples to the words that have a low numeric ID, which by
# construction are also the most frequent.
valid_size = 16 # Random set of words to evaluate similarity on.
valid_window = 100 # Only pick dev samples in the head of the distribution.
valid_examples = np.random.choice(valid_window, valid_size, replace=False)
num_sampled = 64 # Number of negative examples to sample.
learning_rate = 0.01
reset_graph()
# Input data.
train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])
valid_dataset = tf.constant(valid_examples, dtype=tf.int32)
vocabulary_size = 50000
embedding_size = 150
# Look up embeddings for inputs.
init_embeds = tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0)
embeddings = tf.Variable(init_embeds)
train_inputs = tf.placeholder(tf.int32, shape=[None])
embed = tf.nn.embedding_lookup(embeddings, train_inputs)
# Construct the variables for the NCE loss
nce_weights = tf.Variable(
tf.truncated_normal([vocabulary_size, embedding_size],
stddev=1.0 / np.sqrt(embedding_size)))
nce_biases = tf.Variable(tf.zeros([vocabulary_size]))
# Compute the average NCE loss for the batch.
# tf.nn.nce_loss automatically draws a new sample of the negative labels each
# time we evaluate the loss.
loss = tf.reduce_mean(
tf.nn.nce_loss(nce_weights, nce_biases, train_labels, embed,
num_sampled, vocabulary_size))
# Construct the Adam optimizer
optimizer = tf.train.AdamOptimizer(learning_rate)
training_op = optimizer.minimize(loss)
# Compute the cosine similarity between minibatch examples and all embeddings.
norm = tf.sqrt(tf.reduce_sum(tf.square(embeddings), axis=1, keepdims=True))
normalized_embeddings = embeddings / norm
valid_embeddings = tf.nn.embedding_lookup(normalized_embeddings, valid_dataset)
similarity = tf.matmul(valid_embeddings, normalized_embeddings, transpose_b=True)
# Add variable initializer.
init = tf.global_variables_initializer()
num_steps = 10001
with tf.Session() as session:
    init.run()
    average_loss = 0
    for step in range(num_steps):
        print("\rIteration: {}".format(step), end="\t")
        batch_inputs, batch_labels = generate_batch(batch_size, num_skips, skip_window)
        feed_dict = {train_inputs: batch_inputs, train_labels: batch_labels}
        # We perform one update step by evaluating the training op (including it
        # in the list of returned values for session.run())
        _, loss_val = session.run([training_op, loss], feed_dict=feed_dict)
        average_loss += loss_val
        if step % 2000 == 0:
            if step > 0:
                average_loss /= 2000
            # The average loss is an estimate of the loss over the last 2000 batches.
            print("Average loss at step ", step, ": ", average_loss)
            average_loss = 0
        # Note that this is expensive (~20% slowdown if computed every 500 steps)
        if step % 10000 == 0:
            sim = similarity.eval()
            for i in range(valid_size):
                valid_word = vocabulary[valid_examples[i]]
                top_k = 8  # number of nearest neighbors
                nearest = (-sim[i, :]).argsort()[1:top_k+1]
                log_str = "Nearest to %s:" % valid_word
                for k in range(top_k):
                    close_word = vocabulary[nearest[k]]
                    log_str = "%s %s," % (log_str, close_word)
                print(log_str)
    final_embeddings = normalized_embeddings.eval()
Iteration: 0 Average loss at step 0 : 289.90948486328125 Nearest to over: tt, tuned, manichaeans, fractional, cambridge, balaguer, fluoride, strenuously, Nearest to one: imagines, tijuana, hindrance, motorcyclist, steadfastly, lords, letting, hutchinson, Nearest to were: bezier, antibodies, nicknamed, panthers, compiler, tao, smarter, busy, Nearest to may: failure, rna, efficacious, aspirin, lecompton, definitive, geese, amphibious, Nearest to two: annihilate, bettors, wir, cindy, epinephrine, team, voluntarily, crystallize, Nearest to its: knob, abeokuta, bracelet, bastards, ivens, objectivity, blanton, cold, Nearest to than: lame, watts, stones, sram, elves, zarqawi, applets, cloves, Nearest to these: pedro, condoned, neck, ssn, supervising, doug, thereto, melton, Nearest to they: lowly, deportation, shrewd, reznor, tojo, decadent, occured, risotto, Nearest to is: interests, golfers, dropouts, egyptians, richards, legionnaires, opener, leonel, Nearest to up: clair, drives, steadfast, missed, nashville, kilowatts, anal, vinland, Nearest to he: transitioned, winchell, resh, goldsmiths, standardised, markings, pursued, satirized, Nearest to people: blissymbolics, mike, buffers, untouchables, carolingian, posted, ville, hypertalk, Nearest to more: cactus, sta, reformation, poets, diligently, rsc, ravaged, nabokov, Nearest to was: russo, rammed, investiture, glucagon, heck, adventurer, sharada, homing, Nearest to UNK: reykjav, fi, rosalyn, mainline, archaeologist, armstrong, stevenage, ean, Iteration: 2000 Average loss at step 2000 : 132.04802756881713 Iteration: 4000 Average loss at step 4000 : 62.3403196284771 Iteration: 6000 Average loss at step 6000 : 41.11393769288063 Iteration: 8000 Average loss at step 8000 : 31.31713091826439 Iteration: 10000 Average loss at step 10000 : 25.68695637321472 Nearest to over: and, starvation, years, nine, draft, mctaggart, worst, for, Nearest to one: nine, four, five, three, two, seven, six, UNK, Nearest to were: are, loving, coulomb, peuple, pahlavi, accordion, be, prey, Nearest to may: seo, osce, egyptology, two, could, omotic, nine, absurd, Nearest to two: zero, three, four, one, nine, five, eight, six, Nearest to its: the, astatine, advocating, altaic, workings, atomists, nascar, confirm, Nearest to than: asparagales, levitt, more, subsets, nabokov, conformation, bradley, minnesota, Nearest to these: displaced, snowball, tuned, antigua, seceded, fallacy, or, kournikova, Nearest to they: it, vaginal, not, milne, lincoln, that, clues, starved, Nearest to is: are, exists, ampere, mosque, ions, logan, subgroups, in, Nearest to up: integrity, a, flow, laughton, peake, rn, bodyguard, natively, Nearest to he: not, his, was, it, hoxha, carrot, this, domestically, Nearest to people: harbored, simulate, typewriter, fins, men, wikipedia, simply, ignoring, Nearest to more: subproblems, kierkegaard, representations, aruba, lubricants, tetrapods, kano, astatine, Nearest to was: walther, passengers, by, hoxha, and, in, plutonium, breasted, Nearest to UNK: one, the, nine, six, and, five, seven, ginsberg,
np.save("./my_final_embeddings.npy", final_embeddings)
final_embeddings = np.load("./my_final_embeddings.npy")
import matplotlib.pyplot as plt

def plot_with_labels(low_dim_embs, labels):
    assert low_dim_embs.shape[0] >= len(labels), "More labels than embeddings"
    plt.figure(figsize=(18, 18))  # in inches
    for i, label in enumerate(labels):
        x, y = low_dim_embs[i, :]
        plt.scatter(x, y)
        plt.annotate(label,
                     xy=(x, y),
                     xytext=(5, 2),
                     textcoords='offset points',
                     ha='right',
                     va='bottom')
from sklearn.manifold import TSNE
tsne = TSNE(perplexity=30, n_components=2, init='pca', n_iter=5000)
plot_only = 500
low_dim_embs = tsne.fit_transform(final_embeddings[:plot_only,:])
labels = [vocabulary[i] for i in range(plot_only)]
plot_with_labels(low_dim_embs, labels)
def get_embedding_vector(word):
    if word in labels:
        return final_embeddings[labels.index(word)]
    else:
        return final_embeddings[labels.index("UNK")]

def get_word_from_embedding_vector(vector):
    # the original searchsorted-based lookup was broken; since the embeddings
    # are L2-normalized, the nearest word is the one whose embedding has the
    # largest dot product (cosine similarity) with the given vector
    similarities = final_embeddings[:len(labels)].dot(vector)
    return labels[int(np.argmax(similarities))]
book = get_embedding_vector("book")
books = get_embedding_vector("books")
print(book)
print(books)
[ 8.11967477e-02 -8.70217234e-02 -1.71966329e-01 -3.10007334e-02 1.22588813e-01 4.15035449e-02 -3.97832841e-02 1.03306666e-01 -5.13740666e-02 -1.50740325e-05 2.36677695e-02 4.94580008e-02 1.03616439e-01 3.71415205e-02 1.41070494e-02 -5.71096875e-02 -1.12072276e-02 -4.99432683e-02 -1.10635474e-01 9.15328860e-02 -1.11827679e-01 -9.32430178e-02 3.96184884e-02 -3.30627784e-02 -2.29057353e-02 6.84185475e-02 5.60284182e-02 1.02757953e-01 5.07382788e-02 -4.47140373e-02 5.89091182e-02 -2.92926375e-02 1.21545300e-01 -1.01840749e-01 -7.09879026e-02 -7.19989985e-02 -2.84198150e-02 2.27203071e-02 1.92235224e-02 1.26509935e-01 4.76578698e-02 -9.06215236e-02 -1.18482113e-02 6.88924864e-02 -3.59332263e-02 1.84279121e-02 1.53495401e-01 -1.53897218e-02 -1.96214676e-01 -5.81934154e-02 5.59540428e-02 5.05696535e-02 -4.57691699e-02 6.03613183e-02 1.41751757e-02 6.23774230e-02 6.63828552e-02 -1.43312952e-02 -1.52248427e-01 -1.10419700e-02 1.82753261e-02 8.35772008e-02 2.01262292e-02 1.63935460e-02 5.77252842e-02 -7.00234610e-04 -4.86567281e-02 7.78157860e-02 -7.00143278e-02 1.38980582e-01 -5.97267710e-02 -1.29259238e-02 2.76592299e-02 -3.16436328e-02 9.49173048e-03 -2.47863587e-02 2.88317911e-02 -4.44818251e-02 -2.39930283e-02 1.40019014e-01 -8.95779282e-02 -1.65202260e-01 -7.07005411e-02 1.19936042e-01 5.09924926e-02 9.66727827e-03 -4.41595577e-02 -1.83375463e-01 3.07281334e-02 2.93952804e-02 -1.21979989e-01 4.80465218e-02 -6.44721417e-03 -2.73825973e-02 9.73795578e-02 4.86021526e-02 2.05175385e-01 1.24760300e-01 -8.92800763e-02 8.69385898e-02 2.04830050e-01 -1.07171685e-01 5.26494235e-02 -7.12878108e-02 8.87956768e-02 9.40816775e-02 1.01239927e-01 2.13408485e-01 -8.22283253e-02 -2.02811107e-01 1.07662879e-01 -4.89561334e-02 -1.61828786e-01 -8.70434418e-02 5.63305020e-02 1.48246333e-01 1.54790683e-02 -2.32865196e-02 -5.04413508e-02 -1.07120126e-02 -5.36555052e-02 3.38100009e-02 -4.48867083e-02 4.44815233e-02 7.44109526e-02 5.25817312e-02 -6.95396727e-03 -1.09779701e-01 -1.69802997e-02 1.66244358e-02 -1.24319933e-01 -2.94107012e-02 -1.05879419e-01 4.31345664e-02 1.10730547e-02 4.19257134e-02 -8.10891092e-02 1.03221834e-01 -3.60140651e-02 -1.18757915e-02 7.07798311e-03 -1.04610033e-01 7.36901462e-02 3.17069888e-02 -4.37907502e-02 2.61099208e-02 1.89848542e-02 1.10179439e-01 5.62798418e-02 -8.57226029e-02] [ 0.09614198 -0.06738428 -0.12582357 -0.05169483 0.05552461 -0.0465885 0.00585051 0.10951979 -0.00273592 -0.0296318 0.02932433 -0.00595051 0.04685732 0.04693311 -0.13873982 -0.02362289 0.02116635 -0.09165668 0.03374662 0.27602634 -0.02162869 0.03618298 0.0780734 -0.0642141 -0.0074903 0.02315407 -0.09129235 0.07237143 0.04778126 -0.02027319 0.15120961 -0.12684101 -0.05345874 -0.02620143 -0.11929218 0.02733118 -0.0644905 -0.00139687 -0.1449325 0.00414733 0.0194439 -0.0203229 0.00696965 0.00954631 0.05373196 0.15218425 0.08328106 -0.08442029 -0.06138497 -0.09724754 -0.24286485 0.04632554 -0.06137327 0.11042858 0.12580295 0.06829493 0.0738036 -0.02920705 0.05265538 0.11691396 -0.00579731 0.15490174 -0.0333741 0.03474244 0.0894068 0.13453168 -0.12202108 0.03192778 -0.09567177 -0.1128623 -0.02608141 -0.03973531 -0.03200904 -0.01274096 -0.04744675 0.05102651 -0.05462563 -0.03985818 0.02517422 0.11964259 0.00870021 -0.09088323 -0.0014794 0.05104452 -0.06488837 -0.02799428 -0.07138211 0.05330136 0.16549452 0.10190323 -0.01939799 -0.1259147 0.09669183 0.08114678 0.03704776 -0.00848902 0.02221559 -0.0249784 -0.11121914 0.00322947 0.0600058 -0.04229376 -0.07224835 0.02624182 0.139776 -0.04121959 -0.02445537 0.06252084 
0.00790126 -0.11362157 0.03540715 0.04894249 -0.11359426 0.11589114 -0.03024011 -0.10312349 -0.06332789 -0.07424396 -0.02119079 0.09676112 -0.09522741 -0.10041434 0.00058978 -0.10675956 0.02921161 0.12029356 -0.04405495 -0.05087327 0.02333982 -0.11894476 -0.11395201 -0.03469153 0.06921869 -0.04496962 -0.04514514 0.0011107 0.00574808 0.10513358 -0.08161799 -0.04749455 0.09605267 -0.1775613 0.13657683 0.02491989 -0.0903208 0.03579475 0.13865666 0.01744475 0.00098767 -0.13164486]
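As a quick, illustrative sanity check of the repaired nearest-word helper above (the exact values depend on the embeddings learned in this particular run):
print(get_word_from_embedding_vector(book))  # should recover 'book'
print(np.dot(book, books))                   # cosine similarity, since the rows are L2-normalized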
A seq2seq model built with RNNs takes several sequences of varying lengths as input and produces sequences as output.
The canonical application of seq2seq models is machine translation: the model takes a sentence (a sequence) as input and outputs a sentence (a sequence) in another language.
This implementation is the answer to exercise 9.
import os
import pickle
import copy
import numpy as np
def load_data(path):
    input_file = os.path.join(path)
    with open(input_file, 'r', encoding='utf-8') as f:
        data = f.read()
    return data
The dataset is a reduced version of the WMT'10 French-English corpus.
source_path = 'data/small_vocab_en'
target_path = 'data/small_vocab_fr'
source_text = load_data(source_path)
target_text = load_data(target_path)
Inspect the composition of the dataset.
import numpy as np
from collections import Counter
print('Dataset Brief Stats')
print('* number of unique words in English sample sentences: {}\
[this is roughly measured/without any preprocessing]'.format(len(Counter(source_text.split()))))
print()
english_sentences = source_text.split('\n')
print('* English sentences')
print('\t- number of sentences: {}'.format(len(english_sentences)))
print('\t- avg. number of words in a sentence: {}'.format(np.average([len(sentence.split()) for sentence in english_sentences])))
french_sentences = target_text.split('\n')
print('* French sentences')
print('\t- number of sentences: {} [data integrity check / should have the same number]'.format(len(french_sentences)))
print('\t- avg. number of words in a sentence: {}'.format(np.average([len(sentence.split()) for sentence in french_sentences])))
print()
sample_sentence_range = (0, 5)
side_by_side_sentences = list(zip(english_sentences, french_sentences))[sample_sentence_range[0]:sample_sentence_range[1]]
print('* Sample sentences range from {} to {}'.format(sample_sentence_range[0], sample_sentence_range[1]))
for index, sentence in enumerate(side_by_side_sentences):
    en_sent, fr_sent = sentence
    print('[{}-th] sentence'.format(index+1))
    print('\tEN: {}'.format(en_sent))
    print('\tFR: {}'.format(fr_sent))
    print()
Dataset Brief Stats * number of unique words in English sample sentences: 227 [this is roughly measured/without any preprocessing] * English sentences - number of sentences: 137861 - avg. number of words in a sentence: 13.225277634719028 * French sentences - number of sentences: 137861 [data integrity check / should have the same number] - avg. number of words in a sentence: 14.226612312401622 * Sample sentences range from 0 to 5 [1-th] sentence EN: new jersey is sometimes quiet during autumn , and it is snowy in april . FR: new jersey est parfois calme pendant l' automne , et il est neigeux en avril . [2-th] sentence EN: the united states is usually chilly during july , and it is usually freezing in november . FR: les états-unis est généralement froid en juillet , et il gèle habituellement en novembre . [3-th] sentence EN: california is usually quiet during march , and it is usually hot in june . FR: california est généralement calme en mars , et il est généralement chaud en juin . [4-th] sentence EN: the united states is sometimes mild during june , and it is cold in september . FR: les états-unis est parfois légère en juin , et il fait froid en septembre . [5-th] sentence EN: your least liked fruit is the grape , but my least liked is the apple . FR: votre moins aimé fruit est le raisin , mais mon moins aimé est la pomme .
Create lookup table
Create two kinds of mapping tables:
vocab_to_int -> (key, value) == (unique word string, its unique index): used for training the classifier and for converting inputs into embedding vectors -> (1)
int_to_vocab -> (key, value) == (its unique index, unique word string): a lookup table for converting the outputs back into words -> (2)
CODES = {'<PAD>': 0, '<EOS>': 1, '<UNK>': 2, '<GO>': 3 }
def create_lookup_tables(text):
    # make a list of unique words
    vocab = set(text.split())

    # (1)
    # starts with the special tokens
    vocab_to_int = copy.copy(CODES)

    # the index (v_i) will start from 4 (the 2nd arg of enumerate() specifies the starting index)
    # since vocab_to_int already contains the special tokens
    for v_i, v in enumerate(vocab, len(CODES)):
        vocab_to_int[v] = v_i

    # (2)
    int_to_vocab = {v_i: v for v, v_i in vocab_to_int.items()}

    return vocab_to_int, int_to_vocab
Text to Word Ids
Convert the raw data (strings) into index values using the lookup tables. After this conversion, the text is stored as a 2-D array in which each row is a sentence and each column holds a word's index value.
def text_to_ids(source_text, target_text, source_vocab_to_int, target_vocab_to_int):
    """
    1st, 2nd args: raw string text to be converted
    3rd, 4th args: lookup tables for 1st and 2nd args respectively

    return: A tuple of lists (source_id_text, target_id_text) converted
    """
    # empty lists of converted sentences
    source_text_id = []
    target_text_id = []

    # make a list of sentences (extraction)
    source_sentences = source_text.split("\n")
    target_sentences = target_text.split("\n")

    max_source_sentence_length = max([len(sentence.split(" ")) for sentence in source_sentences])
    max_target_sentence_length = max([len(sentence.split(" ")) for sentence in target_sentences])

    # iterate through each sentence (the number of sentences in source & target is the same)
    for i in range(len(source_sentences)):
        # extract sentences one by one
        source_sentence = source_sentences[i]
        target_sentence = target_sentences[i]

        # make a list of tokens/words (extraction) from the chosen sentence
        source_tokens = source_sentence.split(" ")
        target_tokens = target_sentence.split(" ")

        # empty lists for the chosen sentence's words converted to indices
        source_token_id = []
        target_token_id = []

        for index, token in enumerate(source_tokens):
            if (token != ""):
                source_token_id.append(source_vocab_to_int[token])

        for index, token in enumerate(target_tokens):
            if (token != ""):
                target_token_id.append(target_vocab_to_int[token])

        # put the <EOS> token at the end of the chosen target sentence
        # this token signals when to stop creating a sequence
        target_token_id.append(target_vocab_to_int['<EOS>'])

        # add each converted sentence to the final list
        source_text_id.append(source_token_id)
        target_text_id.append(target_token_id)

    return source_text_id, target_text_id
Preprocess and save the data
def preprocess_and_save_data(source_path, target_path, text_to_ids):
    # Preprocess

    # load original data (English, French)
    source_text = load_data(source_path)
    target_text = load_data(target_path)

    # to lower case
    source_text = source_text.lower()
    target_text = target_text.lower()

    # create lookup tables for English and French data
    source_vocab_to_int, source_int_to_vocab = create_lookup_tables(source_text)
    target_vocab_to_int, target_int_to_vocab = create_lookup_tables(target_text)

    # create lists of sentences whose words are represented by indices
    source_text, target_text = text_to_ids(source_text, target_text, source_vocab_to_int, target_vocab_to_int)

    # Save data for later use
    pickle.dump((
        (source_text, target_text),
        (source_vocab_to_int, target_vocab_to_int),
        (source_int_to_vocab, target_int_to_vocab)), open('preprocess.p', 'wb'))
Run the data preprocessing.
preprocess_and_save_data(source_path, target_path, text_to_ids)
import pickle

def load_preprocess():
    with open('preprocess.p', mode='rb') as in_file:
        return pickle.load(in_file)
import numpy as np
(source_int_text, target_int_text), (source_vocab_to_int, target_vocab_to_int), _ = load_preprocess()
from distutils.version import LooseVersion
import warnings
import tensorflow as tf
from tensorflow.python.layers.core import Dense
# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.1'), 'Please use TensorFlow version 1.1 or newer'
print('TensorFlow Version: {}'.format(tf.__version__))
# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
TensorFlow Version: 1.12.0 Default GPU Device: /device:GPU:0
Build a sequence-to-sequence model consisting of two sub-models: an encoder and a decoder.
The encoder, built as an RNN, takes the raw data and outputs a neural representation, which is then used as the decoder's input to produce the final output.
Through the steps below, the encoder-decoder model is defined and can be used for training and inference.
def enc_dec_model_inputs():
    inputs = tf.placeholder(tf.int32, [None, None], name='input')
    targets = tf.placeholder(tf.int32, [None, None], name='targets')

    target_sequence_length = tf.placeholder(tf.int32, [None], name='target_sequence_length')
    max_target_len = tf.reduce_max(target_sequence_length)

    return inputs, targets, target_sequence_length, max_target_len

def hyperparam_inputs():
    # learning rate
    lr_rate = tf.placeholder(tf.float32, name='lr_rate')
    # keep probability for dropout
    keep_prob = tf.placeholder(tf.float32, name='keep_prob')

    return lr_rate, keep_prob
def process_decoder_input(target_data, target_vocab_to_int, batch_size):
    """
    Preprocess target data for decoding
    :return: Preprocessed target data
    """
    # get the '<GO>' id
    # the <GO> token marks the starting point of the translation
    go_id = target_vocab_to_int['<GO>']

    # tf.strided_slice(): slices a tensor
    # arguments -> tensor, begin, end, strides
    # drop the last token of every target sentence
    after_slice = tf.strided_slice(target_data, [0, 0], [batch_size, -1], [1, 1])
    # tf.fill(): creates a tensor filled with a scalar value
    # tf.concat(): concatenates two tensors
    # prepend the <GO> token to every sentence
    after_concat = tf.concat([tf.fill([batch_size, 1], go_id), after_slice], 1)

    return after_concat
The encoding model consists of an embedding layer and an RNN layer.
The embedding layer is built with tf.contrib.layers.embed_sequence().
The RNN layer is built with the tf.contrib.rnn.LSTMCell(), tf.contrib.rnn.DropoutWrapper(), and tf.contrib.rnn.MultiRNNCell() functions.
def encoding_layer(rnn_inputs, rnn_size, num_layers, keep_prob,
                   source_vocab_size,
                   encoding_embedding_size):
    """
    :return: tuple (RNN output, RNN state)
    """
    embed = tf.contrib.layers.embed_sequence(rnn_inputs,
                                             vocab_size=source_vocab_size,
                                             embed_dim=encoding_embedding_size)

    # MultiRNNCell stacks several RNN cells on top of each other
    # stack num_layers LSTM cells
    stacked_cells = tf.contrib.rnn.MultiRNNCell(
        [tf.contrib.rnn.DropoutWrapper(tf.contrib.rnn.LSTMCell(rnn_size), keep_prob)
         for _ in range(num_layers)])

    # run the stacked RNN layer over the output of the embedding layer
    outputs, state = tf.nn.dynamic_rnn(stacked_cells,
                                       embed,
                                       dtype=tf.float32)
    return outputs, state
The decoding model behaves differently during training and during inference. During training, the value passed on to the next step is fixed by the labels in the target data; during inference, the value passed on is determined dynamically at every step.
Since the two phases consume the embedded data differently, a separate function builds the decoding layer for each.
During training, predefined embedding values corresponding to the inputs are used.
tf.contrib.seq2seq.TrainingHelper() feeds in the input values.
It is a helper used during RNN training that simply reads the inputs and returns the corresponding index values so they can be used at the next step.
def decoding_layer_train(encoder_state, dec_cell, dec_embed_input,
                         target_sequence_length, max_summary_length,
                         output_layer, keep_prob):
    """
    Create a training process in decoding layer
    :return: BasicDecoderOutput containing training logits and sample_id
    """
    dec_cell = tf.contrib.rnn.DropoutWrapper(dec_cell,
                                             output_keep_prob=keep_prob)

    # for only input layer
    helper = tf.contrib.seq2seq.TrainingHelper(dec_embed_input,
                                               target_sequence_length)

    decoder = tf.contrib.seq2seq.BasicDecoder(dec_cell,
                                              helper,
                                              encoder_state,
                                              output_layer)

    # unrolling the decoder layer
    outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder,
                                                      impute_finished=True,
                                                      maximum_iterations=max_summary_length)
    return outputs
At inference time, the output produced at each step must be fed back in as the next input, so it has to be passed through the embedding layer dynamically. tf.contrib.seq2seq.GreedyEmbeddingHelper() passes the current step's output through the embedding layer so it can be used as the next input.
def decoding_layer_infer(encoder_state, dec_cell, dec_embeddings, start_of_sequence_id,
                         end_of_sequence_id, max_target_sequence_length,
                         vocab_size, output_layer, batch_size, keep_prob):
    """
    Create an inference process in decoding layer
    :return: BasicDecoderOutput containing inference logits and sample_id
    """
    dec_cell = tf.contrib.rnn.DropoutWrapper(dec_cell,
                                             output_keep_prob=keep_prob)

    helper = tf.contrib.seq2seq.GreedyEmbeddingHelper(dec_embeddings,
                                                      tf.fill([batch_size], start_of_sequence_id),
                                                      end_of_sequence_id)

    decoder = tf.contrib.seq2seq.BasicDecoder(dec_cell,
                                              helper,
                                              encoder_state,
                                              output_layer)

    outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder,
                                                      impute_finished=True,
                                                      maximum_iterations=max_target_sequence_length)
    return outputs
def decoding_layer(dec_input, encoder_state,
                   target_sequence_length, max_target_sequence_length,
                   rnn_size,
                   num_layers, target_vocab_to_int, target_vocab_size,
                   batch_size, keep_prob, decoding_embedding_size):
    """
    Create decoding layer
    :return: Tuple of (Training BasicDecoderOutput, Inference BasicDecoderOutput)
    """
    target_vocab_size = len(target_vocab_to_int)
    dec_embeddings = tf.Variable(tf.random_uniform([target_vocab_size, decoding_embedding_size]))
    dec_embed_input = tf.nn.embedding_lookup(dec_embeddings, dec_input)
    cells = tf.contrib.rnn.MultiRNNCell([tf.contrib.rnn.LSTMCell(rnn_size) for _ in range(num_layers)])

    with tf.variable_scope("decode"):
        output_layer = tf.layers.Dense(target_vocab_size)
        train_output = decoding_layer_train(encoder_state,
                                            cells,
                                            dec_embed_input,
                                            target_sequence_length,
                                            max_target_sequence_length,
                                            output_layer,
                                            keep_prob)

    with tf.variable_scope("decode", reuse=True):
        infer_output = decoding_layer_infer(encoder_state,
                                            cells,
                                            dec_embeddings,
                                            target_vocab_to_int['<GO>'],
                                            target_vocab_to_int['<EOS>'],
                                            max_target_sequence_length,
                                            target_vocab_size,
                                            output_layer,
                                            batch_size,
                                            keep_prob)

    return (train_output, infer_output)
Define the function that combines the functions defined in steps 1-6 into the full seq2seq model.
def seq2seq_model(input_data, target_data, keep_prob, batch_size,
                  target_sequence_length,
                  max_target_sentence_length,
                  source_vocab_size, target_vocab_size,
                  enc_embedding_size, dec_embedding_size,
                  rnn_size, num_layers, target_vocab_to_int):
    """
    Build the Sequence-to-Sequence model
    :return: Tuple of (Training BasicDecoderOutput, Inference BasicDecoderOutput)
    """
    enc_outputs, enc_states = encoding_layer(input_data,
                                             rnn_size,
                                             num_layers,
                                             keep_prob,
                                             source_vocab_size,
                                             enc_embedding_size)

    dec_input = process_decoder_input(target_data,
                                      target_vocab_to_int,
                                      batch_size)

    train_output, infer_output = decoding_layer(dec_input,
                                                enc_states,
                                                target_sequence_length,
                                                max_target_sentence_length,
                                                rnn_size,
                                                num_layers,
                                                target_vocab_to_int,
                                                target_vocab_size,
                                                batch_size,
                                                keep_prob,
                                                dec_embedding_size)

    return train_output, infer_output
Set the hyperparameters for building and training the model.
display_step = 300
epochs = 13
batch_size = 128
rnn_size = 128
num_layers = 3
encoding_embedding_size = 150
decoding_embedding_size = 150
learning_rate = 0.001
keep_probability = 0.5
save_path = 'checkpoints/dev'
(source_int_text, target_int_text), (source_vocab_to_int, target_vocab_to_int), _ = load_preprocess()
max_target_sentence_length = max([len(sentence) for sentence in target_int_text])
train_graph = tf.Graph()
with train_graph.as_default():
    input_data, targets, target_sequence_length, max_target_sequence_length = enc_dec_model_inputs()
    lr, keep_prob = hyperparam_inputs()

    train_logits, inference_logits = seq2seq_model(tf.reverse(input_data, [-1]),
                                                   targets,
                                                   keep_prob,
                                                   batch_size,
                                                   target_sequence_length,
                                                   max_target_sequence_length,
                                                   len(source_vocab_to_int),
                                                   len(target_vocab_to_int),
                                                   encoding_embedding_size,
                                                   decoding_embedding_size,
                                                   rnn_size,
                                                   num_layers,
                                                   target_vocab_to_int)

    training_logits = tf.identity(train_logits.rnn_output, name='logits')
    inference_logits = tf.identity(inference_logits.sample_id, name='predictions')

    # https://www.tensorflow.org/api_docs/python/tf/sequence_mask
    # - Returns a mask tensor representing the first N positions of each cell.
    masks = tf.sequence_mask(target_sequence_length, max_target_sequence_length, dtype=tf.float32, name='masks')

    with tf.name_scope("optimization"):
        # Loss function - weighted softmax cross entropy
        cost = tf.contrib.seq2seq.sequence_loss(
            training_logits,
            targets,
            masks)

        # Optimizer
        optimizer = tf.train.AdamOptimizer(lr)

        # Gradient Clipping
        gradients = optimizer.compute_gradients(cost)
        capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients if grad is not None]
        train_op = optimizer.apply_gradients(capped_gradients)
def pad_sentence_batch(sentence_batch, pad_int):
    """Pad sentences with <PAD> so that each sentence of a batch has the same length"""
    max_sentence = max([len(sentence) for sentence in sentence_batch])
    return [sentence + [pad_int] * (max_sentence - len(sentence)) for sentence in sentence_batch]
def get_batches(sources, targets, batch_size, source_pad_int, target_pad_int):
    """Batch targets, sources, and the lengths of their sentences together"""
    for batch_i in range(0, len(sources)//batch_size):
        start_i = batch_i * batch_size

        # Slice the right amount for the batch
        sources_batch = sources[start_i:start_i + batch_size]
        targets_batch = targets[start_i:start_i + batch_size]

        # Pad
        pad_sources_batch = np.array(pad_sentence_batch(sources_batch, source_pad_int))
        pad_targets_batch = np.array(pad_sentence_batch(targets_batch, target_pad_int))

        # Need the lengths for the _lengths parameters
        pad_targets_lengths = []
        for target in pad_targets_batch:
            pad_targets_lengths.append(len(target))

        pad_source_lengths = []
        for source in pad_sources_batch:
            pad_source_lengths.append(len(source))

        yield pad_sources_batch, pad_targets_batch, pad_source_lengths, pad_targets_lengths
def get_accuracy(target, logits):
    """
    Calculate accuracy
    """
    max_seq = max(target.shape[1], logits.shape[1])
    if max_seq - target.shape[1]:
        target = np.pad(
            target,
            [(0, 0), (0, max_seq - target.shape[1])],
            'constant')
    if max_seq - logits.shape[1]:
        logits = np.pad(
            logits,
            [(0, 0), (0, max_seq - logits.shape[1])],
            'constant')

    return np.mean(np.equal(target, logits))
# Split data to training and validation sets
train_source = source_int_text[batch_size:]
train_target = target_int_text[batch_size:]
valid_source = source_int_text[:batch_size]
valid_target = target_int_text[:batch_size]
(valid_sources_batch, valid_targets_batch, valid_sources_lengths, valid_targets_lengths ) = next(get_batches(valid_source,
valid_target,
batch_size,
source_vocab_to_int['<PAD>'],
target_vocab_to_int['<PAD>']))
with tf.Session(graph=train_graph) as sess:
    sess.run(tf.global_variables_initializer())

    for epoch_i in range(epochs):
        for batch_i, (source_batch, target_batch, sources_lengths, targets_lengths) in enumerate(
                get_batches(train_source, train_target, batch_size,
                            source_vocab_to_int['<PAD>'],
                            target_vocab_to_int['<PAD>'])):
            _, loss = sess.run(
                [train_op, cost],
                {input_data: source_batch,
                 targets: target_batch,
                 lr: learning_rate,
                 target_sequence_length: targets_lengths,
                 keep_prob: keep_probability})

            if batch_i % display_step == 0 and batch_i > 0:
                batch_train_logits = sess.run(
                    inference_logits,
                    {input_data: source_batch,
                     target_sequence_length: targets_lengths,
                     keep_prob: 1.0})
                batch_valid_logits = sess.run(
                    inference_logits,
                    {input_data: valid_sources_batch,
                     target_sequence_length: valid_targets_lengths,
                     keep_prob: 1.0})
                train_acc = get_accuracy(target_batch, batch_train_logits)
                valid_acc = get_accuracy(valid_targets_batch, batch_valid_logits)
                print('Epoch {:>3} Batch {:>4}/{} - Train Accuracy: {:>6.4f}, Validation Accuracy: {:>6.4f}, Loss: {:>6.4f}'
                      .format(epoch_i, batch_i, len(source_int_text) // batch_size, train_acc, valid_acc, loss))

    # Save Model
    saver = tf.train.Saver()
    saver.save(sess, save_path)
    print('Model Trained and Saved')
Epoch 0 Batch 300/1077 - Train Accuracy: 0.4215, Validation Accuracy: 0.5110, Loss: 1.9840 Epoch 0 Batch 600/1077 - Train Accuracy: 0.5141, Validation Accuracy: 0.5131, Loss: 1.1287 Epoch 0 Batch 900/1077 - Train Accuracy: 0.5156, Validation Accuracy: 0.5600, Loss: 0.9501 Epoch 1 Batch 300/1077 - Train Accuracy: 0.5831, Validation Accuracy: 0.6200, Loss: 0.7324 Epoch 1 Batch 600/1077 - Train Accuracy: 0.6425, Validation Accuracy: 0.6502, Loss: 0.5848 Epoch 1 Batch 900/1077 - Train Accuracy: 0.6660, Validation Accuracy: 0.6701, Loss: 0.5478 Epoch 2 Batch 300/1077 - Train Accuracy: 0.6780, Validation Accuracy: 0.6839, Loss: 0.4700 Epoch 2 Batch 600/1077 - Train Accuracy: 0.7217, Validation Accuracy: 0.7074, Loss: 0.4078 Epoch 2 Batch 900/1077 - Train Accuracy: 0.7238, Validation Accuracy: 0.7330, Loss: 0.4036 Epoch 3 Batch 300/1077 - Train Accuracy: 0.7278, Validation Accuracy: 0.7379, Loss: 0.3529 Epoch 3 Batch 600/1077 - Train Accuracy: 0.7853, Validation Accuracy: 0.7518, Loss: 0.2771 Epoch 3 Batch 900/1077 - Train Accuracy: 0.8098, Validation Accuracy: 0.8139, Loss: 0.2763 Epoch 4 Batch 300/1077 - Train Accuracy: 0.8725, Validation Accuracy: 0.8438, Loss: 0.2163 Epoch 4 Batch 600/1077 - Train Accuracy: 0.8702, Validation Accuracy: 0.8675, Loss: 0.1758 Epoch 4 Batch 900/1077 - Train Accuracy: 0.8855, Validation Accuracy: 0.8384, Loss: 0.1922 Epoch 5 Batch 300/1077 - Train Accuracy: 0.9243, Validation Accuracy: 0.8697, Loss: 0.1424 Epoch 5 Batch 600/1077 - Train Accuracy: 0.8891, Validation Accuracy: 0.8697, Loss: 0.1355 Epoch 5 Batch 900/1077 - Train Accuracy: 0.9152, Validation Accuracy: 0.8803, Loss: 0.1337 Epoch 6 Batch 300/1077 - Train Accuracy: 0.9449, Validation Accuracy: 0.8782, Loss: 0.0973 Epoch 6 Batch 600/1077 - Train Accuracy: 0.9301, Validation Accuracy: 0.8967, Loss: 0.1025 Epoch 6 Batch 900/1077 - Train Accuracy: 0.9324, Validation Accuracy: 0.8999, Loss: 0.0924 Epoch 7 Batch 300/1077 - Train Accuracy: 0.9626, Validation Accuracy: 0.9119, Loss: 0.0785 Epoch 7 Batch 600/1077 - Train Accuracy: 0.9405, Validation Accuracy: 0.9048, Loss: 0.0765 Epoch 7 Batch 900/1077 - Train Accuracy: 0.9492, Validation Accuracy: 0.9233, Loss: 0.0809 Epoch 8 Batch 300/1077 - Train Accuracy: 0.9585, Validation Accuracy: 0.9137, Loss: 0.0586 Epoch 8 Batch 600/1077 - Train Accuracy: 0.9524, Validation Accuracy: 0.9105, Loss: 0.0633 Epoch 8 Batch 900/1077 - Train Accuracy: 0.9426, Validation Accuracy: 0.9041, Loss: 0.0676 Epoch 9 Batch 300/1077 - Train Accuracy: 0.9675, Validation Accuracy: 0.9254, Loss: 0.0453 Epoch 9 Batch 600/1077 - Train Accuracy: 0.9576, Validation Accuracy: 0.9286, Loss: 0.0559 Epoch 9 Batch 900/1077 - Train Accuracy: 0.9652, Validation Accuracy: 0.9332, Loss: 0.0544 Epoch 10 Batch 300/1077 - Train Accuracy: 0.9630, Validation Accuracy: 0.9371, Loss: 0.0442 Epoch 10 Batch 600/1077 - Train Accuracy: 0.9535, Validation Accuracy: 0.9421, Loss: 0.0522 Epoch 10 Batch 900/1077 - Train Accuracy: 0.9566, Validation Accuracy: 0.9471, Loss: 0.0499 Epoch 11 Batch 300/1077 - Train Accuracy: 0.9605, Validation Accuracy: 0.9393, Loss: 0.0364 Epoch 11 Batch 600/1077 - Train Accuracy: 0.9647, Validation Accuracy: 0.9524, Loss: 0.0378 Epoch 11 Batch 900/1077 - Train Accuracy: 0.9641, Validation Accuracy: 0.9570, Loss: 0.0463 Epoch 12 Batch 300/1077 - Train Accuracy: 0.9663, Validation Accuracy: 0.9364, Loss: 0.0306 Epoch 12 Batch 600/1077 - Train Accuracy: 0.9542, Validation Accuracy: 0.9634, Loss: 0.0414 Epoch 12 Batch 900/1077 - Train Accuracy: 0.9621, Validation Accuracy: 0.9513, Loss: 
0.0455 Model Trained and Saved
def save_params(params):
    with open('params.p', 'wb') as out_file:
        pickle.dump(params, out_file)

def load_params():
    with open('params.p', mode='rb') as in_file:
        return pickle.load(in_file)
# Save parameters for checkpoint
save_params(save_path)
import tensorflow as tf
import numpy as np
_, (source_vocab_to_int, target_vocab_to_int), (source_int_to_vocab, target_int_to_vocab) = load_preprocess()
load_path = load_params()
def sentence_to_seq(sentence, vocab_to_int):
    results = []
    for word in sentence.split(" "):
        if word in vocab_to_int:
            results.append(vocab_to_int[word])
        else:
            results.append(vocab_to_int['<UNK>'])
    return results
translate_sentence = 'i like apple .'
translate_sentence = sentence_to_seq(translate_sentence, source_vocab_to_int)
loaded_graph = tf.Graph()
with tf.Session(graph=loaded_graph) as sess:
    # Load saved model
    loader = tf.train.import_meta_graph(load_path + '.meta')
    loader.restore(sess, load_path)

    input_data = loaded_graph.get_tensor_by_name('input:0')
    logits = loaded_graph.get_tensor_by_name('predictions:0')
    target_sequence_length = loaded_graph.get_tensor_by_name('target_sequence_length:0')
    keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')

    translate_logits = sess.run(logits, {input_data: [translate_sentence]*batch_size,
                                         target_sequence_length: [len(translate_sentence)*2]*batch_size,
                                         keep_prob: 1.0})[0]

print('Input')
print('  Word Ids:      {}'.format([i for i in translate_sentence]))
print('  English Words: {}'.format([source_int_to_vocab[i] for i in translate_sentence]))

print('\nPrediction')
print('  Word Ids:      {}'.format([i for i in translate_logits]))
print('  French Words:  {}'.format(" ".join([target_int_to_vocab[i] for i in translate_logits])))
INFO:tensorflow:Restoring parameters from checkpoints/dev Input Word Ids: [200, 176, 91, 139] English Words: ['i', 'like', 'apple', '.'] Prediction Word Ids: [285, 109, 346, 1] French Words: aimé proches pensez <EOS>
Sequence-to-sequence RNN applications: weather forecasting, machine translation, video captioning, speech-to-text, music generation, identifying the chords of a song.
Sequence-to-vector RNN applications: classifying music samples by genre, sentiment analysis of book reviews, predicting the word an aphasia patient is thinking of from data read off a brain implant, predicting the probability that a user will want to watch a movie based on their viewing history.
Translating a sentence one word at a time generally gives very poor results.
For example, translating the French sentence 'Je vous en prie' ('You are welcome') word by word produces 'I you in pray'. It is far better to read the whole sentence first and then translate it. An ordinary sequence-to-sequence RNN starts translating immediately after reading the first word, whereas an encoder-decoder RNN first reads the whole sentence and only then translates it. The latter can be thought of as a sequence-to-sequence RNN that outputs silence whenever it is unsure what to say next.
To classify videos based on their visual content, one possible architecture is to take, say, one frame per second, pass each frame through a convolutional neural network, feed the CNN outputs into a sequence-to-vector RNN, and finally run its output through a softmax layer to obtain the probabilities for all classes.
For training, use cross entropy as the cost function. To also use audio for classification, convert every second of audio into a spectrogram, feed the spectrogram into a CNN, and feed that CNN's output into the RNN. A sketch of the frame-by-frame architecture follows.
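A minimal TF1-style sketch of this architecture; all shapes, layer sizes, and names (n_frames, frame_cnn, ...) are illustrative assumptions, not part of the original text:
import tensorflow as tf

n_frames, height, width, channels = 30, 64, 64, 3  # assumed video dimensions
n_classes = 10                                     # assumed number of classes

frames = tf.placeholder(tf.float32, [None, n_frames, height, width, channels])

def frame_cnn(image_batch):
    # a tiny CNN applied to each frame: conv -> pool -> flatten
    conv = tf.layers.conv2d(image_batch, filters=16, kernel_size=3, activation=tf.nn.relu)
    pool = tf.layers.max_pooling2d(conv, pool_size=2, strides=2)
    return tf.layers.flatten(pool)

# merge the batch and time axes, run the same CNN on every frame, then split back
flat_frames = tf.reshape(frames, [-1, height, width, channels])
frame_features = frame_cnn(flat_frames)
feature_seq = tf.reshape(frame_features, [-1, n_frames, int(frame_features.shape[-1])])

# sequence-to-vector RNN: keep only the final state, then a softmax over classes
cell = tf.nn.rnn_cell.GRUCell(num_units=128)
_, final_state = tf.nn.dynamic_rnn(cell, feature_seq, dtype=tf.float32)
logits = tf.layers.dense(final_state, n_classes)
class_probas = tf.nn.softmax(logits)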
The simplest way to handle variable-length input sequences is to zero-pad the shorter sequences up to the batch maximum and pass each sequence's true length to the RNN through the sequence_length parameter, as in the sketch below.
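A minimal sketch of this idea; the feature size, cell size, and the two random sequences are illustrative assumptions:
import numpy as np
import tensorflow as tf

n_inputs = 7  # assumed feature size per time step

X = tf.placeholder(tf.float32, [None, None, n_inputs])  # [batch, max_steps, features]
seq_length = tf.placeholder(tf.int32, [None])           # true length of each sequence

cell = tf.nn.rnn_cell.BasicRNNCell(num_units=32)
# past each sequence's true length, dynamic_rnn copies the state through and
# outputs zeros, so the padding does not corrupt the final state
outputs, states = tf.nn.dynamic_rnn(cell, X, dtype=tf.float32,
                                    sequence_length=seq_length)

# two sequences of lengths 3 and 5, zero-padded up to the batch maximum (5)
a = np.random.rand(3, n_inputs)
b = np.random.rand(5, n_inputs)
X_batch = np.zeros((2, 5, n_inputs), dtype=np.float32)
X_batch[0, :3], X_batch[1, :5] = a, b
lengths = np.array([3, 5])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    states_val = sess.run(states, feed_dict={X: X_batch, seq_length: lengths})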
The usual way to handle variable-length output sequences is to define a special end-of-sequence (EOS) token and ignore everything the model outputs after it; this is exactly the role of the <EOS> token in the seq2seq code above.
A common way to distribute the training and execution of a deep RNN across multiple GPUs is to place each layer on a different GPU, as sketched below.
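A minimal sketch of per-layer device placement using tf.contrib.rnn.DeviceWrapper; the device names are assumptions for illustration:
import tensorflow as tf

devices = ["/gpu:0", "/gpu:1", "/gpu:2"]  # assumed device names
cells = [tf.contrib.rnn.DeviceWrapper(tf.nn.rnn_cell.GRUCell(128), dev)
         for dev in devices]
# each layer's ops are pinned to its own GPU; the stacked cell is then
# used with tf.nn.dynamic_rnn like any other cell
multi_layer_cell = tf.nn.rnn_cell.MultiRNNCell(cells)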
from random import choice, seed
# seed the pseudo-random number generators for reproducible output
seed(42)
np.random.seed(42)
default_reber_grammar = [
    [("B", 1)],            # (state 0) =B=> (state 1)
    [("T", 2), ("P", 3)],  # (state 1) =T=> (state 2) or =P=> (state 3)
    [("S", 2), ("X", 4)],  # (state 2) =S=> (state 2) or =X=> (state 4)
    [("T", 3), ("V", 5)],  # and so on...
    [("X", 3), ("S", 6)],
    [("P", 4), ("V", 6)],
    [("E", None)]]         # (state 6) =E=> (terminal state)
embedded_reber_grammar = [
[("B", 1)],
[("T", 2), ("P", 3)],
[(default_reber_grammar, 4)],
[(default_reber_grammar, 5)],
[("T", 6)],
[("P", 6)],
[("E", None)]]
def generate_string(grammar):
    state = 0
    output = []
    while state is not None:
        production, state = choice(grammar[state])
        if isinstance(production, list):
            production = generate_string(grammar=production)
        output.append(production)
    return "".join(output)
for _ in range(25):
    print(generate_string(default_reber_grammar), end=" ")
BTXXTTTTVPXTTTTTVPSE BTXSE BTXXTVPSE BTXXVPSE BTSSXXTTVVE BTXSE BTSSSXSE BPTTTVVE BTXXVVE BPTTVVE BTSXXTTTTVPSE BPTTVVE BPTVPSE BPTTVPXVVE BPVPXTTTVPXTVPSE BTXSE BPTTTTVPXTTTTTTTVPXVVE BPTVVE BTXSE BPTTTVVE BTSXXVPSE BTXXTTTTTVVE BPTTVPSE BPVVE BPTTTVPXVPXTTTTTVPXTTVVE
for _ in range(25):
    print(generate_string(embedded_reber_grammar), end=" ")
BPBPTVVEPE BTBPTVPXVVETE BPBPTTTVVEPE BPBTXSEPE BPBPTTTTTVPSEPE BTBTSXSETE BPBPVPSEPE BPBPVVEPE BPBTXSEPE BPBTSXSEPE BTBPTTVVETE BPBPVVEPE BTBTXSETE BPBPTTVVEPE BTBTSXXVVETE BTBTXXTVPXTVPSETE BTBPTVVETE BPBPVPXTTVPXTVVEPE BTBTXSETE BPBTXSEPE BPBTSXXTVPSEPE BPBPVVEPE BPBPTTTTTTTTTTVPXVVEPE BPBPVVEPE BPBPVVEPE
def generate_corrupted_string(grammar, chars="BEPSTVX"):
    good_string = generate_string(grammar)
    index = np.random.randint(len(good_string))
    good_char = good_string[index]
    bad_char = choice(list(set(chars) - set(good_char)))
    return good_string[:index] + bad_char + good_string[index + 1:]
for _ in range(25):
    print(generate_corrupted_string(embedded_reber_grammar), end=" ")
BPBPVPEEPE BPBSXSEPE BPBPTVVBPE BTBPPPSETE BTBPVVSTE BPBTSSXXTSTVVEPE BPTTSXXTVPSEPE BPBTXSTPE BTBPTTTVPSBTE BPBTSXXTTTXTTVVEPE BPBVXXVPXTVPXTTVVEPE BPBPTTVXEPE BPBPVVEXE BPEPTTVVEPE BPBPVXSEPE BPBTVXXVVEPE BEBPTTTVPXVVETE BPBTSSXTEPE BPBPVXEPE BEBTXSEPE BTBPTVPXVPXVVETS PPBTSXXTVPXVPSEPE BPBTSXXTTTVVSEPE BPBPVPXVVTPE BTBTSVSETE
(Later, each string's actual length will be passed to TensorFlow through the sequence_length parameter.)
def string_to_one_hot_vectors(string, n_steps, chars="BEPSTVX"):
    char_to_index = {char: index for index, char in enumerate(chars)}
    output = np.zeros((n_steps, len(chars)), dtype=np.int32)
    for index, char in enumerate(string):
        output[index, char_to_index[char]] = 1.
    return output
string_to_one_hot_vectors("BTBTXSETE", 12)
array([[1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0], [1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 1, 0, 0, 0], [0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0], [0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0]])
def generate_dataset(size):
    good_strings = [generate_string(embedded_reber_grammar)
                    for _ in range(size // 2)]
    bad_strings = [generate_corrupted_string(embedded_reber_grammar)
                   for _ in range(size - size // 2)]
    all_strings = good_strings + bad_strings
    n_steps = max([len(string) for string in all_strings])
    X = np.array([string_to_one_hot_vectors(string, n_steps)
                  for string in all_strings])
    seq_length = np.array([len(string) for string in all_strings])
    y = np.array([[1] for _ in range(len(good_strings))] +
                 [[0] for _ in range(len(bad_strings))])
    rnd_idx = np.random.permutation(size)
    return X[rnd_idx], seq_length[rnd_idx], y[rnd_idx]
X_train, l_train, y_train = generate_dataset(10000)
X_train[0]
array([[1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0], [1, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0], [0, 0, 0, 0, 1, 0, 0], [0, 0, 0, 0, 0, 1, 0], [0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 1, 0], [0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 1, 0], [0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1], [0, 1, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 1, 0], [0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0], [0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0]])
Because every example is padded to the length of the longest string in the dataset, you can see many all-zero rows.
Now build a sequence classifier, similar to the one built earlier to classify MNIST images.
Note: since this is a binary classification problem, the network below uses a GRU cell and a single output neuron trained with sigmoid cross-entropy.
reset_graph()
possible_chars = "BEPSTVX"
n_inputs = len(possible_chars)
n_neurons = 30
n_outputs = 1
learning_rate = 0.02
momentum = 0.95
X = tf.placeholder(tf.float32, [None, None, n_inputs], name="X")
seq_length = tf.placeholder(tf.int32, [None], name="seq_length")
y = tf.placeholder(tf.float32, [None, 1], name="y")
gru_cell = tf.nn.rnn_cell.GRUCell(num_units=n_neurons)
outputs, states = tf.nn.dynamic_rnn(gru_cell, X, dtype=tf.float32,
sequence_length=seq_length)
logits = tf.layers.dense(states, n_outputs, name="logits")
y_pred = tf.cast(tf.greater(logits, 0.), tf.float32, name="y_pred")
y_proba = tf.nn.sigmoid(logits, name="y_proba")
xentropy = tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits)
loss = tf.reduce_mean(xentropy, name="loss")
optimizer = tf.train.MomentumOptimizer(learning_rate=learning_rate,
momentum=momentum,
use_nesterov=True)
training_op = optimizer.minimize(loss)
correct = tf.equal(y_pred, y, name="correct")
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32), name="accuracy")
init = tf.global_variables_initializer()
saver = tf.train.Saver()
X_val, l_val, y_val = generate_dataset(5000)
n_epochs = 50
batch_size = 50
with tf.Session() as sess:
    init.run()
    for epoch in range(n_epochs):
        X_batches = np.array_split(X_train, len(X_train) // batch_size)
        l_batches = np.array_split(l_train, len(l_train) // batch_size)
        y_batches = np.array_split(y_train, len(y_train) // batch_size)
        for X_batch, l_batch, y_batch in zip(X_batches, l_batches, y_batches):
            loss_val, _ = sess.run(
                [loss, training_op],
                feed_dict={X: X_batch, seq_length: l_batch, y: y_batch})
        acc_train = accuracy.eval(feed_dict={X: X_batch, seq_length: l_batch, y: y_batch})
        acc_val = accuracy.eval(feed_dict={X: X_val, seq_length: l_val, y: y_val})
        print("{:4d} Train loss: {:.4f}, accuracy: {:.2f}% Validation accuracy: {:.2f}%".format(
            epoch, loss_val, 100 * acc_train, 100 * acc_val))
    saver.save(sess, "./my_reber_classifier")
0 Train loss: 0.6844, accuracy: 54.00% Validation accuracy: 58.32% 1 Train loss: 0.6495, accuracy: 62.00% Validation accuracy: 63.44% 2 Train loss: 0.5680, accuracy: 78.00% Validation accuracy: 69.04% 3 Train loss: 0.6128, accuracy: 72.00% Validation accuracy: 65.34% 4 Train loss: 0.4428, accuracy: 88.00% Validation accuracy: 81.08% 5 Train loss: 0.4864, accuracy: 82.00% Validation accuracy: 75.56% 6 Train loss: 0.2839, accuracy: 90.00% Validation accuracy: 82.56% 7 Train loss: 0.2792, accuracy: 88.00% Validation accuracy: 83.32% 8 Train loss: 0.1639, accuracy: 94.00% Validation accuracy: 92.76% 9 Train loss: 0.0233, accuracy: 100.00% Validation accuracy: 96.52% 10 Train loss: 0.0572, accuracy: 100.00% Validation accuracy: 98.14% 11 Train loss: 0.0266, accuracy: 100.00% Validation accuracy: 98.44% 12 Train loss: 0.0207, accuracy: 100.00% Validation accuracy: 99.14% 13 Train loss: 0.1537, accuracy: 96.00% Validation accuracy: 90.56% 14 Train loss: 0.0088, accuracy: 100.00% Validation accuracy: 99.26% 15 Train loss: 0.0019, accuracy: 100.00% Validation accuracy: 100.00% 16 Train loss: 0.0012, accuracy: 100.00% Validation accuracy: 100.00% 17 Train loss: 0.0010, accuracy: 100.00% Validation accuracy: 100.00% 18 Train loss: 0.0008, accuracy: 100.00% Validation accuracy: 100.00% 19 Train loss: 0.0007, accuracy: 100.00% Validation accuracy: 100.00% 20 Train loss: 0.0006, accuracy: 100.00% Validation accuracy: 100.00% 21 Train loss: 0.0005, accuracy: 100.00% Validation accuracy: 100.00% 22 Train loss: 0.0005, accuracy: 100.00% Validation accuracy: 100.00% 23 Train loss: 0.0004, accuracy: 100.00% Validation accuracy: 100.00% 24 Train loss: 0.0004, accuracy: 100.00% Validation accuracy: 100.00% 25 Train loss: 0.0003, accuracy: 100.00% Validation accuracy: 100.00% 26 Train loss: 0.0003, accuracy: 100.00% Validation accuracy: 100.00% 27 Train loss: 0.0003, accuracy: 100.00% Validation accuracy: 100.00% 28 Train loss: 0.0003, accuracy: 100.00% Validation accuracy: 100.00% 29 Train loss: 0.0003, accuracy: 100.00% Validation accuracy: 100.00% 30 Train loss: 0.0003, accuracy: 100.00% Validation accuracy: 100.00% 31 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 32 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 33 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 34 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 35 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 36 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 37 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 38 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 39 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 40 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 41 Train loss: 0.0002, accuracy: 100.00% Validation accuracy: 100.00% 42 Train loss: 0.0001, accuracy: 100.00% Validation accuracy: 100.00% 43 Train loss: 0.0001, accuracy: 100.00% Validation accuracy: 100.00% 44 Train loss: 0.0001, accuracy: 100.00% Validation accuracy: 100.00% 45 Train loss: 0.0001, accuracy: 100.00% Validation accuracy: 100.00% 46 Train loss: 0.0001, accuracy: 100.00% Validation accuracy: 100.00% 47 Train loss: 0.0001, accuracy: 100.00% Validation accuracy: 100.00% 48 Train loss: 0.0001, accuracy: 100.00% Validation accuracy: 100.00% 49 Train loss: 0.0001, accuracy: 100.00% Validation accuracy: 100.00%
Now let's test the RNN on two tricky strings: the first is bad, the second is good. They differ only in their second-to-last character. If the RNN gets this right, it shows that it has picked up the pattern that the second character must always match the second-to-last character.
test_strings = [
"BPBTSSSSSSSXXTTVPXVPXTTTTTVVETE",
"BPBTSSSSSSSXXTTVPXVPXTTTTTVVEPE"]
l_test = np.array([len(s) for s in test_strings])
max_length = l_test.max()
X_test = [string_to_one_hot_vectors(s, n_steps=max_length)
for s in test_strings]
with tf.Session() as sess:
    saver.restore(sess, "./my_reber_classifier")
    y_proba_val = y_proba.eval(feed_dict={X: X_test, seq_length: l_test})
    print()
    print("Estimated probability that these are Reber strings:")
    for index, string in enumerate(test_strings):
        print("{}: {:.2f}%".format(string, 100 * y_proba_val[index][0]))
INFO:tensorflow:Restoring parameters from ./my_reber_classifier Estimated probability that these are Reber strings: BPBTSSSSSSSXXTTVPXVPXTTTTTVVETE: 0.33% BPBTSSSSSSSXXTTVPXVPXTTTTTVVEPE: 99.99%
References