Train on 8399 samples, validate on 2100 samples
Epoch 1/100
8399/8399 [==============================] - 0s - loss: 0.7075 - acc: 0.7378 - val_loss: 0.4806 - val_acc: 0.7724
Epoch 2/100
8399/8399 [==============================] - 0s - loss: 0.5155 - acc: 0.7553 - val_loss: 0.4538 - val_acc: 0.7705
Epoch 3/100
8399/8399 [==============================] - 0s - loss: 0.4946 - acc: 0.7606 - val_loss: 0.4499 - val_acc: 0.7652
Epoch 4/100
8399/8399 [==============================] - 0s - loss: 0.4803 - acc: 0.7626 - val_loss: 0.4456 - val_acc: 0.7633
Epoch 5/100
8399/8399 [==============================] - 0s - loss: 0.4708 - acc: 0.7645 - val_loss: 0.4414 - val_acc: 0.7662
Epoch 6/100
8399/8399 [==============================] - 0s - loss: 0.4602 - acc: 0.7662 - val_loss: 0.4364 - val_acc: 0.7686
Epoch 7/100
8399/8399 [==============================] - 0s - loss: 0.4492 - acc: 0.7720 - val_loss: 0.4274 - val_acc: 0.7748
Epoch 8/100
8399/8399 [==============================] - 0s - loss: 0.4359 - acc: 0.7783 - val_loss: 0.4141 - val_acc: 0.7771
Epoch 9/100
8399/8399 [==============================] - 0s - loss: 0.4200 - acc: 0.7882 - val_loss: 0.3981 - val_acc: 0.7795
Epoch 10/100
8399/8399 [==============================] - 0s - loss: 0.4021 - acc: 0.7993 - val_loss: 0.3834 - val_acc: 0.7848
Epoch 11/100
8399/8399 [==============================] - 0s - loss: 0.3854 - acc: 0.8090 - val_loss: 0.3643 - val_acc: 0.7938
Epoch 12/100
8399/8399 [==============================] - 0s - loss: 0.3688 - acc: 0.8202 - val_loss: 0.3458 - val_acc: 0.8062
Epoch 13/100
8399/8399 [==============================] - 0s - loss: 0.3532 - acc: 0.8306 - val_loss: 0.3287 - val_acc: 0.8200
Epoch 14/100
8399/8399 [==============================] - 0s - loss: 0.3407 - acc: 0.8409 - val_loss: 0.3158 - val_acc: 0.8338
Epoch 15/100
8399/8399 [==============================] - 0s - loss: 0.3300 - acc: 0.8494 - val_loss: 0.3052 - val_acc: 0.8486
Epoch 16/100
8399/8399 [==============================] - 0s - loss: 0.3211 - acc: 0.8566 - val_loss: 0.2946 - val_acc: 0.8595
Epoch 17/100
8399/8399 [==============================] - 0s - loss: 0.3119 - acc: 0.8634 - val_loss: 0.2874 - val_acc: 0.8752
Epoch 18/100
8399/8399 [==============================] - 0s - loss: 0.3056 - acc: 0.8682 - val_loss: 0.2798 - val_acc: 0.8843
Epoch 19/100
8399/8399 [==============================] - 0s - loss: 0.3002 - acc: 0.8721 - val_loss: 0.2744 - val_acc: 0.8919
Epoch 20/100
8399/8399 [==============================] - 0s - loss: 0.2941 - acc: 0.8769 - val_loss: 0.2703 - val_acc: 0.8990
Epoch 21/100
8399/8399 [==============================] - 0s - loss: 0.2896 - acc: 0.8797 - val_loss: 0.2670 - val_acc: 0.9033
Epoch 22/100
8399/8399 [==============================] - 0s - loss: 0.2853 - acc: 0.8830 - val_loss: 0.2642 - val_acc: 0.9062
Epoch 23/100
8399/8399 [==============================] - 0s - loss: 0.2809 - acc: 0.8855 - val_loss: 0.2635 - val_acc: 0.9043
Epoch 24/100
8399/8399 [==============================] - 0s - loss: 0.2772 - acc: 0.8886 - val_loss: 0.2613 - val_acc: 0.9057
Epoch 25/100
8399/8399 [==============================] - 0s - loss: 0.2754 - acc: 0.8886 - val_loss: 0.2626 - val_acc: 0.9043
Epoch 26/100
8399/8399 [==============================] - 0s - loss: 0.2701 - acc: 0.8922 - val_loss: 0.2597 - val_acc: 0.9048
Epoch 27/100
8399/8399 [==============================] - 0s - loss: 0.2663 - acc: 0.8938 - val_loss: 0.2588 - val_acc: 0.9067
Epoch 28/100
8399/8399 [==============================] - 0s - loss: 0.2642 - acc: 0.8944 - val_loss: 0.2603 - val_acc: 0.9081
Epoch 29/100
8399/8399 [==============================] - 0s - loss: 0.2601 - acc: 0.8980 - val_loss: 0.2598 - val_acc: 0.9062
Epoch 30/100
8399/8399 [==============================] - 0s - loss: 0.2580 - acc: 0.9000 - val_loss: 0.2585 - val_acc: 0.9095
Epoch 31/100
8399/8399 [==============================] - 0s - loss: 0.2552 - acc: 0.9018 - val_loss: 0.2572 - val_acc: 0.9119
Epoch 32/100
8399/8399 [==============================] - 0s - loss: 0.2537 - acc: 0.9024 - val_loss: 0.2583 - val_acc: 0.9105
Epoch 33/100
8399/8399 [==============================] - 0s - loss: 0.2512 - acc: 0.9043 - val_loss: 0.2565 - val_acc: 0.9105
Epoch 34/100
8399/8399 [==============================] - 0s - loss: 0.2499 - acc: 0.9048 - val_loss: 0.2566 - val_acc: 0.9114
Epoch 35/100
8399/8399 [==============================] - 0s - loss: 0.2487 - acc: 0.9056 - val_loss: 0.2577 - val_acc: 0.9100
Epoch 36/100
8399/8399 [==============================] - 0s - loss: 0.2482 - acc: 0.9056 - val_loss: 0.2655 - val_acc: 0.9019
Epoch 37/100
8399/8399 [==============================] - 0s - loss: 0.2459 - acc: 0.9070 - val_loss: 0.2594 - val_acc: 0.9081
Epoch 38/100
8399/8399 [==============================] - 0s - loss: 0.2458 - acc: 0.9071 - val_loss: 0.2709 - val_acc: 0.8986
Epoch 39/100
8399/8399 [==============================] - 0s - loss: 0.2461 - acc: 0.9071 - val_loss: 0.2703 - val_acc: 0.8986
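Although fit was configured for 100 epochs, the log stops after epoch 39: training loss is still creeping down, but validation loss bottoms out around 0.2565 near epoch 33 and then drifts upward. This pattern is consistent with an EarlyStopping callback monitoring val_loss, although the callback configuration itself is not shown in the output.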
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_3 (Dense)              (None, 100)               2100
_________________________________________________________________
dense_4 (Dense)              (None, 2)                 202
=================================================================
Total params: 2,302
Trainable params: 2,302
Non-trainable params: 0
_________________________________________________________________
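The parameter counts follow directly from the layer shapes: the first Dense layer maps 20 input features to 100 units (20 × 100 weights + 100 biases = 2,100), and the second maps those 100 units to 2 outputs (100 × 2 + 2 = 202), giving 2,302 trainable parameters in total. (The dense_3/dense_4 names only mean that other layers were created earlier in the same session.) Below is a minimal sketch of a model that would reproduce this summary and a training log like the one above; the 20-dimensional input, the relu/softmax activations, the categorical cross-entropy loss, the Adam optimizer, the batch size, and the early-stopping settings are assumptions, since none of them appear in the output.

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import EarlyStopping

# Hypothetical stand-in data: 20 input features, 2-class one-hot labels.
# 10,499 samples with validation_split=0.2 yields the 8,399/2,100 split seen above.
X_train = np.random.rand(10499, 20)
y_train = np.eye(2)[np.random.randint(0, 2, 10499)]

model = Sequential()
model.add(Dense(100, input_dim=20, activation='relu'))  # 20*100 + 100 = 2,100 params
model.add(Dense(2, activation='softmax'))                # 100*2 + 2 = 202 params
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()

# EarlyStopping is an assumption made to explain training halting at epoch 39 of 100.
history = model.fit(X_train, y_train,
                    epochs=100,
                    batch_size=32,
                    validation_split=0.2,
                    callbacks=[EarlyStopping(monitor='val_loss', patience=5)],
                    verbose=1)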
  32/4500 [..............................] - ETA: 0s
[0.24917428104082742, 0.90933333333333333]
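The last line is the result of evaluating the trained model on a separate test set of 4,500 samples: model.evaluate returns [loss, accuracy], here roughly 0.249 and 0.909, in line with the validation figures above. A minimal continuation of the sketch, with X_test and y_test as hypothetical placeholders for the held-out data:

# Hypothetical held-out data shaped like the 4,500-sample test set in the log;
# this continues from the model defined in the sketch above.
X_test = np.random.rand(4500, 20)
y_test = np.eye(2)[np.random.randint(0, 2, 4500)]

score = model.evaluate(X_test, y_test, batch_size=32)  # returns [loss, accuracy]
print(score)  # the real run printed [0.24917..., 0.90933...]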