Neural Network

Classifying IRIS with TensorFlow and Keras

Preparation

Loading the libraries and data

In [0]:
import tensorflow as tf

import keras
from keras.layers import Dense, Activation

from sklearn import datasets
iris = datasets.load_iris()
In [0]:
iris.data
In [0]:
iris.target
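
The four input columns and three target classes used below can be confirmed directly from the dataset. A small supplementary check (not part of the original notebook):

# Feature and class names bundled with the scikit-learn iris dataset
print(iris.feature_names)  # ['sepal length (cm)', 'sepal width (cm)', 'petal length (cm)', 'petal width (cm)']
print(iris.target_names)   # ['setosa' 'versicolor' 'virginica']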

Splitting the data

80% for training, 20% for testing

In [0]:
from sklearn.model_selection import train_test_split as split
X_train, X_test, y_train, y_test = split(iris.data, iris.target, test_size=0.2)
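
As a quick sanity check (a supplementary sketch, not in the original notebook), the resulting shapes confirm the 120/30 split that the training log below refers to:

# With test_size=0.2 on 150 samples, expect 120 training rows and 30 test rows
print(X_train.shape, X_test.shape)  # expected: (120, 4) (30, 4)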

Building the model

Input layer, 4 units: sepal_length, sepal_width, petal_length, petal_width
Hidden layer, 32 units
Output layer, 3 units: setosa, versicolor, virginica

In [0]:
model = keras.models.Sequential()
model.add(Dense(units=32, input_shape=(4,), activation='relu'))
model.add(Dense(units=3, activation='softmax'))
model.compile(loss='sparse_categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])

Training

epochs is the number of training iterations; here, training on the 120 samples
(80% of the 150 total) is repeated 100 times.

In [28]:
model.fit(X_train, y_train, epochs=100)
Epoch 1/100
120/120 [==============================] - 0s 2ms/step - loss: 1.0028 - acc: 0.5750
Epoch 2/100
120/120 [==============================] - 0s 182us/step - loss: 0.8790 - acc: 0.6667
Epoch 3/100
120/120 [==============================] - 0s 162us/step - loss: 0.8457 - acc: 0.6917
Epoch 4/100
120/120 [==============================] - 0s 183us/step - loss: 0.8195 - acc: 0.7750
Epoch 5/100
120/120 [==============================] - 0s 167us/step - loss: 0.7984 - acc: 0.7250
Epoch 6/100
120/120 [==============================] - 0s 163us/step - loss: 0.7781 - acc: 0.6667
Epoch 7/100
120/120 [==============================] - 0s 173us/step - loss: 0.7562 - acc: 0.7083
Epoch 8/100
120/120 [==============================] - 0s 164us/step - loss: 0.7373 - acc: 0.6750
Epoch 9/100
120/120 [==============================] - 0s 167us/step - loss: 0.7211 - acc: 0.7333
Epoch 10/100
120/120 [==============================] - 0s 140us/step - loss: 0.7055 - acc: 0.7333
Epoch 11/100
120/120 [==============================] - 0s 170us/step - loss: 0.6899 - acc: 0.7250
Epoch 12/100
120/120 [==============================] - 0s 189us/step - loss: 0.6776 - acc: 0.8083
Epoch 13/100
120/120 [==============================] - 0s 152us/step - loss: 0.6642 - acc: 0.7583
Epoch 14/100
120/120 [==============================] - 0s 155us/step - loss: 0.6676 - acc: 0.7417
Epoch 15/100
120/120 [==============================] - 0s 143us/step - loss: 0.6374 - acc: 0.8000
Epoch 16/100
120/120 [==============================] - 0s 155us/step - loss: 0.6248 - acc: 0.7917
Epoch 17/100
120/120 [==============================] - 0s 152us/step - loss: 0.6159 - acc: 0.8583
Epoch 18/100
120/120 [==============================] - 0s 171us/step - loss: 0.6041 - acc: 0.8250
Epoch 19/100
120/120 [==============================] - 0s 140us/step - loss: 0.5960 - acc: 0.8250
Epoch 20/100
120/120 [==============================] - 0s 157us/step - loss: 0.5854 - acc: 0.7583
Epoch 21/100
120/120 [==============================] - 0s 160us/step - loss: 0.5761 - acc: 0.8583
Epoch 22/100
120/120 [==============================] - 0s 147us/step - loss: 0.5681 - acc: 0.8667
Epoch 23/100
120/120 [==============================] - 0s 155us/step - loss: 0.5618 - acc: 0.8917
Epoch 24/100
120/120 [==============================] - 0s 147us/step - loss: 0.5542 - acc: 0.8750
Epoch 25/100
120/120 [==============================] - 0s 173us/step - loss: 0.5480 - acc: 0.8583
Epoch 26/100
120/120 [==============================] - 0s 176us/step - loss: 0.5402 - acc: 0.8917
Epoch 27/100
120/120 [==============================] - 0s 138us/step - loss: 0.5371 - acc: 0.8500
Epoch 28/100
120/120 [==============================] - 0s 138us/step - loss: 0.5319 - acc: 0.7833
Epoch 29/100
120/120 [==============================] - 0s 178us/step - loss: 0.5198 - acc: 0.8917
Epoch 30/100
120/120 [==============================] - 0s 164us/step - loss: 0.5229 - acc: 0.8167
Epoch 31/100
120/120 [==============================] - 0s 170us/step - loss: 0.5144 - acc: 0.8250
Epoch 32/100
120/120 [==============================] - 0s 160us/step - loss: 0.5059 - acc: 0.8667
Epoch 33/100
120/120 [==============================] - 0s 164us/step - loss: 0.4979 - acc: 0.8833
Epoch 34/100
120/120 [==============================] - 0s 152us/step - loss: 0.4913 - acc: 0.8333
Epoch 35/100
120/120 [==============================] - 0s 149us/step - loss: 0.4796 - acc: 0.9667
Epoch 36/100
120/120 [==============================] - 0s 152us/step - loss: 0.4759 - acc: 0.9167
Epoch 37/100
120/120 [==============================] - 0s 150us/step - loss: 0.4704 - acc: 0.9000
Epoch 38/100
120/120 [==============================] - 0s 174us/step - loss: 0.4652 - acc: 0.9000
Epoch 39/100
120/120 [==============================] - 0s 177us/step - loss: 0.4623 - acc: 0.8667
Epoch 40/100
120/120 [==============================] - 0s 175us/step - loss: 0.4553 - acc: 0.9583
Epoch 41/100
120/120 [==============================] - 0s 169us/step - loss: 0.4583 - acc: 0.8333
Epoch 42/100
120/120 [==============================] - 0s 164us/step - loss: 0.4461 - acc: 0.9500
Epoch 43/100
120/120 [==============================] - 0s 152us/step - loss: 0.4433 - acc: 0.8917
Epoch 44/100
120/120 [==============================] - 0s 158us/step - loss: 0.4518 - acc: 0.8833
Epoch 45/100
120/120 [==============================] - 0s 182us/step - loss: 0.4373 - acc: 0.9417
Epoch 46/100
120/120 [==============================] - 0s 208us/step - loss: 0.4649 - acc: 0.8417
Epoch 47/100
120/120 [==============================] - 0s 204us/step - loss: 0.4298 - acc: 0.9250
Epoch 48/100
120/120 [==============================] - 0s 179us/step - loss: 0.4279 - acc: 0.8917
Epoch 49/100
120/120 [==============================] - 0s 159us/step - loss: 0.4222 - acc: 0.9667
Epoch 50/100
120/120 [==============================] - 0s 205us/step - loss: 0.4193 - acc: 0.9000
Epoch 51/100
120/120 [==============================] - 0s 158us/step - loss: 0.4172 - acc: 0.9250
Epoch 52/100
120/120 [==============================] - 0s 129us/step - loss: 0.4130 - acc: 0.9667
Epoch 53/100
120/120 [==============================] - 0s 140us/step - loss: 0.4093 - acc: 0.9750
Epoch 54/100
120/120 [==============================] - 0s 197us/step - loss: 0.4090 - acc: 0.9000
Epoch 55/100
120/120 [==============================] - 0s 196us/step - loss: 0.4057 - acc: 0.9333
Epoch 56/100
120/120 [==============================] - 0s 161us/step - loss: 0.4075 - acc: 0.9167
Epoch 57/100
120/120 [==============================] - 0s 142us/step - loss: 0.3999 - acc: 0.9667
Epoch 58/100
120/120 [==============================] - 0s 158us/step - loss: 0.3953 - acc: 0.9083
Epoch 59/100
120/120 [==============================] - 0s 162us/step - loss: 0.3988 - acc: 0.9583
Epoch 60/100
120/120 [==============================] - 0s 153us/step - loss: 0.3950 - acc: 0.9417
Epoch 61/100
120/120 [==============================] - 0s 186us/step - loss: 0.3884 - acc: 0.9167
Epoch 62/100
120/120 [==============================] - 0s 132us/step - loss: 0.3916 - acc: 0.9500
Epoch 63/100
120/120 [==============================] - 0s 153us/step - loss: 0.3835 - acc: 0.9750
Epoch 64/100
120/120 [==============================] - 0s 169us/step - loss: 0.3863 - acc: 0.9583
Epoch 65/100
120/120 [==============================] - 0s 163us/step - loss: 0.3853 - acc: 0.9333
Epoch 66/100
120/120 [==============================] - 0s 156us/step - loss: 0.3746 - acc: 0.9500
Epoch 67/100
120/120 [==============================] - 0s 134us/step - loss: 0.3718 - acc: 0.9667
Epoch 68/100
120/120 [==============================] - 0s 154us/step - loss: 0.3745 - acc: 0.9167
Epoch 69/100
120/120 [==============================] - 0s 156us/step - loss: 0.3713 - acc: 0.9583
Epoch 70/100
120/120 [==============================] - 0s 148us/step - loss: 0.3817 - acc: 0.9083
Epoch 71/100
120/120 [==============================] - 0s 141us/step - loss: 0.3678 - acc: 0.9333
Epoch 72/100
120/120 [==============================] - 0s 149us/step - loss: 0.3627 - acc: 0.9333
Epoch 73/100
120/120 [==============================] - 0s 165us/step - loss: 0.3628 - acc: 0.9583
Epoch 74/100
120/120 [==============================] - 0s 139us/step - loss: 0.3605 - acc: 0.9583
Epoch 75/100
120/120 [==============================] - 0s 167us/step - loss: 0.3560 - acc: 0.9583
Epoch 76/100
120/120 [==============================] - 0s 145us/step - loss: 0.3590 - acc: 0.9583
Epoch 77/100
120/120 [==============================] - 0s 162us/step - loss: 0.3517 - acc: 0.9667
Epoch 78/100
120/120 [==============================] - 0s 135us/step - loss: 0.3494 - acc: 0.9500
Epoch 79/100
120/120 [==============================] - 0s 144us/step - loss: 0.3497 - acc: 0.9583
Epoch 80/100
120/120 [==============================] - 0s 146us/step - loss: 0.3454 - acc: 0.9750
Epoch 81/100
120/120 [==============================] - 0s 164us/step - loss: 0.3438 - acc: 0.9583
Epoch 82/100
120/120 [==============================] - 0s 178us/step - loss: 0.3462 - acc: 0.9500
Epoch 83/100
120/120 [==============================] - 0s 178us/step - loss: 0.3425 - acc: 0.9250
Epoch 84/100
120/120 [==============================] - 0s 227us/step - loss: 0.3414 - acc: 0.9417
Epoch 85/100
120/120 [==============================] - 0s 209us/step - loss: 0.3379 - acc: 0.9667
Epoch 86/100
120/120 [==============================] - 0s 150us/step - loss: 0.3363 - acc: 0.9333
Epoch 87/100
120/120 [==============================] - 0s 178us/step - loss: 0.3321 - acc: 0.9500
Epoch 88/100
120/120 [==============================] - 0s 197us/step - loss: 0.3363 - acc: 0.9750
Epoch 89/100
120/120 [==============================] - 0s 169us/step - loss: 0.3277 - acc: 0.9583
Epoch 90/100
120/120 [==============================] - 0s 172us/step - loss: 0.3289 - acc: 0.9667
Epoch 91/100
120/120 [==============================] - 0s 159us/step - loss: 0.3263 - acc: 0.9667
Epoch 92/100
120/120 [==============================] - 0s 174us/step - loss: 0.3240 - acc: 0.9667
Epoch 93/100
120/120 [==============================] - 0s 155us/step - loss: 0.3214 - acc: 0.9667
Epoch 94/100
120/120 [==============================] - 0s 174us/step - loss: 0.3218 - acc: 0.9583
Epoch 95/100
120/120 [==============================] - 0s 196us/step - loss: 0.3228 - acc: 0.9583
Epoch 96/100
120/120 [==============================] - 0s 172us/step - loss: 0.3194 - acc: 0.9500
Epoch 97/100
120/120 [==============================] - 0s 183us/step - loss: 0.3137 - acc: 0.9667
Epoch 98/100
120/120 [==============================] - 0s 183us/step - loss: 0.3137 - acc: 0.9750
Epoch 99/100
120/120 [==============================] - 0s 177us/step - loss: 0.3110 - acc: 0.9417
Epoch 100/100
120/120 [==============================] - 0s 178us/step - loss: 0.3105 - acc: 0.9750
Out[28]:
<keras.callbacks.History at 0x7f4697555a20>
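
model.fit returns the History object shown above. A minimal sketch of how the per-epoch loss and accuracy could be plotted, assuming matplotlib is available and training is re-run with the return value captured; note that older Keras versions log accuracy under the key 'acc' while newer ones use 'accuracy':

import matplotlib.pyplot as plt

# Re-run training, this time keeping the returned History object
history = model.fit(X_train, y_train, epochs=100)

# The accuracy key name differs across Keras versions
acc_key = 'acc' if 'acc' in history.history else 'accuracy'

plt.plot(history.history['loss'], label='loss')
plt.plot(history.history[acc_key], label='accuracy')
plt.xlabel('epoch')
plt.legend()
plt.show()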

Evaluating the model with the test data

As shown below, the model achieves an accuracy of 96.7% on the test data.

In [29]:
score = model.evaluate(X_test, y_test, batch_size = 1)
print("正解率(accuracy)=", score[1])
30/30 [==============================] - 0s 4ms/step
accuracy = 0.9666666666666667
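
The same figure can be reproduced by hand (a supplementary sketch, not in the original notebook): take the argmax of the predicted class probabilities for each test sample and compare against y_test.

import numpy as np

# Predicted class = index of the largest softmax output per test sample
y_pred = np.argmax(model.predict(X_test), axis=1)
print("accuracy =", np.mean(y_pred == y_test))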

Classifying arbitrary data

As an example, we set the four inputs to 6.4, 2.7, 5.3, 1.9 and check whether the model classifies them correctly.
Incidentally, this is data from the virginica species.

In [17]:
import numpy as np
x = np.array([[6.4, 2.7, 5.3, 1.9]])
r = model.predict(x)

r
Out[17]:
array([[0.00543651, 0.2909139 , 0.7036496 ]], dtype=float32)

Results

The three output values mean, respectively:

probability of being setosa: 0.5%
probability of being versicolor: 29.1%
probability of being virginica: 70.4%

The model correctly identifies this as the virginica species.
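
To turn the probability vector into a species name automatically, one possible sketch using iris.target_names from the dataset loaded earlier:

# The index of the highest probability maps to the species name
predicted_class = np.argmax(r, axis=1)[0]
print(iris.target_names[predicted_class])  # expected: virginica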