In this project, we will examine the data and build a deep neural network that classifies glass into types based on its chemical composition and refractive index.
!wget https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/glass.csv
--2021-07-14 17:24:00--  https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/glass.csv
Resolving cainvas-static.s3.amazonaws.com (cainvas-static.s3.amazonaws.com)... 52.219.62.32
Connecting to cainvas-static.s3.amazonaws.com (cainvas-static.s3.amazonaws.com)|52.219.62.32|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 10053 (9.8K) [text/csv]
Saving to: ‘glass.csv’

glass.csv           100%[===================>]   9.82K  --.-KB/s    in 0s

2021-07-14 17:24:00 (180 MB/s) - ‘glass.csv’ saved [10053/10053]
Importing the python libraries and packages¶
import pandas as pd
import numpy as np
from sklearn.preprocessing import MinMaxScaler
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Dropout, BatchNormalization
from tensorflow.keras.optimizers import Adam
from sklearn.metrics import confusion_matrix,classification_report
from tensorflow.keras.utils import to_categorical, normalize
Reading the CSV file of the dataset¶
Pandas' read_csv() function imports a CSV file (in our case, ‘glass.csv’) into a DataFrame.
df=pd.read_csv('glass.csv')
Exploring the Data¶
After importing the data, let's learn more about the dataset.
df.head()
RI | Na | Mg | Al | Si | K | Ca | Ba | Fe | Type | |
---|---|---|---|---|---|---|---|---|---|---|
0 | 1.52101 | 13.64 | 4.49 | 1.10 | 71.78 | 0.06 | 8.75 | 0.0 | 0.0 | 1 |
1 | 1.51761 | 13.89 | 3.60 | 1.36 | 72.73 | 0.48 | 7.83 | 0.0 | 0.0 | 1 |
2 | 1.51618 | 13.53 | 3.55 | 1.54 | 72.99 | 0.39 | 7.78 | 0.0 | 0.0 | 1 |
3 | 1.51766 | 13.21 | 3.69 | 1.29 | 72.61 | 0.57 | 8.22 | 0.0 | 0.0 | 1 |
4 | 1.51742 | 13.27 | 3.62 | 1.24 | 73.08 | 0.55 | 8.07 | 0.0 | 0.0 | 1 |
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 214 entries, 0 to 213
Data columns (total 10 columns):
 #   Column  Non-Null Count  Dtype
---  ------  --------------  -----
 0   RI      214 non-null    float64
 1   Na      214 non-null    float64
 2   Mg      214 non-null    float64
 3   Al      214 non-null    float64
 4   Si      214 non-null    float64
 5   K       214 non-null    float64
 6   Ca      214 non-null    float64
 7   Ba      214 non-null    float64
 8   Fe      214 non-null    float64
 9   Type    214 non-null    int64
dtypes: float64(9), int64(1)
memory usage: 16.8 KB
df.describe()
RI | Na | Mg | Al | Si | K | Ca | Ba | Fe | Type | |
---|---|---|---|---|---|---|---|---|---|---|
count | 214.000000 | 214.000000 | 214.000000 | 214.000000 | 214.000000 | 214.000000 | 214.000000 | 214.000000 | 214.000000 | 214.000000 |
mean | 1.518365 | 13.407850 | 2.684533 | 1.444907 | 72.650935 | 0.497056 | 8.956963 | 0.175047 | 0.057009 | 2.780374 |
std | 0.003037 | 0.816604 | 1.442408 | 0.499270 | 0.774546 | 0.652192 | 1.423153 | 0.497219 | 0.097439 | 2.103739 |
min | 1.511150 | 10.730000 | 0.000000 | 0.290000 | 69.810000 | 0.000000 | 5.430000 | 0.000000 | 0.000000 | 1.000000 |
25% | 1.516523 | 12.907500 | 2.115000 | 1.190000 | 72.280000 | 0.122500 | 8.240000 | 0.000000 | 0.000000 | 1.000000 |
50% | 1.517680 | 13.300000 | 3.480000 | 1.360000 | 72.790000 | 0.555000 | 8.600000 | 0.000000 | 0.000000 | 2.000000 |
75% | 1.519157 | 13.825000 | 3.600000 | 1.630000 | 73.087500 | 0.610000 | 9.172500 | 0.000000 | 0.100000 | 3.000000 |
max | 1.533930 | 17.380000 | 4.490000 | 3.500000 | 75.410000 | 6.210000 | 16.190000 | 3.150000 | 0.510000 | 7.000000 |
Null check¶
df.isna().sum()
RI      0
Na      0
Mg      0
Al      0
Si      0
K       0
Ca      0
Ba      0
Fe      0
Type    0
dtype: int64
There are no null values in the dataset.
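Had the dataset contained missing values, we would have needed to drop or impute them before training. A minimal sketch on a hypothetical frame (not the glass data) showing both options:

```python
import numpy as np
import pandas as pd

# Hypothetical frame with one missing value, to illustrate the choices
# we would have had if glass.csv contained nulls.
df_demo = pd.DataFrame({"RI": [1.52, np.nan, 1.51], "Na": [13.6, 13.9, 13.5]})

dropped = df_demo.dropna()                  # discard incomplete rows
filled = df_demo.fillna(df_demo.median())   # or impute with the column median

print(dropped.shape)                # (2, 2)
print(filled.isna().sum().sum())    # 0
```

Dropping rows is safest when nulls are rare; imputation preserves sample size at the cost of injecting estimated values.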
Duplicate Check¶
df[df.duplicated()]
RI | Na | Mg | Al | Si | K | Ca | Ba | Fe | Type | |
---|---|---|---|---|---|---|---|---|---|---|
39 | 1.52213 | 14.21 | 3.82 | 0.47 | 71.77 | 0.11 | 9.57 | 0.0 | 0.0 | 1 |
Dropping Duplicates¶
There are multiple ways to deal with duplicate records; here we keep the last occurrence of each duplicated row and drop the copies that appeared earlier in the dataset.
df.drop_duplicates(keep='last',inplace=True)
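To see what `keep='last'` does, here is a tiny illustration on a hypothetical frame with one duplicated row, mirroring the single duplicate found above:

```python
import pandas as pd

# Rows 0 and 2 are identical; keep='last' retains the later occurrence.
demo = pd.DataFrame({"a": [1, 2, 1], "b": [10, 20, 10]})
deduped = demo.drop_duplicates(keep='last')
print(deduped.index.tolist())  # [1, 2] -- the first copy (index 0) is dropped
```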
Correlation Heatmap¶
The heatmap shows the pairwise correlations among the features.
corr_mat=df.corr()
plt.figure(figsize=(16,10))
sns.heatmap(corr_mat,annot=True,fmt='.2f',alpha = 0.7, cmap= 'coolwarm')
plt.show()
Classes Distribution¶
plt.figure(figsize=(10,10))
sns.countplot(x='Type', data=df, order=df['Type'].value_counts().index)
<AxesSubplot:xlabel='Type', ylabel='count'>
The distribution shows that the data is imbalanced.
Data Manipulation¶
Normalization is performed to bring all the features onto the same scale. With features on a common scale, the model does not implicitly favour one feature over another simply because of its magnitude.
X=df.drop('Type',axis=1)
X=normalize(X)
y=df['Type']
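`tf.keras.utils.normalize` defaults to L2 normalization along the last axis, i.e. each sample (row) is scaled to unit length. A plain numpy sketch of the same operation, on made-up values:

```python
import numpy as np

# Row-wise L2 normalization, as keras' normalize() does by default.
X_demo = np.array([[3.0, 4.0], [6.0, 8.0]])
X_unit = X_demo / np.linalg.norm(X_demo, axis=1, keepdims=True)

print(X_unit)                           # both rows become [0.6, 0.8]
print(np.linalg.norm(X_unit, axis=1))   # [1. 1.]
```

Note this scales per sample, unlike MinMaxScaler or StandardScaler, which scale per feature across the whole dataset.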
Class Balancing¶
As the class distribution above shows, the classes are imbalanced. A model trained on an imbalanced dataset tends to be biased toward the classes containing most of the samples, so balancing the classes helps in developing a fairer model.
from imblearn.over_sampling import RandomOverSampler
ros = RandomOverSampler(random_state=42)
x_ros, y_ros = ros.fit_resample(X, y)
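RandomOverSampler randomly duplicates minority-class samples until every class matches the majority count. The idea can be sketched in plain numpy on toy, hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Imbalanced toy labels: class 0 has 5 samples, class 1 has 2.
y_demo = np.array([0, 0, 0, 0, 0, 1, 1])
X_demo = np.arange(len(y_demo)).reshape(-1, 1)

# Resample each class with replacement up to the majority count.
counts = {c: int(np.sum(y_demo == c)) for c in np.unique(y_demo)}
target = max(counts.values())
idx = []
for c in counts:
    members = np.flatnonzero(y_demo == c)
    idx.extend(members)
    idx.extend(rng.choice(members, size=target - len(members), replace=True))

X_bal, y_bal = X_demo[idx], y_demo[idx]
print(np.bincount(y_bal))  # [5 5]
```

Because samples are duplicated verbatim, oversampling must happen with care around the train/test split, or duplicated rows can leak between the two sets.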
Preparing the Data¶
X_train, X_test, y_train, y_test = train_test_split(x_ros,y_ros,test_size=0.2,random_state=42)
y_train=to_categorical(y_train)
y_test=to_categorical(y_test)
print('X_train :',X_train.shape)
print('y_train :',y_train.shape)
print('X_test :',X_test.shape)
print('y_test :',y_test.shape)
X_train : (364, 9)
y_train : (364, 8)
X_test : (92, 9)
y_test : (92, 8)
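The labels end up with 8 columns even though there are only 6 glass types present (types 1–7, with type 4 absent in this dataset), because `to_categorical` one-hot encodes integer labels into columns 0 through the maximum label. A numpy sketch of the same encoding:

```python
import numpy as np

# One-hot encode labels 0..max(label), as to_categorical does.
# Max label 7 -> 8 columns, matching y_train's shape and the
# 8-unit softmax output layer below.
labels = np.array([1, 2, 3, 5, 6, 7])
one_hot = np.eye(labels.max() + 1)[labels]
print(one_hot.shape)  # (6, 8)
```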
Building the model¶
We create a Sequential model and add layers one at a time until we are happy with our network architecture.
model = tf.keras.models.Sequential([
tf.keras.layers.Dense(256, input_shape=(9,), activation='relu'),
tf.keras.layers.BatchNormalization(),
tf.keras.layers.Dropout(0.3),
tf.keras.layers.Dense(256, activation='relu'),
tf.keras.layers.BatchNormalization(),
tf.keras.layers.Dropout(0.3),
tf.keras.layers.Dense(512, activation='relu'),
tf.keras.layers.BatchNormalization(),
tf.keras.layers.Dropout(0.5),
tf.keras.layers.Dense(8, activation='softmax')
])
Compile the Model¶
Now that the model is defined, we can compile it.
model.compile(loss='categorical_crossentropy',
optimizer=Adam(0.0001),
metrics=['acc'])
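`categorical_crossentropy` expects one-hot targets (hence `to_categorical` above) and penalizes the model by the negative log-probability it assigned to the true class. A quick numpy check with made-up numbers:

```python
import numpy as np

# Cross-entropy between a one-hot target and a softmax output:
# loss = -sum(y_true * log(y_pred)), i.e. -log(prob of true class).
y_true = np.array([0.0, 1.0, 0.0])
y_pred = np.array([0.1, 0.7, 0.2])
loss = -np.sum(y_true * np.log(y_pred))
print(round(loss, 4))  # 0.3567, i.e. -log(0.7)
```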
Model Summary¶
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense (Dense)                (None, 256)               2560
_________________________________________________________________
batch_normalization (BatchNo (None, 256)               1024
_________________________________________________________________
dropout (Dropout)            (None, 256)               0
_________________________________________________________________
dense_1 (Dense)              (None, 256)               65792
_________________________________________________________________
batch_normalization_1 (Batch (None, 256)               1024
_________________________________________________________________
dropout_1 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_2 (Dense)              (None, 512)               131584
_________________________________________________________________
batch_normalization_2 (Batch (None, 512)               2048
_________________________________________________________________
dropout_2 (Dropout)          (None, 512)               0
_________________________________________________________________
dense_3 (Dense)              (None, 8)                 4104
=================================================================
Total params: 208,136
Trainable params: 206,088
Non-trainable params: 2,048
_________________________________________________________________
Fitting the Model¶
With the model defined and compiled, we train it for 400 epochs, validating on the test split after each epoch.
history = model.fit(X_train, y_train,
epochs=400,
validation_data=(X_test, y_test),
verbose=1,
)
Epoch 1/400 12/12 [==============================] - 0s 16ms/step - loss: 3.3028 - acc: 0.1429 - val_loss: 2.0722 - val_acc: 0.1848 Epoch 2/400 12/12 [==============================] - 0s 4ms/step - loss: 2.5345 - acc: 0.2500 - val_loss: 2.0587 - val_acc: 0.1848 Epoch 3/400 12/12 [==============================] - 0s 3ms/step - loss: 2.1647 - acc: 0.3269 - val_loss: 2.0453 - val_acc: 0.1848 Epoch 4/400 12/12 [==============================] - 0s 3ms/step - loss: 2.0455 - acc: 0.3626 - val_loss: 2.0335 - val_acc: 0.1848 Epoch 5/400 12/12 [==============================] - 0s 3ms/step - loss: 1.8094 - acc: 0.4478 - val_loss: 2.0225 - val_acc: 0.1848 Epoch 6/400 12/12 [==============================] - 0s 4ms/step - loss: 1.6704 - acc: 0.4533 - val_loss: 2.0131 - val_acc: 0.1848 Epoch 7/400 12/12 [==============================] - 0s 3ms/step - loss: 1.5667 - acc: 0.4808 - val_loss: 2.0046 - val_acc: 0.1848 Epoch 8/400 12/12 [==============================] - 0s 3ms/step - loss: 1.5318 - acc: 0.4945 - val_loss: 1.9951 - val_acc: 0.1848 Epoch 9/400 12/12 [==============================] - 0s 3ms/step - loss: 1.3878 - acc: 0.5495 - val_loss: 1.9867 - val_acc: 0.1848 Epoch 10/400 12/12 [==============================] - 0s 3ms/step - loss: 1.3815 - acc: 0.5220 - val_loss: 1.9791 - val_acc: 0.1848 Epoch 11/400 12/12 [==============================] - 0s 4ms/step - loss: 1.4113 - acc: 0.5467 - val_loss: 1.9717 - val_acc: 0.1848 Epoch 12/400 12/12 [==============================] - 0s 4ms/step - loss: 1.2949 - acc: 0.5467 - val_loss: 1.9628 - val_acc: 0.1848 Epoch 13/400 12/12 [==============================] - 0s 3ms/step - loss: 1.2791 - acc: 0.5659 - val_loss: 1.9537 - val_acc: 0.1848 Epoch 14/400 12/12 [==============================] - 0s 3ms/step - loss: 1.2925 - acc: 0.5907 - val_loss: 1.9434 - val_acc: 0.1848 Epoch 15/400 12/12 [==============================] - 0s 3ms/step - loss: 1.3415 - acc: 0.5440 - val_loss: 1.9350 - val_acc: 0.1848 Epoch 16/400 12/12 
[==============================] - 0s 3ms/step - loss: 1.3280 - acc: 0.5577 - val_loss: 1.9265 - val_acc: 0.1848 Epoch 17/400 12/12 [==============================] - 0s 4ms/step - loss: 1.2673 - acc: 0.5742 - val_loss: 1.9165 - val_acc: 0.1848 Epoch 18/400 12/12 [==============================] - 0s 3ms/step - loss: 1.2576 - acc: 0.5797 - val_loss: 1.9058 - val_acc: 0.1848 Epoch 19/400 12/12 [==============================] - 0s 4ms/step - loss: 1.1717 - acc: 0.6126 - val_loss: 1.8986 - val_acc: 0.2500 Epoch 20/400 12/12 [==============================] - 0s 3ms/step - loss: 1.1273 - acc: 0.5879 - val_loss: 1.8892 - val_acc: 0.3370 Epoch 21/400 12/12 [==============================] - 0s 3ms/step - loss: 1.2552 - acc: 0.5824 - val_loss: 1.8739 - val_acc: 0.3587 Epoch 22/400 12/12 [==============================] - 0s 3ms/step - loss: 1.1587 - acc: 0.6154 - val_loss: 1.8559 - val_acc: 0.3587 Epoch 23/400 12/12 [==============================] - 0s 4ms/step - loss: 1.1922 - acc: 0.5604 - val_loss: 1.8384 - val_acc: 0.3804 Epoch 24/400 12/12 [==============================] - 0s 4ms/step - loss: 1.1435 - acc: 0.6264 - val_loss: 1.8161 - val_acc: 0.3804 Epoch 25/400 12/12 [==============================] - 0s 3ms/step - loss: 1.2469 - acc: 0.6044 - val_loss: 1.7916 - val_acc: 0.4457 Epoch 26/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0371 - acc: 0.6319 - val_loss: 1.7667 - val_acc: 0.5000 Epoch 27/400 12/12 [==============================] - 0s 3ms/step - loss: 1.2216 - acc: 0.5659 - val_loss: 1.7470 - val_acc: 0.5109 Epoch 28/400 12/12 [==============================] - 0s 4ms/step - loss: 1.1009 - acc: 0.6484 - val_loss: 1.7196 - val_acc: 0.5543 Epoch 29/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0803 - acc: 0.6209 - val_loss: 1.6847 - val_acc: 0.6087 Epoch 30/400 12/12 [==============================] - 0s 4ms/step - loss: 1.2183 - acc: 0.5962 - val_loss: 1.6491 - val_acc: 0.6413 Epoch 31/400 12/12 
[==============================] - 0s 3ms/step - loss: 1.0455 - acc: 0.6071 - val_loss: 1.6136 - val_acc: 0.6522 Epoch 32/400 12/12 [==============================] - 0s 4ms/step - loss: 1.1263 - acc: 0.6209 - val_loss: 1.5782 - val_acc: 0.6522 Epoch 33/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0636 - acc: 0.6264 - val_loss: 1.5414 - val_acc: 0.6522 Epoch 34/400 12/12 [==============================] - 0s 3ms/step - loss: 1.1067 - acc: 0.6291 - val_loss: 1.4988 - val_acc: 0.6304 Epoch 35/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0689 - acc: 0.6071 - val_loss: 1.4486 - val_acc: 0.5978 Epoch 36/400 12/12 [==============================] - 0s 3ms/step - loss: 1.1386 - acc: 0.5824 - val_loss: 1.3872 - val_acc: 0.6196 Epoch 37/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0633 - acc: 0.6099 - val_loss: 1.3273 - val_acc: 0.6304 Epoch 38/400 12/12 [==============================] - 0s 3ms/step - loss: 1.1734 - acc: 0.5989 - val_loss: 1.2739 - val_acc: 0.6304 Epoch 39/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0132 - acc: 0.6319 - val_loss: 1.2343 - val_acc: 0.6304 Epoch 40/400 12/12 [==============================] - 0s 4ms/step - loss: 1.0544 - acc: 0.6016 - val_loss: 1.1822 - val_acc: 0.6304 Epoch 41/400 12/12 [==============================] - 0s 4ms/step - loss: 1.0139 - acc: 0.6016 - val_loss: 1.1386 - val_acc: 0.6196 Epoch 42/400 12/12 [==============================] - 0s 3ms/step - loss: 1.1267 - acc: 0.6209 - val_loss: 1.0870 - val_acc: 0.6196 Epoch 43/400 12/12 [==============================] - 0s 4ms/step - loss: 0.9966 - acc: 0.6429 - val_loss: 1.0473 - val_acc: 0.6196 Epoch 44/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9370 - acc: 0.6319 - val_loss: 1.0180 - val_acc: 0.6196 Epoch 45/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0553 - acc: 0.6264 - val_loss: 0.9718 - val_acc: 0.6304 Epoch 46/400 12/12 
[==============================] - 0s 3ms/step - loss: 1.0859 - acc: 0.6044 - val_loss: 0.9232 - val_acc: 0.6413 Epoch 47/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9561 - acc: 0.6346 - val_loss: 0.8932 - val_acc: 0.6413 Epoch 48/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0110 - acc: 0.6429 - val_loss: 0.8627 - val_acc: 0.6630 Epoch 49/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0606 - acc: 0.6236 - val_loss: 0.8364 - val_acc: 0.6630 Epoch 50/400 12/12 [==============================] - 0s 4ms/step - loss: 0.9551 - acc: 0.6456 - val_loss: 0.7867 - val_acc: 0.6739 Epoch 51/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9816 - acc: 0.6291 - val_loss: 0.7539 - val_acc: 0.6630 Epoch 52/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9859 - acc: 0.6401 - val_loss: 0.7111 - val_acc: 0.6848 Epoch 53/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0138 - acc: 0.6291 - val_loss: 0.6850 - val_acc: 0.7174 Epoch 54/400 12/12 [==============================] - 0s 4ms/step - loss: 0.9043 - acc: 0.6676 - val_loss: 0.6704 - val_acc: 0.7283 Epoch 55/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9585 - acc: 0.6374 - val_loss: 0.6438 - val_acc: 0.7609 Epoch 56/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9539 - acc: 0.6181 - val_loss: 0.6131 - val_acc: 0.7826 Epoch 57/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8816 - acc: 0.6566 - val_loss: 0.6044 - val_acc: 0.7935 Epoch 58/400 12/12 [==============================] - 0s 3ms/step - loss: 1.0060 - acc: 0.6291 - val_loss: 0.5905 - val_acc: 0.8152 Epoch 59/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9459 - acc: 0.6456 - val_loss: 0.5834 - val_acc: 0.8152 Epoch 60/400 12/12 [==============================] - 0s 4ms/step - loss: 0.9290 - acc: 0.6676 - val_loss: 0.5995 - val_acc: 0.7826 Epoch 61/400 12/12 
[==============================] - 0s 3ms/step - loss: 1.0831 - acc: 0.5824 - val_loss: 0.6044 - val_acc: 0.7391 Epoch 62/400 12/12 [==============================] - 0s 4ms/step - loss: 0.9270 - acc: 0.6621 - val_loss: 0.6276 - val_acc: 0.6957 Epoch 63/400 12/12 [==============================] - 0s 4ms/step - loss: 0.8652 - acc: 0.6374 - val_loss: 0.6062 - val_acc: 0.7609 Epoch 64/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8720 - acc: 0.6731 - val_loss: 0.5785 - val_acc: 0.8043 Epoch 65/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8961 - acc: 0.6374 - val_loss: 0.5623 - val_acc: 0.8043 Epoch 66/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9252 - acc: 0.6731 - val_loss: 0.5523 - val_acc: 0.7935 Epoch 67/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8289 - acc: 0.6731 - val_loss: 0.5557 - val_acc: 0.7717 Epoch 68/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9555 - acc: 0.6758 - val_loss: 0.5655 - val_acc: 0.7500 Epoch 69/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9240 - acc: 0.6538 - val_loss: 0.5733 - val_acc: 0.7500 Epoch 70/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8816 - acc: 0.6346 - val_loss: 0.5804 - val_acc: 0.7500 Epoch 71/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9193 - acc: 0.6429 - val_loss: 0.5698 - val_acc: 0.7609 Epoch 72/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9706 - acc: 0.6429 - val_loss: 0.5479 - val_acc: 0.7717 Epoch 73/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9173 - acc: 0.6566 - val_loss: 0.5447 - val_acc: 0.7609 Epoch 74/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9632 - acc: 0.6346 - val_loss: 0.5383 - val_acc: 0.7500 Epoch 75/400 12/12 [==============================] - 0s 4ms/step - loss: 0.9376 - acc: 0.6374 - val_loss: 0.5425 - val_acc: 0.7935 Epoch 76/400 12/12 
[==============================] - 0s 4ms/step - loss: 0.9651 - acc: 0.6154 - val_loss: 0.5490 - val_acc: 0.7826 Epoch 77/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8847 - acc: 0.6538 - val_loss: 0.5342 - val_acc: 0.8043 Epoch 78/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8516 - acc: 0.6566 - val_loss: 0.5155 - val_acc: 0.8261 Epoch 79/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8643 - acc: 0.6593 - val_loss: 0.5080 - val_acc: 0.8152 Epoch 80/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8926 - acc: 0.6429 - val_loss: 0.4945 - val_acc: 0.7609 Epoch 81/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9232 - acc: 0.6209 - val_loss: 0.4994 - val_acc: 0.7500 Epoch 82/400 12/12 [==============================] - 0s 4ms/step - loss: 0.8712 - acc: 0.6429 - val_loss: 0.5019 - val_acc: 0.7826 Epoch 83/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8286 - acc: 0.6731 - val_loss: 0.5086 - val_acc: 0.8043 Epoch 84/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8794 - acc: 0.6676 - val_loss: 0.5202 - val_acc: 0.7935 Epoch 85/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9169 - acc: 0.6621 - val_loss: 0.5100 - val_acc: 0.7935 Epoch 86/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8046 - acc: 0.6951 - val_loss: 0.5110 - val_acc: 0.8261 Epoch 87/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8409 - acc: 0.6813 - val_loss: 0.5300 - val_acc: 0.7935 Epoch 88/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8237 - acc: 0.6758 - val_loss: 0.5462 - val_acc: 0.7935 Epoch 89/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8690 - acc: 0.6538 - val_loss: 0.5417 - val_acc: 0.7826 Epoch 90/400 12/12 [==============================] - 0s 4ms/step - loss: 0.8352 - acc: 0.6648 - val_loss: 0.5292 - val_acc: 0.7935 Epoch 91/400 12/12 
[==============================] - 0s 3ms/step - loss: 0.8709 - acc: 0.6951 - val_loss: 0.5230 - val_acc: 0.7935 Epoch 92/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9713 - acc: 0.6291 - val_loss: 0.5143 - val_acc: 0.8152 Epoch 93/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8694 - acc: 0.6648 - val_loss: 0.5124 - val_acc: 0.8152 Epoch 94/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8264 - acc: 0.6758 - val_loss: 0.5076 - val_acc: 0.7935 Epoch 95/400 12/12 [==============================] - 0s 4ms/step - loss: 0.8307 - acc: 0.6813 - val_loss: 0.5097 - val_acc: 0.7935 Epoch 96/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8287 - acc: 0.6758 - val_loss: 0.5167 - val_acc: 0.7935 Epoch 97/400 12/12 [==============================] - 0s 4ms/step - loss: 0.9121 - acc: 0.6374 - val_loss: 0.5223 - val_acc: 0.7935 Epoch 98/400 12/12 [==============================] - 0s 3ms/step - loss: 0.9209 - acc: 0.6511 - val_loss: 0.5298 - val_acc: 0.8152 Epoch 99/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8791 - acc: 0.6401 - val_loss: 0.5346 - val_acc: 0.8152 Epoch 100/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8305 - acc: 0.6648 - val_loss: 0.5331 - val_acc: 0.8043 Epoch 101/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8200 - acc: 0.6896 - val_loss: 0.5244 - val_acc: 0.8152 Epoch 102/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7859 - acc: 0.6621 - val_loss: 0.5401 - val_acc: 0.8043 Epoch 103/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8216 - acc: 0.6786 - val_loss: 0.5321 - val_acc: 0.8152 Epoch 104/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7819 - acc: 0.6758 - val_loss: 0.5216 - val_acc: 0.8152 Epoch 105/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7486 - acc: 0.7088 - val_loss: 0.5381 - val_acc: 0.8261 Epoch 106/400 12/12 
[==============================] - 0s 3ms/step - loss: 0.7695 - acc: 0.6813 - val_loss: 0.5427 - val_acc: 0.8152 Epoch 107/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7518 - acc: 0.6868 - val_loss: 0.5444 - val_acc: 0.8261 Epoch 108/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8028 - acc: 0.6511 - val_loss: 0.5521 - val_acc: 0.8261 Epoch 109/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8351 - acc: 0.6538 - val_loss: 0.5679 - val_acc: 0.7935 Epoch 110/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8343 - acc: 0.6566 - val_loss: 0.5954 - val_acc: 0.7500 Epoch 111/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7576 - acc: 0.6896 - val_loss: 0.6016 - val_acc: 0.7391 Epoch 112/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8038 - acc: 0.6511 - val_loss: 0.5781 - val_acc: 0.7717 Epoch 113/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7871 - acc: 0.6648 - val_loss: 0.5685 - val_acc: 0.7609 Epoch 114/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8886 - acc: 0.6538 - val_loss: 0.5648 - val_acc: 0.7609 Epoch 115/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7624 - acc: 0.6978 - val_loss: 0.5557 - val_acc: 0.7935 Epoch 116/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7925 - acc: 0.6648 - val_loss: 0.5457 - val_acc: 0.8043 Epoch 117/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8212 - acc: 0.6676 - val_loss: 0.5335 - val_acc: 0.8043 Epoch 118/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7516 - acc: 0.6868 - val_loss: 0.5282 - val_acc: 0.8043 Epoch 119/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8742 - acc: 0.6484 - val_loss: 0.5221 - val_acc: 0.8152 Epoch 120/400 12/12 [==============================] - 0s 3ms/step - loss: 0.6509 - acc: 0.7170 - val_loss: 0.5061 - val_acc: 0.8043 Epoch 121/400 12/12 
[==============================] - 0s 3ms/step - loss: 0.8135 - acc: 0.6676 - val_loss: 0.4974 - val_acc: 0.8043 Epoch 122/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7891 - acc: 0.6841 - val_loss: 0.4909 - val_acc: 0.8043 Epoch 123/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7904 - acc: 0.6621 - val_loss: 0.5065 - val_acc: 0.8043 Epoch 124/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7728 - acc: 0.6786 - val_loss: 0.5071 - val_acc: 0.7935 Epoch 125/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8096 - acc: 0.6868 - val_loss: 0.5215 - val_acc: 0.8152 Epoch 126/400 12/12 [==============================] - 0s 4ms/step - loss: 0.8484 - acc: 0.6593 - val_loss: 0.5501 - val_acc: 0.8043 Epoch 127/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7015 - acc: 0.7170 - val_loss: 0.5323 - val_acc: 0.8043 Epoch 128/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7195 - acc: 0.6868 - val_loss: 0.5285 - val_acc: 0.8043 Epoch 129/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7763 - acc: 0.6593 - val_loss: 0.5165 - val_acc: 0.7935 Epoch 130/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8232 - acc: 0.6676 - val_loss: 0.5110 - val_acc: 0.8043 Epoch 131/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7495 - acc: 0.6813 - val_loss: 0.5111 - val_acc: 0.8043 Epoch 132/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7712 - acc: 0.7143 - val_loss: 0.5379 - val_acc: 0.7935 Epoch 133/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7377 - acc: 0.6868 - val_loss: 0.5181 - val_acc: 0.7935 Epoch 134/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7484 - acc: 0.6951 - val_loss: 0.5208 - val_acc: 0.7935 Epoch 135/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7997 - acc: 0.6813 - val_loss: 0.5268 - val_acc: 0.8152 Epoch 136/400 12/12 
[==============================] - 0s 3ms/step - loss: 0.7066 - acc: 0.6923 - val_loss: 0.5341 - val_acc: 0.8043 Epoch 137/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7306 - acc: 0.6648 - val_loss: 0.5266 - val_acc: 0.8152 Epoch 138/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7168 - acc: 0.7363 - val_loss: 0.5028 - val_acc: 0.8152 Epoch 139/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7248 - acc: 0.6841 - val_loss: 0.4903 - val_acc: 0.8152 Epoch 140/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7767 - acc: 0.7143 - val_loss: 0.4794 - val_acc: 0.8043 Epoch 141/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7308 - acc: 0.6868 - val_loss: 0.4947 - val_acc: 0.8043 Epoch 142/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7354 - acc: 0.6896 - val_loss: 0.5094 - val_acc: 0.8261 Epoch 143/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7765 - acc: 0.6923 - val_loss: 0.5212 - val_acc: 0.8043 Epoch 144/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7672 - acc: 0.6621 - val_loss: 0.5064 - val_acc: 0.8043 Epoch 145/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8613 - acc: 0.6648 - val_loss: 0.5068 - val_acc: 0.8043 Epoch 146/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7818 - acc: 0.6841 - val_loss: 0.5233 - val_acc: 0.8043 Epoch 147/400 12/12 [==============================] - 0s 3ms/step - loss: 0.6920 - acc: 0.7308 - val_loss: 0.5469 - val_acc: 0.7826 Epoch 148/400 12/12 [==============================] - 0s 4ms/step - loss: 0.6812 - acc: 0.7280 - val_loss: 0.5516 - val_acc: 0.7935 Epoch 149/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7455 - acc: 0.7033 - val_loss: 0.5502 - val_acc: 0.8152 Epoch 150/400 12/12 [==============================] - 0s 4ms/step - loss: 0.8089 - acc: 0.7060 - val_loss: 0.5306 - val_acc: 0.8261 Epoch 151/400 12/12 
[==============================] - 0s 3ms/step - loss: 0.7441 - acc: 0.6813 - val_loss: 0.5141 - val_acc: 0.8370 Epoch 152/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7793 - acc: 0.6923 - val_loss: 0.5017 - val_acc: 0.8370 Epoch 153/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7490 - acc: 0.6841 - val_loss: 0.4949 - val_acc: 0.8261 Epoch 154/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7346 - acc: 0.6978 - val_loss: 0.4951 - val_acc: 0.8152 Epoch 155/400 12/12 [==============================] - 0s 4ms/step - loss: 0.6809 - acc: 0.7198 - val_loss: 0.5071 - val_acc: 0.8261 Epoch 156/400 12/12 [==============================] - 0s 3ms/step - loss: 0.6828 - acc: 0.6978 - val_loss: 0.5374 - val_acc: 0.8261 Epoch 157/400 12/12 [==============================] - 0s 3ms/step - loss: 0.8247 - acc: 0.6538 - val_loss: 0.5407 - val_acc: 0.7826 Epoch 158/400 12/12 [==============================] - 0s 4ms/step - loss: 0.8041 - acc: 0.6731 - val_loss: 0.5460 - val_acc: 0.7826 Epoch 159/400 12/12 [==============================] - 0s 3ms/step - loss: 0.6729 - acc: 0.7308 - val_loss: 0.5038 - val_acc: 0.8043 Epoch 160/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7674 - acc: 0.6841 - val_loss: 0.4975 - val_acc: 0.8043 Epoch 161/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7170 - acc: 0.7033 - val_loss: 0.4898 - val_acc: 0.8043 Epoch 162/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7244 - acc: 0.6896 - val_loss: 0.5140 - val_acc: 0.8043 Epoch 163/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7574 - acc: 0.7170 - val_loss: 0.5606 - val_acc: 0.7935 Epoch 164/400 12/12 [==============================] - 0s 3ms/step - loss: 0.7878 - acc: 0.6593 - val_loss: 0.5544 - val_acc: 0.8043 Epoch 165/400 12/12 [==============================] - 0s 4ms/step - loss: 0.7102 - acc: 0.7143 - val_loss: 0.5390 - val_acc: 0.7826 Epoch 166/400 12/12 
Epoch 167/400
12/12 [==============================] - 0s 3ms/step - loss: 0.7172 - acc: 0.7115 - val_loss: 0.5305 - val_acc: 0.8152
... (epochs 168-398 truncated; training accuracy climbs from roughly 0.70 to 0.79 while validation accuracy fluctuates between roughly 0.67 and 0.85) ...
Epoch 399/400
12/12 [==============================] - 0s 3ms/step - loss: 0.5286 - acc: 0.7692 - val_loss: 0.5299 - val_acc: 0.7717
Epoch 400/400
12/12 [==============================] - 0s 3ms/step - loss: 0.4884 - acc: 0.7940 - val_loss: 0.5313 - val_acc: 0.7609
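The validation metrics above stop improving in a sustained way well before epoch 400, which is the usual cue for early stopping — in Keras, a `keras.callbacks.EarlyStopping(monitor='val_loss', patience=..., restore_best_weights=True)` callback passed to `fit()`. The patience rule it applies can be sketched in plain Python (`best_epoch` is a hypothetical helper for illustration, not part of any library):

```python
def best_epoch(val_losses, patience=20):
    """Return (stop_epoch, best_epoch), 1-based, under a simple patience
    rule: stop once val_loss has not improved for `patience` epochs."""
    best, best_i, wait = float("inf"), 0, 0
    for i, v in enumerate(val_losses, start=1):
        if v < best:
            best, best_i, wait = v, i, 0  # new best: reset the counter
        else:
            wait += 1
            if wait >= patience:
                return i, best_i
    return len(val_losses), best_i

# Toy loss curve: improves for 3 epochs, then plateaus
stop_at, best_at = best_epoch([1.0, 0.8, 0.6, 0.65, 0.7, 0.66], patience=3)
# stop_at == 6, best_at == 3
```

With `restore_best_weights=True`, Keras additionally rolls the model back to the weights from the best epoch rather than the stopping epoch.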
Accuracy and Loss Plots¶
def plot_learningCurve(history, epoch):
    # Plot training & validation accuracy values
    epoch_range = range(1, epoch + 1)
    plt.plot(epoch_range, history.history['acc'])
    plt.plot(epoch_range, history.history['val_acc'])
    plt.title('Model accuracy')
    plt.ylabel('Accuracy')
    plt.xlabel('Epoch')
    plt.legend(['Train', 'Val'], loc='upper left')
    plt.show()

    # Plot training & validation loss values
    plt.plot(epoch_range, history.history['loss'])
    plt.plot(epoch_range, history.history['val_loss'])
    plt.title('Model loss')
    plt.ylabel('Loss')
    plt.xlabel('Epoch')
    plt.legend(['Train', 'Val'], loc='upper left')
    plt.show()
plot_learningCurve(history, 400)
Model Evaluation¶
The evaluate() function returns a list of two values: the model's loss on the given dataset, followed by its accuracy.
model.evaluate(X_test, y_test)
3/3 [==============================] - 0s 1ms/step - loss: 0.5313 - acc: 0.7609
[0.53132563829422, 0.760869562625885]
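The accuracy term is simply the fraction of test samples whose predicted class matches the true class, as a minimal illustration with made-up labels shows:

```python
import numpy as np

# Accuracy = fraction of predictions that match the true labels.
y_true = np.array([1, 2, 3, 5])
y_pred = np.array([1, 2, 2, 5])  # one mistake out of four samples
acc = float(np.mean(y_true == y_pred))  # 0.75
```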
Confusion matrix¶
Let's look at the confusion matrix.
# predict_classes() is deprecated; argmax over predict() is the
# recommended replacement for multi-class softmax models.
y_pred = np.argmax(model.predict(X_test), axis=-1)
y_true = np.argmax(y_test, axis=1)
cf = confusion_matrix(y_true, y_pred)
sns.heatmap(cf, annot=True)
<AxesSubplot:>
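As a reminder of how `confusion_matrix` lays out its result: rows are true classes and columns are predicted classes, in sorted label order. A tiny made-up example (not the glass data):

```python
from sklearn.metrics import confusion_matrix

# Rows = true class, columns = predicted class, in the order given by `labels`.
y_true = [1, 2, 2, 3, 3, 3]
y_pred = [1, 2, 3, 3, 3, 1]
cm = confusion_matrix(y_true, y_pred, labels=[1, 2, 3])
# cm[1, 2] == 1: one sample that is truly class 2 was predicted as class 3.
```

The diagonal holds the correctly classified counts, so a strong diagonal in the heatmap above means the model is doing well.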
Classification Report¶
print(classification_report(y_true,y_pred))
              precision    recall  f1-score   support

           1       0.64      0.53      0.58        17
           2       0.90      0.47      0.62        19
           3       0.36      0.80      0.50        10
           5       0.88      1.00      0.93        14
           6       1.00      1.00      1.00        14
           7       1.00      0.89      0.94        18

    accuracy                           0.76        92
   macro avg       0.80      0.78      0.76        92
weighted avg       0.83      0.76      0.77        92
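The per-class numbers come from true/false-positive counts: precision = TP / (TP + FP) and recall = TP / (TP + FN). Plugging in counts consistent with the class-3 row above (8 of its 10 samples recovered, but 14 other samples wrongly predicted as class 3) reproduces the reported values:

```python
# Counts inferred from the class-3 row of the report (illustrative).
tp, fp, fn = 8, 14, 2
precision = tp / (tp + fp)                          # ~0.36
recall = tp / (tp + fn)                             # 0.80
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean, 0.50
```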
Making predictions¶
y_pred = np.argmax(model.predict(X_test), axis=-1)
y_pred
array([6, 1, 5, 2, 3, 7, 5, 1, 6, 1, 2, 5, 1, 6, 6, 1, 5, 5, 7, 2, 3, 3, 3, 2, 3, 7, 2, 7, 3, 3, 6, 3, 7, 3, 7, 2, 5, 5, 3, 2, 6, 3, 5, 2, 1, 3, 3, 3, 7, 5, 3, 1, 3, 5, 6, 7, 1, 6, 1, 7, 5, 3, 3, 7, 5, 1, 7, 6, 5, 7, 5, 6, 1, 6, 7, 7, 3, 6, 5, 2, 6, 5, 6, 1, 3, 7, 7, 2, 1, 3, 1, 3])
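Note that the predicted values are the raw Type codes (1-7, with no class 4 in this dataset). Assuming the labels were one-hot encoded with `to_categorical` applied to the Type column directly, the encoding has 8 columns indexed 0-7, so `argmax` over the softmax output returns the original code. A small made-up illustration:

```python
import numpy as np

# Hypothetical softmax outputs for 3 samples over 8 columns: with
# to_categorical on raw Type codes, column index == Type code.
probs = np.array([
    [0.0, 0.9, 0.05, 0.05, 0.0, 0.0, 0.0,  0.0],   # -> type 1
    [0.0, 0.1, 0.1,  0.1,  0.0, 0.6, 0.05, 0.05],  # -> type 5
    [0.0, 0.0, 0.0,  0.0,  0.0, 0.0, 0.1,  0.9],   # -> type 7
])
classes = np.argmax(probs, axis=-1)  # array([1, 5, 7])
```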
Now, let's save the model¶
#saving the model
model.save('Glass_Classification.h5')
DeepCC¶
!deepCC Glass_Classification.h5
[INFO] Reading [keras model] 'Glass_Classification.h5'
[SUCCESS] Saved 'Glass_Classification_deepC/Glass_Classification.onnx'
[INFO] Reading [onnx model] 'Glass_Classification_deepC/Glass_Classification.onnx'
[INFO] Model info:
  ir_vesion : 4
  doc       :
[WARNING] [ONNX]: terminal (input/output) dense_input's shape is less than 1. Changing it to 1.
[WARNING] [ONNX]: terminal (input/output) dense_3's shape is less than 1. Changing it to 1.
WARN (GRAPH): found operator node with the same name (dense_3) as io node.
[INFO] Running DNNC graph sanity check ...
[SUCCESS] Passed sanity check.
[INFO] Writing C++ file 'Glass_Classification_deepC/Glass_Classification.cpp'
[INFO] deepSea model files are ready in 'Glass_Classification_deepC/'
[RUNNING COMMAND] g++ -std=c++11 -O3 -fno-rtti -fno-exceptions -I. -I/opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/include -isystem /opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/packages/eigen-eigen-323c052e1731 "Glass_Classification_deepC/Glass_Classification.cpp" -D_AITS_MAIN -o "Glass_Classification_deepC/Glass_Classification.exe"
[RUNNING COMMAND] size "Glass_Classification_deepC/Glass_Classification.exe"
   text    data     bss     dec     hex filename
 969805    3184     760  973749   edbb5 Glass_Classification_deepC/Glass_Classification.exe
[SUCCESS] Saved model as executable "Glass_Classification_deepC/Glass_Classification.exe"