
What type of star is it?

Credit: AITS Cainvas Community

Photo by Alex Kunchevsky for OUTLΛNE on Dribbble

Identify the type of star using its characteristics like luminosity, temperature, colour, etc.

In [1]:
import pandas as pd
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from tensorflow.keras import models, optimizers, losses, layers, callbacks
from sklearn.metrics import confusion_matrix
import matplotlib.pyplot as plt
import random

Dataset

On Kaggle by Deepraj Baidya | Github

The dataset covers 240 stars and took 3 weeks to collect, mostly from the web. Missing values were calculated manually using equations from astrophysics.

The dataset is a CSV file listing characteristics of each star, such as luminosity, temperature, colour and radius, that help classify it into one of 6 classes - Brown Dwarf, Red Dwarf, White Dwarf, Main Sequence, Supergiant, Hypergiant.

In [2]:
df = pd.read_csv('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/star.csv')
df
Out[2]:
Temperature (K) Luminosity(L/Lo) Radius(R/Ro) Absolute magnitude(Mv) Star type Star color Spectral Class
0 3068 0.002400 0.1700 16.12 0 Red M
1 3042 0.000500 0.1542 16.60 0 Red M
2 2600 0.000300 0.1020 18.70 0 Red M
3 2800 0.000200 0.1600 16.65 0 Red M
4 1939 0.000138 0.1030 20.06 0 Red M
... ... ... ... ... ... ... ...
235 38940 374830.000000 1356.0000 -9.93 5 Blue O
236 30839 834042.000000 1194.0000 -10.63 5 Blue O
237 8829 537493.000000 1423.0000 -10.73 5 White A
238 9235 404940.000000 1112.0000 -11.23 5 White A
239 37882 294903.000000 1783.0000 -7.80 5 Blue O

240 rows × 7 columns
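
Before preprocessing, it can help to confirm how the 6 classes are distributed. A quick check (not part of the original notebook), assuming the Star type column holds the integer labels 0-5:

df['Star type'].value_counts().sort_index()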

Preprocessing

In [3]:
df['Star color'].value_counts()
Out[3]:
Red                   112
Blue                   55
Blue-white             26
Blue White             10
yellow-white            8
White                   7
Yellowish White         3
white                   3
Blue white              3
yellowish               2
Whitish                 2
Orange                  2
Orange-Red              1
Blue                    1
Blue white              1
White-Yellow            1
Yellowish               1
Pale yellow orange      1
Blue-White              1
Name: Star color, dtype: int64

Many shades of colour are mentioned, some near-duplicates such as Yellowish White and White-Yellow, along with several different spellings of blue-white.

We can identify 5 basic colours in the list - blue, white, yellow, orange, red. Let's rewrite the column as 5 binary columns with multilabel values.

In [4]:
colours = ['Blu', 'Whit', 'Yellow', 'Orang', 'Red']    # root words of the colours, since spellings differ across the listed shades

df[colours] = 0

for c in colours:
    df.loc[df['Star color'].str.contains(c, case = False), c]=1
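
As a sanity check on the mapping above (a hedged sketch, not in the original notebook), every row should match at least one of the 5 root words; any row summing to 0 would point to an unmapped colour name.

print(df[colours].sum())                       # stars per basic colour group
print((df[colours].sum(axis = 1) == 0).sum())  # rows matching none of the root words; should be 0 if every shade is covered
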
In [5]:
df['Spectral Class'].value_counts()
Out[5]:
M    111
B     46
O     40
A     19
F     17
K      6
G      1
Name: Spectral Class, dtype: int64
In [6]:
# One hot encoding the input column
df_dummies = pd.get_dummies(df['Spectral Class'], drop_first = True, prefix = 'Spectral')
for column in df_dummies:
    df[column] = df_dummies[column]

# One hot encoding the output column    
y = pd.get_dummies(df['Star type'])

# Dropping the original columns that have now been encoded
df = df.drop(columns = ['Spectral Class', 'Star type', 'Star color'])

Looking into the variable value ranges

In [7]:
df.describe()
Out[7]:
Temperature (K) Luminosity(L/Lo) Radius(R/Ro) Absolute magnitude(Mv) Blu Whit Yellow Orang Red Spectral_B Spectral_F Spectral_G Spectral_K Spectral_M Spectral_O
count 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000
mean 10497.462500 107188.361635 237.157781 4.382396 0.404167 0.270833 0.066667 0.016667 0.470833 0.191667 0.070833 0.004167 0.025000 0.462500 0.166667
std 9552.425037 179432.244940 517.155763 10.532512 0.491756 0.445319 0.249965 0.128287 0.500192 0.394435 0.257082 0.064550 0.156451 0.499634 0.373457
min 1939.000000 0.000080 0.008400 -11.920000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
25% 3344.250000 0.000865 0.102750 -6.232500 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
50% 5776.000000 0.070500 0.762500 8.313000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
75% 15055.500000 198050.000000 42.750000 13.697500 1.000000 1.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000
max 40000.000000 849420.000000 1948.500000 20.060000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000

The value ranges and standard deviations differ widely across the attributes, so the features are standardized (mean = 0, sd = 1).

The dataset is split before standardization so that the scaler is fit only on the training set, keeping information from the validation and test sets from leaking into the transform.

Train val test split

In [8]:
# Splitting into train, val and test set -- 80-10-10 split

# First, an 80-20 split
Xtrain, X_val_test, ytrain, y_val_test = train_test_split(df, y, test_size = 0.2)

# Then split the 20% into half
Xval, Xtest, yval, ytest = train_test_split(X_val_test, y_val_test, test_size = 0.5)

print("Number of samples in...")
print("Training set: ", len(Xtrain))
print("Validation set: ", len(Xval))
print("Testing set: ", len(Xtest))
Number of samples in...
Training set:  192
Validation set:  24
Testing set:  24

Standardization

In [9]:
ss = StandardScaler()

Xtrain = ss.fit_transform(Xtrain)
Xval = ss.transform(Xval)
Xtest = ss.transform(Xtest)
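
Since the compiled model later expects already-standardized inputs, it may help to persist the fitted scaler's parameters so the same transform can be reproduced outside this notebook. A minimal sketch, assuming saving the per-feature mean and scale is sufficient:

np.savetxt('scaler_mean.txt', ss.mean_)      # per-feature means learnt from the training set
np.savetxt('scaler_scale.txt', ss.scale_)    # per-feature standard deviations
# a raw feature vector x can then be standardized as (x - mean) / scale before inference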

The model

In [10]:
model = models.Sequential([
    layers.Dense(16, activation = 'relu', input_shape = Xtrain[0].shape),
    layers.Dense(8, activation = 'relu'),
    layers.Dense(6, activation = 'softmax')
])

cb = callbacks.EarlyStopping(patience = 5, restore_best_weights = True)
In [11]:
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 16)                256       
_________________________________________________________________
dense_1 (Dense)              (None, 8)                 136       
_________________________________________________________________
dense_2 (Dense)              (None, 6)                 54        
=================================================================
Total params: 446
Trainable params: 446
Non-trainable params: 0
_________________________________________________________________
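
The parameter counts follow from the 15 input features (4 numeric attributes, 5 colour flags and 6 spectral-class dummies): 16 × (15 + 1) = 256, 8 × (16 + 1) = 136 and 6 × (8 + 1) = 54, which add up to the 446 trainable parameters reported above.
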
In [12]:
model.compile(optimizer = optimizers.Adam(0.001), loss = losses.CategoricalCrossentropy(), metrics = ['accuracy'])

history = model.fit(Xtrain, ytrain, validation_data = (Xval, yval), epochs = 256, callbacks = cb)
Epoch 1/256
6/6 [==============================] - 0s 24ms/step - loss: 1.8942 - accuracy: 0.1406 - val_loss: 1.9567 - val_accuracy: 0.0417
Epoch 2/256
6/6 [==============================] - 0s 3ms/step - loss: 1.8399 - accuracy: 0.1979 - val_loss: 1.9097 - val_accuracy: 0.2083
Epoch 3/256
6/6 [==============================] - 0s 3ms/step - loss: 1.7935 - accuracy: 0.3125 - val_loss: 1.8632 - val_accuracy: 0.2500
Epoch 4/256
6/6 [==============================] - 0s 10ms/step - loss: 1.7502 - accuracy: 0.3594 - val_loss: 1.8217 - val_accuracy: 0.2500
Epoch 5/256
6/6 [==============================] - 0s 3ms/step - loss: 1.7077 - accuracy: 0.3750 - val_loss: 1.7836 - val_accuracy: 0.2500
Epoch 6/256
6/6 [==============================] - 0s 7ms/step - loss: 1.6642 - accuracy: 0.3802 - val_loss: 1.7466 - val_accuracy: 0.2500
Epoch 7/256
6/6 [==============================] - 0s 4ms/step - loss: 1.6260 - accuracy: 0.3802 - val_loss: 1.7087 - val_accuracy: 0.2917
Epoch 8/256
6/6 [==============================] - 0s 3ms/step - loss: 1.5891 - accuracy: 0.4115 - val_loss: 1.6732 - val_accuracy: 0.3333
Epoch 9/256
6/6 [==============================] - 0s 3ms/step - loss: 1.5520 - accuracy: 0.4479 - val_loss: 1.6339 - val_accuracy: 0.3333
Epoch 10/256
6/6 [==============================] - 0s 3ms/step - loss: 1.5162 - accuracy: 0.4740 - val_loss: 1.5904 - val_accuracy: 0.3333
Epoch 11/256
6/6 [==============================] - 0s 3ms/step - loss: 1.4768 - accuracy: 0.4740 - val_loss: 1.5473 - val_accuracy: 0.4167
Epoch 12/256
6/6 [==============================] - 0s 3ms/step - loss: 1.4408 - accuracy: 0.4792 - val_loss: 1.5017 - val_accuracy: 0.4583
Epoch 13/256
6/6 [==============================] - 0s 7ms/step - loss: 1.4067 - accuracy: 0.5104 - val_loss: 1.4561 - val_accuracy: 0.5833
Epoch 14/256
6/6 [==============================] - 0s 3ms/step - loss: 1.3741 - accuracy: 0.6406 - val_loss: 1.4113 - val_accuracy: 0.7083
Epoch 15/256
6/6 [==============================] - 0s 4ms/step - loss: 1.3353 - accuracy: 0.6719 - val_loss: 1.3809 - val_accuracy: 0.6667
Epoch 16/256
6/6 [==============================] - 0s 8ms/step - loss: 1.3036 - accuracy: 0.6094 - val_loss: 1.3477 - val_accuracy: 0.6667
Epoch 17/256
6/6 [==============================] - 0s 3ms/step - loss: 1.2705 - accuracy: 0.6302 - val_loss: 1.3218 - val_accuracy: 0.6667
Epoch 18/256
6/6 [==============================] - 0s 6ms/step - loss: 1.2378 - accuracy: 0.6615 - val_loss: 1.2928 - val_accuracy: 0.6667
Epoch 19/256
6/6 [==============================] - 0s 5ms/step - loss: 1.2056 - accuracy: 0.6667 - val_loss: 1.2632 - val_accuracy: 0.6667
Epoch 20/256
6/6 [==============================] - 0s 3ms/step - loss: 1.1748 - accuracy: 0.6771 - val_loss: 1.2337 - val_accuracy: 0.6667
Epoch 21/256
6/6 [==============================] - 0s 3ms/step - loss: 1.1441 - accuracy: 0.6823 - val_loss: 1.2071 - val_accuracy: 0.6667
Epoch 22/256
6/6 [==============================] - 0s 3ms/step - loss: 1.1132 - accuracy: 0.6771 - val_loss: 1.1794 - val_accuracy: 0.6667
Epoch 23/256
6/6 [==============================] - 0s 3ms/step - loss: 1.0828 - accuracy: 0.6823 - val_loss: 1.1499 - val_accuracy: 0.6667
Epoch 24/256
6/6 [==============================] - 0s 4ms/step - loss: 1.0531 - accuracy: 0.6771 - val_loss: 1.1206 - val_accuracy: 0.6667
Epoch 25/256
6/6 [==============================] - 0s 9ms/step - loss: 1.0228 - accuracy: 0.6823 - val_loss: 1.0901 - val_accuracy: 0.6667
Epoch 26/256
6/6 [==============================] - 0s 4ms/step - loss: 0.9921 - accuracy: 0.6823 - val_loss: 1.0616 - val_accuracy: 0.6667
Epoch 27/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9602 - accuracy: 0.6823 - val_loss: 1.0304 - val_accuracy: 0.6667
Epoch 28/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9284 - accuracy: 0.6823 - val_loss: 0.9983 - val_accuracy: 0.6667
Epoch 29/256
6/6 [==============================] - 0s 3ms/step - loss: 0.8957 - accuracy: 0.7083 - val_loss: 0.9674 - val_accuracy: 0.7083
Epoch 30/256
6/6 [==============================] - 0s 9ms/step - loss: 0.8657 - accuracy: 0.7396 - val_loss: 0.9359 - val_accuracy: 0.7083
Epoch 31/256
6/6 [==============================] - 0s 6ms/step - loss: 0.8364 - accuracy: 0.7396 - val_loss: 0.9044 - val_accuracy: 0.7083
Epoch 32/256
6/6 [==============================] - 0s 5ms/step - loss: 0.8077 - accuracy: 0.7396 - val_loss: 0.8730 - val_accuracy: 0.7083
Epoch 33/256
6/6 [==============================] - 0s 6ms/step - loss: 0.7803 - accuracy: 0.7396 - val_loss: 0.8449 - val_accuracy: 0.7083
Epoch 34/256
6/6 [==============================] - 0s 5ms/step - loss: 0.7551 - accuracy: 0.7396 - val_loss: 0.8191 - val_accuracy: 0.7083
Epoch 35/256
6/6 [==============================] - 0s 6ms/step - loss: 0.7301 - accuracy: 0.7448 - val_loss: 0.7928 - val_accuracy: 0.7083
Epoch 36/256
6/6 [==============================] - 0s 5ms/step - loss: 0.7072 - accuracy: 0.7448 - val_loss: 0.7684 - val_accuracy: 0.7083
Epoch 37/256
6/6 [==============================] - 0s 4ms/step - loss: 0.6859 - accuracy: 0.7448 - val_loss: 0.7456 - val_accuracy: 0.7083
Epoch 38/256
6/6 [==============================] - 0s 3ms/step - loss: 0.6658 - accuracy: 0.7448 - val_loss: 0.7252 - val_accuracy: 0.7083
Epoch 39/256
6/6 [==============================] - 0s 3ms/step - loss: 0.6464 - accuracy: 0.7448 - val_loss: 0.7060 - val_accuracy: 0.7083
Epoch 40/256
6/6 [==============================] - 0s 9ms/step - loss: 0.6285 - accuracy: 0.7448 - val_loss: 0.6877 - val_accuracy: 0.7083
Epoch 41/256
6/6 [==============================] - 0s 3ms/step - loss: 0.6115 - accuracy: 0.7448 - val_loss: 0.6699 - val_accuracy: 0.7083
Epoch 42/256
6/6 [==============================] - 0s 5ms/step - loss: 0.5957 - accuracy: 0.7500 - val_loss: 0.6551 - val_accuracy: 0.7083
Epoch 43/256
6/6 [==============================] - 0s 6ms/step - loss: 0.5801 - accuracy: 0.7500 - val_loss: 0.6420 - val_accuracy: 0.7083
Epoch 44/256
6/6 [==============================] - 0s 3ms/step - loss: 0.5657 - accuracy: 0.7552 - val_loss: 0.6301 - val_accuracy: 0.7500
Epoch 45/256
6/6 [==============================] - 0s 3ms/step - loss: 0.5522 - accuracy: 0.7760 - val_loss: 0.6198 - val_accuracy: 0.7500
Epoch 46/256
6/6 [==============================] - 0s 3ms/step - loss: 0.5395 - accuracy: 0.7812 - val_loss: 0.6086 - val_accuracy: 0.7500
Epoch 47/256
6/6 [==============================] - 0s 6ms/step - loss: 0.5268 - accuracy: 0.7969 - val_loss: 0.5993 - val_accuracy: 0.7500
Epoch 48/256
6/6 [==============================] - 0s 4ms/step - loss: 0.5150 - accuracy: 0.8281 - val_loss: 0.5895 - val_accuracy: 0.8750
Epoch 49/256
6/6 [==============================] - 0s 3ms/step - loss: 0.5040 - accuracy: 0.9115 - val_loss: 0.5799 - val_accuracy: 0.9167
Epoch 50/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4929 - accuracy: 0.9427 - val_loss: 0.5695 - val_accuracy: 0.9167
Epoch 51/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4826 - accuracy: 0.9219 - val_loss: 0.5632 - val_accuracy: 0.8750
Epoch 52/256
6/6 [==============================] - 0s 8ms/step - loss: 0.4727 - accuracy: 0.8906 - val_loss: 0.5574 - val_accuracy: 0.8750
Epoch 53/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4629 - accuracy: 0.9219 - val_loss: 0.5492 - val_accuracy: 0.9167
Epoch 54/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4536 - accuracy: 0.9219 - val_loss: 0.5430 - val_accuracy: 0.8750
Epoch 55/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4449 - accuracy: 0.9062 - val_loss: 0.5364 - val_accuracy: 0.8750
Epoch 56/256
6/6 [==============================] - 0s 8ms/step - loss: 0.4363 - accuracy: 0.8802 - val_loss: 0.5301 - val_accuracy: 0.8750
Epoch 57/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4281 - accuracy: 0.8802 - val_loss: 0.5237 - val_accuracy: 0.8750
Epoch 58/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4204 - accuracy: 0.8802 - val_loss: 0.5188 - val_accuracy: 0.8333
Epoch 59/256
6/6 [==============================] - 0s 8ms/step - loss: 0.4121 - accuracy: 0.8802 - val_loss: 0.5131 - val_accuracy: 0.8750
Epoch 60/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4040 - accuracy: 0.9062 - val_loss: 0.5079 - val_accuracy: 0.8750
Epoch 61/256
6/6 [==============================] - 0s 8ms/step - loss: 0.3958 - accuracy: 0.9115 - val_loss: 0.5007 - val_accuracy: 0.8750
Epoch 62/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3883 - accuracy: 0.9062 - val_loss: 0.4918 - val_accuracy: 0.8750
Epoch 63/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3806 - accuracy: 0.9219 - val_loss: 0.4865 - val_accuracy: 0.8750
Epoch 64/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3742 - accuracy: 0.9427 - val_loss: 0.4810 - val_accuracy: 0.8750
Epoch 65/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3669 - accuracy: 0.9375 - val_loss: 0.4773 - val_accuracy: 0.8750
Epoch 66/256
6/6 [==============================] - 0s 7ms/step - loss: 0.3619 - accuracy: 0.9271 - val_loss: 0.4757 - val_accuracy: 0.8333
Epoch 67/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3565 - accuracy: 0.8906 - val_loss: 0.4734 - val_accuracy: 0.7917
Epoch 68/256
6/6 [==============================] - 0s 5ms/step - loss: 0.3504 - accuracy: 0.8958 - val_loss: 0.4693 - val_accuracy: 0.8750
Epoch 69/256
6/6 [==============================] - 0s 7ms/step - loss: 0.3452 - accuracy: 0.9271 - val_loss: 0.4649 - val_accuracy: 0.8333
Epoch 70/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3403 - accuracy: 0.9271 - val_loss: 0.4607 - val_accuracy: 0.8333
Epoch 71/256
6/6 [==============================] - 0s 10ms/step - loss: 0.3352 - accuracy: 0.9323 - val_loss: 0.4570 - val_accuracy: 0.8750
Epoch 72/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3308 - accuracy: 0.9167 - val_loss: 0.4530 - val_accuracy: 0.8750
Epoch 73/256
6/6 [==============================] - 0s 6ms/step - loss: 0.3265 - accuracy: 0.9115 - val_loss: 0.4489 - val_accuracy: 0.8750
Epoch 74/256
6/6 [==============================] - 0s 4ms/step - loss: 0.3222 - accuracy: 0.9271 - val_loss: 0.4460 - val_accuracy: 0.8750
Epoch 75/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3182 - accuracy: 0.9271 - val_loss: 0.4442 - val_accuracy: 0.8333
Epoch 76/256
6/6 [==============================] - 0s 7ms/step - loss: 0.3146 - accuracy: 0.9271 - val_loss: 0.4396 - val_accuracy: 0.8750
Epoch 77/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3102 - accuracy: 0.9375 - val_loss: 0.4376 - val_accuracy: 0.8750
Epoch 78/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3071 - accuracy: 0.9479 - val_loss: 0.4372 - val_accuracy: 0.8333
Epoch 79/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3037 - accuracy: 0.9427 - val_loss: 0.4341 - val_accuracy: 0.8750
Epoch 80/256
6/6 [==============================] - 0s 6ms/step - loss: 0.3003 - accuracy: 0.9375 - val_loss: 0.4301 - val_accuracy: 0.8750
Epoch 81/256
6/6 [==============================] - 0s 4ms/step - loss: 0.2969 - accuracy: 0.9427 - val_loss: 0.4273 - val_accuracy: 0.8333
Epoch 82/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2938 - accuracy: 0.9271 - val_loss: 0.4235 - val_accuracy: 0.8333
Epoch 83/256
6/6 [==============================] - 0s 4ms/step - loss: 0.2905 - accuracy: 0.9375 - val_loss: 0.4210 - val_accuracy: 0.8333
Epoch 84/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2881 - accuracy: 0.9479 - val_loss: 0.4193 - val_accuracy: 0.8750
Epoch 85/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2844 - accuracy: 0.9531 - val_loss: 0.4165 - val_accuracy: 0.8750
Epoch 86/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2818 - accuracy: 0.9479 - val_loss: 0.4141 - val_accuracy: 0.8333
Epoch 87/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2791 - accuracy: 0.9479 - val_loss: 0.4123 - val_accuracy: 0.8333
Epoch 88/256
6/6 [==============================] - 0s 7ms/step - loss: 0.2763 - accuracy: 0.9531 - val_loss: 0.4099 - val_accuracy: 0.8333
Epoch 89/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2738 - accuracy: 0.9479 - val_loss: 0.4054 - val_accuracy: 0.8750
Epoch 90/256
6/6 [==============================] - 0s 9ms/step - loss: 0.2709 - accuracy: 0.9375 - val_loss: 0.4011 - val_accuracy: 0.8750
Epoch 91/256
6/6 [==============================] - 0s 4ms/step - loss: 0.2685 - accuracy: 0.9531 - val_loss: 0.3990 - val_accuracy: 0.8333
Epoch 92/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2658 - accuracy: 0.9531 - val_loss: 0.3958 - val_accuracy: 0.8333
Epoch 93/256
6/6 [==============================] - 0s 6ms/step - loss: 0.2635 - accuracy: 0.9583 - val_loss: 0.3934 - val_accuracy: 0.8750
Epoch 94/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2607 - accuracy: 0.9531 - val_loss: 0.3903 - val_accuracy: 0.8750
Epoch 95/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2582 - accuracy: 0.9583 - val_loss: 0.3874 - val_accuracy: 0.8750
Epoch 96/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2557 - accuracy: 0.9583 - val_loss: 0.3844 - val_accuracy: 0.8750
Epoch 97/256
6/6 [==============================] - 0s 6ms/step - loss: 0.2536 - accuracy: 0.9583 - val_loss: 0.3803 - val_accuracy: 0.8750
Epoch 98/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2509 - accuracy: 0.9583 - val_loss: 0.3782 - val_accuracy: 0.9167
Epoch 99/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2482 - accuracy: 0.9583 - val_loss: 0.3753 - val_accuracy: 0.8750
Epoch 100/256
6/6 [==============================] - 0s 6ms/step - loss: 0.2462 - accuracy: 0.9531 - val_loss: 0.3730 - val_accuracy: 0.8333
Epoch 101/256
6/6 [==============================] - 0s 11ms/step - loss: 0.2445 - accuracy: 0.9531 - val_loss: 0.3693 - val_accuracy: 0.9167
Epoch 102/256
6/6 [==============================] - 0s 9ms/step - loss: 0.2417 - accuracy: 0.9635 - val_loss: 0.3655 - val_accuracy: 0.8750
Epoch 103/256
6/6 [==============================] - 0s 7ms/step - loss: 0.2388 - accuracy: 0.9583 - val_loss: 0.3636 - val_accuracy: 0.9167
Epoch 104/256
6/6 [==============================] - 0s 10ms/step - loss: 0.2367 - accuracy: 0.9583 - val_loss: 0.3598 - val_accuracy: 0.9167
Epoch 105/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2341 - accuracy: 0.9635 - val_loss: 0.3571 - val_accuracy: 0.9167
Epoch 106/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2319 - accuracy: 0.9583 - val_loss: 0.3534 - val_accuracy: 0.8750
Epoch 107/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2292 - accuracy: 0.9583 - val_loss: 0.3499 - val_accuracy: 0.8750
Epoch 108/256
6/6 [==============================] - 0s 5ms/step - loss: 0.2268 - accuracy: 0.9635 - val_loss: 0.3457 - val_accuracy: 0.9167
Epoch 109/256
6/6 [==============================] - 0s 5ms/step - loss: 0.2246 - accuracy: 0.9740 - val_loss: 0.3412 - val_accuracy: 0.9167
Epoch 110/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2216 - accuracy: 0.9740 - val_loss: 0.3366 - val_accuracy: 0.9167
Epoch 111/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2187 - accuracy: 0.9635 - val_loss: 0.3332 - val_accuracy: 0.9167
Epoch 112/256
6/6 [==============================] - 0s 11ms/step - loss: 0.2160 - accuracy: 0.9792 - val_loss: 0.3281 - val_accuracy: 0.9167
Epoch 113/256
6/6 [==============================] - 0s 10ms/step - loss: 0.2129 - accuracy: 0.9792 - val_loss: 0.3235 - val_accuracy: 0.9167
Epoch 114/256
6/6 [==============================] - 0s 13ms/step - loss: 0.2101 - accuracy: 0.9740 - val_loss: 0.3201 - val_accuracy: 0.9167
Epoch 115/256
6/6 [==============================] - 0s 4ms/step - loss: 0.2081 - accuracy: 0.9531 - val_loss: 0.3170 - val_accuracy: 0.8750
Epoch 116/256
6/6 [==============================] - 0s 5ms/step - loss: 0.2053 - accuracy: 0.9792 - val_loss: 0.3117 - val_accuracy: 0.9167
Epoch 117/256
6/6 [==============================] - 0s 11ms/step - loss: 0.2022 - accuracy: 0.9792 - val_loss: 0.3078 - val_accuracy: 0.9167
Epoch 118/256
6/6 [==============================] - 0s 10ms/step - loss: 0.1994 - accuracy: 0.9792 - val_loss: 0.3045 - val_accuracy: 0.9167
Epoch 119/256
6/6 [==============================] - 0s 14ms/step - loss: 0.1968 - accuracy: 0.9844 - val_loss: 0.3012 - val_accuracy: 0.9167
Epoch 120/256
6/6 [==============================] - 0s 5ms/step - loss: 0.1945 - accuracy: 0.9792 - val_loss: 0.2947 - val_accuracy: 0.9167
Epoch 121/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1921 - accuracy: 0.9688 - val_loss: 0.2920 - val_accuracy: 0.9167
Epoch 122/256
6/6 [==============================] - 0s 4ms/step - loss: 0.1896 - accuracy: 0.9688 - val_loss: 0.2877 - val_accuracy: 0.9167
Epoch 123/256
6/6 [==============================] - 0s 5ms/step - loss: 0.1861 - accuracy: 0.9792 - val_loss: 0.2861 - val_accuracy: 0.9167
Epoch 124/256
6/6 [==============================] - 0s 7ms/step - loss: 0.1832 - accuracy: 0.9792 - val_loss: 0.2828 - val_accuracy: 0.9167
Epoch 125/256
6/6 [==============================] - 0s 10ms/step - loss: 0.1819 - accuracy: 0.9844 - val_loss: 0.2778 - val_accuracy: 0.9167
Epoch 126/256
6/6 [==============================] - 0s 7ms/step - loss: 0.1781 - accuracy: 0.9792 - val_loss: 0.2739 - val_accuracy: 0.9167
Epoch 127/256
6/6 [==============================] - 0s 6ms/step - loss: 0.1756 - accuracy: 0.9896 - val_loss: 0.2702 - val_accuracy: 0.9167
Epoch 128/256
6/6 [==============================] - 0s 9ms/step - loss: 0.1731 - accuracy: 0.9948 - val_loss: 0.2661 - val_accuracy: 0.9167
Epoch 129/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1706 - accuracy: 0.9948 - val_loss: 0.2639 - val_accuracy: 0.9167
Epoch 130/256
6/6 [==============================] - 0s 6ms/step - loss: 0.1683 - accuracy: 0.9896 - val_loss: 0.2594 - val_accuracy: 0.9167
Epoch 131/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1655 - accuracy: 0.9792 - val_loss: 0.2549 - val_accuracy: 0.9167
Epoch 132/256
6/6 [==============================] - 0s 8ms/step - loss: 0.1638 - accuracy: 0.9792 - val_loss: 0.2524 - val_accuracy: 0.9167
Epoch 133/256
6/6 [==============================] - 0s 11ms/step - loss: 0.1601 - accuracy: 0.9792 - val_loss: 0.2492 - val_accuracy: 0.9167
Epoch 134/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1588 - accuracy: 0.9948 - val_loss: 0.2470 - val_accuracy: 0.9167
Epoch 135/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1559 - accuracy: 1.0000 - val_loss: 0.2414 - val_accuracy: 0.9167
Epoch 136/256
6/6 [==============================] - 0s 14ms/step - loss: 0.1530 - accuracy: 0.9948 - val_loss: 0.2372 - val_accuracy: 0.9167
Epoch 137/256
6/6 [==============================] - 0s 4ms/step - loss: 0.1511 - accuracy: 0.9896 - val_loss: 0.2326 - val_accuracy: 0.9167
Epoch 138/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1486 - accuracy: 0.9896 - val_loss: 0.2306 - val_accuracy: 0.9167
Epoch 139/256
6/6 [==============================] - 0s 6ms/step - loss: 0.1461 - accuracy: 0.9896 - val_loss: 0.2268 - val_accuracy: 0.9167
Epoch 140/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1448 - accuracy: 0.9948 - val_loss: 0.2250 - val_accuracy: 0.9167
Epoch 141/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1436 - accuracy: 0.9844 - val_loss: 0.2214 - val_accuracy: 0.9583
Epoch 142/256
6/6 [==============================] - 0s 5ms/step - loss: 0.1397 - accuracy: 0.9948 - val_loss: 0.2186 - val_accuracy: 0.9583
Epoch 143/256
6/6 [==============================] - 0s 8ms/step - loss: 0.1374 - accuracy: 0.9896 - val_loss: 0.2139 - val_accuracy: 0.9167
Epoch 144/256
6/6 [==============================] - 0s 5ms/step - loss: 0.1357 - accuracy: 0.9948 - val_loss: 0.2099 - val_accuracy: 0.9167
Epoch 145/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1330 - accuracy: 0.9948 - val_loss: 0.2076 - val_accuracy: 0.9583
Epoch 146/256
6/6 [==============================] - 0s 9ms/step - loss: 0.1309 - accuracy: 1.0000 - val_loss: 0.2047 - val_accuracy: 0.9583
Epoch 147/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1290 - accuracy: 1.0000 - val_loss: 0.2018 - val_accuracy: 1.0000
Epoch 148/256
6/6 [==============================] - 0s 6ms/step - loss: 0.1272 - accuracy: 0.9948 - val_loss: 0.1974 - val_accuracy: 1.0000
Epoch 149/256
6/6 [==============================] - 0s 11ms/step - loss: 0.1248 - accuracy: 0.9948 - val_loss: 0.1946 - val_accuracy: 1.0000
Epoch 150/256
6/6 [==============================] - 0s 9ms/step - loss: 0.1234 - accuracy: 0.9948 - val_loss: 0.1908 - val_accuracy: 1.0000
Epoch 151/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1213 - accuracy: 1.0000 - val_loss: 0.1875 - val_accuracy: 1.0000
Epoch 152/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1191 - accuracy: 0.9948 - val_loss: 0.1842 - val_accuracy: 1.0000
Epoch 153/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1174 - accuracy: 0.9948 - val_loss: 0.1819 - val_accuracy: 1.0000
Epoch 154/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1160 - accuracy: 0.9948 - val_loss: 0.1804 - val_accuracy: 1.0000
Epoch 155/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1136 - accuracy: 1.0000 - val_loss: 0.1770 - val_accuracy: 1.0000
Epoch 156/256
6/6 [==============================] - 0s 8ms/step - loss: 0.1122 - accuracy: 1.0000 - val_loss: 0.1735 - val_accuracy: 1.0000
Epoch 157/256
6/6 [==============================] - 0s 10ms/step - loss: 0.1102 - accuracy: 0.9948 - val_loss: 0.1695 - val_accuracy: 1.0000
Epoch 158/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1091 - accuracy: 0.9948 - val_loss: 0.1682 - val_accuracy: 1.0000
Epoch 159/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1070 - accuracy: 0.9948 - val_loss: 0.1659 - val_accuracy: 1.0000
Epoch 160/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1052 - accuracy: 1.0000 - val_loss: 0.1623 - val_accuracy: 1.0000
Epoch 161/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1041 - accuracy: 0.9948 - val_loss: 0.1583 - val_accuracy: 1.0000
Epoch 162/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1027 - accuracy: 0.9896 - val_loss: 0.1559 - val_accuracy: 1.0000
Epoch 163/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1018 - accuracy: 0.9948 - val_loss: 0.1565 - val_accuracy: 1.0000
Epoch 164/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1001 - accuracy: 1.0000 - val_loss: 0.1516 - val_accuracy: 1.0000
Epoch 165/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0977 - accuracy: 0.9948 - val_loss: 0.1495 - val_accuracy: 1.0000
Epoch 166/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0961 - accuracy: 1.0000 - val_loss: 0.1477 - val_accuracy: 1.0000
Epoch 167/256
6/6 [==============================] - 0s 5ms/step - loss: 0.0946 - accuracy: 1.0000 - val_loss: 0.1451 - val_accuracy: 1.0000
Epoch 168/256
6/6 [==============================] - 0s 11ms/step - loss: 0.0932 - accuracy: 1.0000 - val_loss: 0.1426 - val_accuracy: 1.0000
Epoch 169/256
6/6 [==============================] - 0s 11ms/step - loss: 0.0921 - accuracy: 1.0000 - val_loss: 0.1405 - val_accuracy: 1.0000
Epoch 170/256
6/6 [==============================] - 0s 13ms/step - loss: 0.0910 - accuracy: 1.0000 - val_loss: 0.1384 - val_accuracy: 1.0000
Epoch 171/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0900 - accuracy: 1.0000 - val_loss: 0.1383 - val_accuracy: 1.0000
Epoch 172/256
6/6 [==============================] - 0s 7ms/step - loss: 0.0879 - accuracy: 1.0000 - val_loss: 0.1332 - val_accuracy: 1.0000
Epoch 173/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0869 - accuracy: 0.9948 - val_loss: 0.1302 - val_accuracy: 1.0000
Epoch 174/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0855 - accuracy: 1.0000 - val_loss: 0.1298 - val_accuracy: 1.0000
Epoch 175/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0846 - accuracy: 1.0000 - val_loss: 0.1280 - val_accuracy: 1.0000
Epoch 176/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0835 - accuracy: 1.0000 - val_loss: 0.1298 - val_accuracy: 1.0000
Epoch 177/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0826 - accuracy: 1.0000 - val_loss: 0.1251 - val_accuracy: 1.0000
Epoch 178/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0809 - accuracy: 1.0000 - val_loss: 0.1217 - val_accuracy: 1.0000
Epoch 179/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0803 - accuracy: 1.0000 - val_loss: 0.1183 - val_accuracy: 1.0000
Epoch 180/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0782 - accuracy: 1.0000 - val_loss: 0.1185 - val_accuracy: 1.0000
Epoch 181/256
6/6 [==============================] - 0s 9ms/step - loss: 0.0772 - accuracy: 1.0000 - val_loss: 0.1191 - val_accuracy: 1.0000
Epoch 182/256
6/6 [==============================] - 0s 11ms/step - loss: 0.0770 - accuracy: 1.0000 - val_loss: 0.1151 - val_accuracy: 1.0000
Epoch 183/256
6/6 [==============================] - 0s 10ms/step - loss: 0.0760 - accuracy: 1.0000 - val_loss: 0.1158 - val_accuracy: 1.0000
Epoch 184/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0742 - accuracy: 1.0000 - val_loss: 0.1129 - val_accuracy: 1.0000
Epoch 185/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0730 - accuracy: 1.0000 - val_loss: 0.1103 - val_accuracy: 1.0000
Epoch 186/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0726 - accuracy: 0.9948 - val_loss: 0.1067 - val_accuracy: 1.0000
Epoch 187/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0714 - accuracy: 1.0000 - val_loss: 0.1067 - val_accuracy: 1.0000
Epoch 188/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0701 - accuracy: 1.0000 - val_loss: 0.1061 - val_accuracy: 1.0000
Epoch 189/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0694 - accuracy: 1.0000 - val_loss: 0.1052 - val_accuracy: 1.0000
Epoch 190/256
6/6 [==============================] - 0s 8ms/step - loss: 0.0687 - accuracy: 1.0000 - val_loss: 0.1041 - val_accuracy: 1.0000
Epoch 191/256
6/6 [==============================] - 0s 6ms/step - loss: 0.0686 - accuracy: 1.0000 - val_loss: 0.1038 - val_accuracy: 1.0000
Epoch 192/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0681 - accuracy: 1.0000 - val_loss: 0.1020 - val_accuracy: 1.0000
Epoch 193/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0656 - accuracy: 1.0000 - val_loss: 0.0975 - val_accuracy: 1.0000
Epoch 194/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0655 - accuracy: 0.9948 - val_loss: 0.0950 - val_accuracy: 1.0000
Epoch 195/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0642 - accuracy: 1.0000 - val_loss: 0.0952 - val_accuracy: 1.0000
Epoch 196/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0654 - accuracy: 1.0000 - val_loss: 0.0999 - val_accuracy: 1.0000
Epoch 197/256
6/6 [==============================] - 0s 6ms/step - loss: 0.0627 - accuracy: 1.0000 - val_loss: 0.0959 - val_accuracy: 1.0000
Epoch 198/256
6/6 [==============================] - 0s 9ms/step - loss: 0.0617 - accuracy: 1.0000 - val_loss: 0.0928 - val_accuracy: 1.0000
Epoch 199/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0611 - accuracy: 1.0000 - val_loss: 0.0912 - val_accuracy: 1.0000
Epoch 200/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0604 - accuracy: 1.0000 - val_loss: 0.0890 - val_accuracy: 1.0000
Epoch 201/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0603 - accuracy: 1.0000 - val_loss: 0.0872 - val_accuracy: 1.0000
Epoch 202/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0592 - accuracy: 1.0000 - val_loss: 0.0903 - val_accuracy: 1.0000
Epoch 203/256
6/6 [==============================] - 0s 6ms/step - loss: 0.0585 - accuracy: 1.0000 - val_loss: 0.0877 - val_accuracy: 1.0000
Epoch 204/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0575 - accuracy: 1.0000 - val_loss: 0.0879 - val_accuracy: 1.0000
Epoch 205/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0568 - accuracy: 1.0000 - val_loss: 0.0861 - val_accuracy: 1.0000
Epoch 206/256
6/6 [==============================] - 0s 10ms/step - loss: 0.0566 - accuracy: 1.0000 - val_loss: 0.0839 - val_accuracy: 1.0000
Epoch 207/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0555 - accuracy: 1.0000 - val_loss: 0.0850 - val_accuracy: 1.0000
Epoch 208/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0549 - accuracy: 1.0000 - val_loss: 0.0827 - val_accuracy: 1.0000
Epoch 209/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0541 - accuracy: 1.0000 - val_loss: 0.0825 - val_accuracy: 1.0000
Epoch 210/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0535 - accuracy: 1.0000 - val_loss: 0.0807 - val_accuracy: 1.0000
Epoch 211/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0531 - accuracy: 1.0000 - val_loss: 0.0792 - val_accuracy: 1.0000
Epoch 212/256
6/6 [==============================] - 0s 10ms/step - loss: 0.0524 - accuracy: 1.0000 - val_loss: 0.0786 - val_accuracy: 1.0000
Epoch 213/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0516 - accuracy: 1.0000 - val_loss: 0.0783 - val_accuracy: 1.0000
Epoch 214/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0511 - accuracy: 1.0000 - val_loss: 0.0776 - val_accuracy: 1.0000
Epoch 215/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0505 - accuracy: 1.0000 - val_loss: 0.0782 - val_accuracy: 1.0000
Epoch 216/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0499 - accuracy: 1.0000 - val_loss: 0.0763 - val_accuracy: 1.0000
Epoch 217/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0496 - accuracy: 1.0000 - val_loss: 0.0749 - val_accuracy: 1.0000
Epoch 218/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0491 - accuracy: 1.0000 - val_loss: 0.0739 - val_accuracy: 1.0000
Epoch 219/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0491 - accuracy: 1.0000 - val_loss: 0.0744 - val_accuracy: 1.0000
Epoch 220/256
6/6 [==============================] - 0s 9ms/step - loss: 0.0478 - accuracy: 1.0000 - val_loss: 0.0731 - val_accuracy: 1.0000
Epoch 221/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0471 - accuracy: 1.0000 - val_loss: 0.0710 - val_accuracy: 1.0000
Epoch 222/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0473 - accuracy: 1.0000 - val_loss: 0.0698 - val_accuracy: 1.0000
Epoch 223/256
6/6 [==============================] - 0s 6ms/step - loss: 0.0466 - accuracy: 1.0000 - val_loss: 0.0700 - val_accuracy: 1.0000
Epoch 224/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0458 - accuracy: 1.0000 - val_loss: 0.0694 - val_accuracy: 1.0000
Epoch 225/256
6/6 [==============================] - 0s 6ms/step - loss: 0.0453 - accuracy: 1.0000 - val_loss: 0.0679 - val_accuracy: 1.0000
Epoch 226/256
6/6 [==============================] - 0s 5ms/step - loss: 0.0452 - accuracy: 1.0000 - val_loss: 0.0670 - val_accuracy: 1.0000
Epoch 227/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0442 - accuracy: 1.0000 - val_loss: 0.0677 - val_accuracy: 1.0000
Epoch 228/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0440 - accuracy: 1.0000 - val_loss: 0.0672 - val_accuracy: 1.0000
Epoch 229/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0436 - accuracy: 1.0000 - val_loss: 0.0675 - val_accuracy: 1.0000
Epoch 230/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0431 - accuracy: 1.0000 - val_loss: 0.0662 - val_accuracy: 1.0000
Epoch 231/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0425 - accuracy: 1.0000 - val_loss: 0.0649 - val_accuracy: 1.0000
Epoch 232/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0422 - accuracy: 1.0000 - val_loss: 0.0629 - val_accuracy: 1.0000
Epoch 233/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0416 - accuracy: 1.0000 - val_loss: 0.0632 - val_accuracy: 1.0000
Epoch 234/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0413 - accuracy: 1.0000 - val_loss: 0.0619 - val_accuracy: 1.0000
Epoch 235/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0407 - accuracy: 1.0000 - val_loss: 0.0616 - val_accuracy: 1.0000
Epoch 236/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0408 - accuracy: 1.0000 - val_loss: 0.0612 - val_accuracy: 1.0000
Epoch 237/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0401 - accuracy: 1.0000 - val_loss: 0.0634 - val_accuracy: 1.0000
Epoch 238/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0397 - accuracy: 1.0000 - val_loss: 0.0614 - val_accuracy: 1.0000
Epoch 239/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0392 - accuracy: 1.0000 - val_loss: 0.0594 - val_accuracy: 1.0000
Epoch 240/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0391 - accuracy: 1.0000 - val_loss: 0.0592 - val_accuracy: 1.0000
Epoch 241/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0384 - accuracy: 1.0000 - val_loss: 0.0583 - val_accuracy: 1.0000
Epoch 242/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0380 - accuracy: 1.0000 - val_loss: 0.0577 - val_accuracy: 1.0000
Epoch 243/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0378 - accuracy: 1.0000 - val_loss: 0.0570 - val_accuracy: 1.0000
Epoch 244/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0379 - accuracy: 1.0000 - val_loss: 0.0591 - val_accuracy: 1.0000
Epoch 245/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0370 - accuracy: 1.0000 - val_loss: 0.0561 - val_accuracy: 1.0000
Epoch 246/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0368 - accuracy: 1.0000 - val_loss: 0.0564 - val_accuracy: 1.0000
Epoch 247/256
6/6 [==============================] - 0s 6ms/step - loss: 0.0363 - accuracy: 1.0000 - val_loss: 0.0561 - val_accuracy: 1.0000
Epoch 248/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0369 - accuracy: 1.0000 - val_loss: 0.0529 - val_accuracy: 1.0000
Epoch 249/256
6/6 [==============================] - 0s 6ms/step - loss: 0.0357 - accuracy: 1.0000 - val_loss: 0.0544 - val_accuracy: 1.0000
Epoch 250/256
6/6 [==============================] - 0s 5ms/step - loss: 0.0354 - accuracy: 1.0000 - val_loss: 0.0543 - val_accuracy: 1.0000
Epoch 251/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0350 - accuracy: 1.0000 - val_loss: 0.0558 - val_accuracy: 1.0000
Epoch 252/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0349 - accuracy: 1.0000 - val_loss: 0.0540 - val_accuracy: 1.0000
Epoch 253/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0343 - accuracy: 1.0000 - val_loss: 0.0515 - val_accuracy: 1.0000
Epoch 254/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0344 - accuracy: 1.0000 - val_loss: 0.0521 - val_accuracy: 1.0000
Epoch 255/256
6/6 [==============================] - 0s 7ms/step - loss: 0.0336 - accuracy: 1.0000 - val_loss: 0.0519 - val_accuracy: 1.0000
Epoch 256/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0336 - accuracy: 1.0000 - val_loss: 0.0503 - val_accuracy: 1.0000
In [13]:
model.evaluate(Xtest, ytest)
1/1 [==============================] - 0s 1ms/step - loss: 0.0149 - accuracy: 1.0000
Out[13]:
[0.01487953681498766, 1.0]
In [14]:
cm = confusion_matrix(np.argmax(ytest.values, axis = 1), np.argmax(model.predict(Xtest), axis = 1))
cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]    # normalize each row to per-class fractions

fig = plt.figure(figsize = (10, 10))
ax = fig.add_subplot(111)

for i in range(cm.shape[1]):
    for j in range(cm.shape[0]):
        if cm[i,j] > 0.8:
            clr = "white"
        else:
            clr = "black"
        ax.text(j, i, format(cm[i, j], '.2f'), horizontalalignment="center", color=clr)

_ = ax.imshow(cm, cmap=plt.cm.Blues)
ax.set_xticks(range(6))
ax.set_yticks(range(6))
ax.set_xticklabels(range(6))
ax.set_yticklabels(range(6))
plt.xlabel('Predicted')
plt.ylabel('True')
plt.show()
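
For a per-class view beyond the confusion matrix, scikit-learn's classification_report gives precision, recall and F1 for each star type. A small sketch (not in the original notebook) on the same test predictions:

from sklearn.metrics import classification_report

y_true = np.argmax(ytest.values, axis = 1)
y_pred = np.argmax(model.predict(Xtest), axis = 1)
print(classification_report(y_true, y_pred, labels = list(range(6)),
      target_names = ['Brown Dwarf', 'Red Dwarf', 'White Dwarf', 'Main Sequence', 'Supergiant', 'Hypergiant']))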

Plotting the metrics

In [15]:
def plot(history, variable, variable2):
    plt.plot(range(len(history[variable])), history[variable])
    plt.plot(range(len(history[variable2])), history[variable2])
    plt.legend([variable, variable2])
    plt.title(variable)
In [16]:
plot(history.history, "loss", "val_loss")
In [17]:
plot(history.history, "accuracy", "val_accuracy")

Prediction

In [18]:
classes = ['Brown Dwarf', 'Red Dwarf', 'White Dwarf', 'Main Sequence', 'Supergiant', 'Hypergiant']
In [19]:
# pick a random sample from the test set
x = random.randint(0, len(Xtest) - 1)

output = model.predict(Xtest[x].reshape(1, -1))[0]
print("Predicted: ", classes[np.argmax(output)])   
print("Probability: ", output[np.argmax(output)])

print("True: ", classes[np.argmax(ytest.values[x])])
Predicted:  Red Dwarf
Probability:  0.98867714
True:  Red Dwarf

deepC

In [20]:
model.save('star.h5')

!deepCC star.h5
[INFO]
Reading [keras model] 'star.h5'
[SUCCESS]
Saved 'star_deepC/star.onnx'
[INFO]
Reading [onnx model] 'star_deepC/star.onnx'
[INFO]
Model info:
  ir_vesion : 4
  doc       : 
[WARNING]
[ONNX]: terminal (input/output) dense_input's shape is less than 1. Changing it to 1.
[WARNING]
[ONNX]: terminal (input/output) dense_2's shape is less than 1. Changing it to 1.
WARN (GRAPH): found operator node with the same name (dense_2) as io node.
[INFO]
Running DNNC graph sanity check ...
[SUCCESS]
Passed sanity check.
[INFO]
Writing C++ file 'star_deepC/star.cpp'
[INFO]
deepSea model files are ready in 'star_deepC/' 
[RUNNING COMMAND]
g++ -std=c++11 -O3 -fno-rtti -fno-exceptions -I. -I/opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/include -isystem /opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/packages/eigen-eigen-323c052e1731 "star_deepC/star.cpp" -D_AITS_MAIN -o "star_deepC/star.exe"
[RUNNING COMMAND]
size "star_deepC/star.exe"
   text	   data	    bss	    dec	    hex	filename
 124997	   2984	    760	 128741	  1f6e5	star_deepC/star.exe
[SUCCESS]
Saved model as executable "star_deepC/star.exe"
In [21]:
x = random.randint(0, len(Xtest) - 1)
print(x)
np.savetxt('sample.data', Xtest[x])    # xth sample into text file

# run exe with input
!star_deepC/star.exe sample.data

# show predicted output
nn_out = np.loadtxt('deepSea_result_1.out')
print("Model output: ", model.predict(Xtest[x].reshape(1, -1))[0])
print("deepC output:",nn_out)

print("Predicted: ", classes[np.argmax(nn_out)]) 
print("Probability: ", nn_out[np.argmax(nn_out)])

print("True: ", classes[np.argmax(ytest.values[x])])
7
writing file deepSea_result_1.out.
Model output:  [5.5304659e-04 1.9039716e-08 4.1234790e-04 9.9859565e-01 5.2602314e-07
 4.3843524e-04]
deepC output: [5.53048e-04 1.90399e-08 4.12349e-04 9.98596e-01 5.26024e-07 4.38438e-04]
Predicted:  Main Sequence
Probability:  0.998596
True:  Main Sequence
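
The two outputs should agree up to the precision printed by the compiled model. A quick numerical check (a sketch, assuming nn_out and Xtest[x] are still in scope):

keras_out = model.predict(Xtest[x].reshape(1, -1))[0]
print("Outputs agree:", np.allclose(keras_out, nn_out, atol = 1e-5))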