
What type of star is it?

Credit: AITS Cainvas Community

Photo by Alex Kunchevsky for OUTLΛNE on Dribbble

Identify the type of star using its characteristics like luminosity, temperature, colour, etc.

In [1]:
import pandas as pd
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from keras import models, optimizers, losses, layers, callbacks
from sklearn.metrics import confusion_matrix
import matplotlib.pyplot as plt
import random

Dataset

On Kaggle by Deepraj Baidya | Github

The dataset took 3 weeks to collect and covers 240 stars, most of which were gathered from the web. Missing values were calculated manually using equations from astrophysics.

The dataset is a CSV file with characteristics of a star such as luminosity, temperature, colour and radius that help classify it into one of 6 classes: Brown Dwarf, Red Dwarf, White Dwarf, Main Sequence, Supergiant, Hypergiant.

In [2]:
df = pd.read_csv('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/star.csv')
df
Out[2]:
Temperature (K) Luminosity(L/Lo) Radius(R/Ro) Absolute magnitude(Mv) Star type Star color Spectral Class
0 3068 0.002400 0.1700 16.12 0 Red M
1 3042 0.000500 0.1542 16.60 0 Red M
2 2600 0.000300 0.1020 18.70 0 Red M
3 2800 0.000200 0.1600 16.65 0 Red M
4 1939 0.000138 0.1030 20.06 0 Red M
... ... ... ... ... ... ... ...
235 38940 374830.000000 1356.0000 -9.93 5 Blue O
236 30839 834042.000000 1194.0000 -10.63 5 Blue O
237 8829 537493.000000 1423.0000 -10.73 5 White A
238 9235 404940.000000 1112.0000 -11.23 5 White A
239 37882 294903.000000 1783.0000 -7.80 5 Blue O

240 rows × 7 columns

Preprocessing

In [3]:
df['Star color'].value_counts()
Out[3]:
Red                   112
Blue                   55
Blue-white             26
Blue White             10
yellow-white            8
White                   7
white                   3
Yellowish White         3
Blue white              3
Whitish                 2
Orange                  2
yellowish               2
Pale yellow orange      1
Blue                    1
White-Yellow            1
Yellowish               1
Blue-White              1
Orange-Red              1
Blue white              1
Name: Star color, dtype: int64

There are many shades of colour mentioned, some of them similar (like Yellowish White and White-Yellow), and several different spellings of blue-white.

We can identify 5 basic colours in the given list: blue, white, yellow, orange and red. Let's rewrite the column as 5 columns with multilabel values.

In [4]:
colours = ['Blu', 'Whit', 'Yellow', 'Orang', 'Red']    # using the root word of each colour, since spellings differ across shades

df[colours] = 0

for c in colours:
    df.loc[df['Star color'].str.contains(c, case = False), c]=1
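
A quick sanity check (a minimal sketch, assuming the df defined above) confirms that every star matches at least one of the five root colours:

# every row should have at least one colour flag set
assert (df[colours].sum(axis=1) >= 1).all(), "some colour string matched none of the root words"
print(df[colours].sum())    # per-colour counts across the 240 stars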
In [5]:
df['Spectral Class'].value_counts()
Out[5]:
M    111
B     46
O     40
A     19
F     17
K      6
G      1
Name: Spectral Class, dtype: int64
In [6]:
# One hot encoding the input column
df_dummies = pd.get_dummies(df['Spectral Class'], drop_first = True, prefix = 'Spectral')
for column in df_dummies:
    df[column] = df_dummies[column]

# One hot encoding the output column    
y = pd.get_dummies(df['Star type'])

# Dropping the encoded columns
df = df.drop(columns = ['Spectral Class', 'Star type', 'Star color'])
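
As a quick sketch of what the preprocessing leaves behind (assuming the steps above ran unchanged), the feature frame should now hold the 4 numeric attributes, 5 colour flags and 6 spectral dummies, while y holds one column per star type:

print(df.shape)    # expected: (240, 15)
print(y.shape)     # expected: (240, 6)
print(list(df.columns))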

Looking into the variable value ranges

In [7]:
df.describe()
Out[7]:
Temperature (K) Luminosity(L/Lo) Radius(R/Ro) Absolute magnitude(Mv) Blu Whit Yellow Orang Red Spectral_B Spectral_F Spectral_G Spectral_K Spectral_M Spectral_O
count 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000 240.000000
mean 10497.462500 107188.361635 237.157781 4.382396 0.404167 0.270833 0.066667 0.016667 0.470833 0.191667 0.070833 0.004167 0.025000 0.462500 0.166667
std 9552.425037 179432.244940 517.155763 10.532512 0.491756 0.445319 0.249965 0.128287 0.500192 0.394435 0.257082 0.064550 0.156451 0.499634 0.373457
min 1939.000000 0.000080 0.008400 -11.920000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
25% 3344.250000 0.000865 0.102750 -6.232500 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
50% 5776.000000 0.070500 0.762500 8.313000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
75% 15055.500000 198050.000000 42.750000 13.697500 1.000000 1.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000
max 40000.000000 849420.000000 1948.500000 20.060000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000

The standard deviation values differ across the attributes.

The dataset is split before standardization (mean = 0, sd = 1) so that the scaler is fit only on the training set and no information leaks from the validation and test sets.

Train val test split

In [8]:
# Splitting into train, val and test set -- 80-10-10 split

# First, an 80-20 split
Xtrain, X_val_test, ytrain, y_val_test = train_test_split(df, y, test_size = 0.2)

# Then split the 20% into half
Xval, Xtest, yval, ytest = train_test_split(X_val_test, y_val_test, test_size = 0.5)

print("Number of samples in...")
print("Training set: ", len(Xtrain))
print("Validation set: ", len(Xval))
print("Testing set: ", len(Xtest))
Number of samples in...
Training set:  192
Validation set:  24
Testing set:  24

Standardization

In [9]:
ss = StandardScaler()

Xtrain = ss.fit_transform(Xtrain)
Xval = ss.transform(Xval)
Xtest = ss.transform(Xtest)
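
As a quick check (a minimal sketch, assuming the arrays above), the standardized training features should have roughly zero mean and unit standard deviation; the validation and test sets are only transformed with the training-set statistics:

print(Xtrain.mean(axis=0).round(2))    # ~0 for every column
print(Xtrain.std(axis=0).round(2))     # ~1 for every column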

The model

In [10]:
model = models.Sequential([
    layers.Dense(16, activation = 'relu', input_shape = Xtrain[0].shape),
    layers.Dense(8, activation = 'relu'),
    layers.Dense(6, activation = 'softmax')
])

cb = callbacks.EarlyStopping(patience = 5, restore_best_weights = True)
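
The layer sizes above imply a very small network; a model.summary() call (a sketch, with parameter counts assuming the 15-column feature set built earlier) makes that explicit:

model.summary()
# Dense(16): 15*16 + 16 = 256 parameters
# Dense(8) : 16*8  +  8 = 136 parameters
# Dense(6) :  8*6  +  6 =  54 parameters
# total    : 446 trainable parameters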
In [11]:
model.compile(optimizer = optimizers.Adam(0.001), loss = losses.CategoricalCrossentropy(), metrics = ['accuracy'])

history = model.fit(Xtrain, ytrain, validation_data = (Xval, yval), epochs = 256, callbacks = cb)
Epoch 1/256
6/6 [==============================] - 0s 23ms/step - loss: 1.9269 - accuracy: 0.3021 - val_loss: 1.8928 - val_accuracy: 0.2917
Epoch 2/256
6/6 [==============================] - 0s 3ms/step - loss: 1.8652 - accuracy: 0.3385 - val_loss: 1.8423 - val_accuracy: 0.2917
Epoch 3/256
6/6 [==============================] - 0s 3ms/step - loss: 1.8095 - accuracy: 0.3594 - val_loss: 1.7941 - val_accuracy: 0.3750
Epoch 4/256
6/6 [==============================] - 0s 4ms/step - loss: 1.7607 - accuracy: 0.3854 - val_loss: 1.7484 - val_accuracy: 0.4167
Epoch 5/256
6/6 [==============================] - 0s 3ms/step - loss: 1.7119 - accuracy: 0.4010 - val_loss: 1.7106 - val_accuracy: 0.4167
Epoch 6/256
6/6 [==============================] - 0s 3ms/step - loss: 1.6683 - accuracy: 0.4688 - val_loss: 1.6760 - val_accuracy: 0.4167
Epoch 7/256
6/6 [==============================] - 0s 4ms/step - loss: 1.6288 - accuracy: 0.5104 - val_loss: 1.6409 - val_accuracy: 0.4167
Epoch 8/256
6/6 [==============================] - 0s 3ms/step - loss: 1.5883 - accuracy: 0.5000 - val_loss: 1.6079 - val_accuracy: 0.4167
Epoch 9/256
6/6 [==============================] - 0s 3ms/step - loss: 1.5518 - accuracy: 0.4948 - val_loss: 1.5745 - val_accuracy: 0.4167
Epoch 10/256
6/6 [==============================] - 0s 3ms/step - loss: 1.5142 - accuracy: 0.4896 - val_loss: 1.5424 - val_accuracy: 0.4167
Epoch 11/256
6/6 [==============================] - 0s 3ms/step - loss: 1.4791 - accuracy: 0.4948 - val_loss: 1.5130 - val_accuracy: 0.4167
Epoch 12/256
6/6 [==============================] - 0s 4ms/step - loss: 1.4440 - accuracy: 0.5052 - val_loss: 1.4846 - val_accuracy: 0.4167
Epoch 13/256
6/6 [==============================] - 0s 3ms/step - loss: 1.4099 - accuracy: 0.5052 - val_loss: 1.4578 - val_accuracy: 0.4167
Epoch 14/256
6/6 [==============================] - 0s 3ms/step - loss: 1.3802 - accuracy: 0.5000 - val_loss: 1.4316 - val_accuracy: 0.4167
Epoch 15/256
6/6 [==============================] - 0s 3ms/step - loss: 1.3468 - accuracy: 0.5052 - val_loss: 1.4079 - val_accuracy: 0.4167
Epoch 16/256
6/6 [==============================] - 0s 3ms/step - loss: 1.3171 - accuracy: 0.5104 - val_loss: 1.3838 - val_accuracy: 0.4167
Epoch 17/256
6/6 [==============================] - 0s 3ms/step - loss: 1.2887 - accuracy: 0.5156 - val_loss: 1.3618 - val_accuracy: 0.4167
Epoch 18/256
6/6 [==============================] - 0s 3ms/step - loss: 1.2619 - accuracy: 0.5208 - val_loss: 1.3405 - val_accuracy: 0.4167
Epoch 19/256
6/6 [==============================] - 0s 3ms/step - loss: 1.2346 - accuracy: 0.5365 - val_loss: 1.3199 - val_accuracy: 0.4583
Epoch 20/256
6/6 [==============================] - 0s 3ms/step - loss: 1.2091 - accuracy: 0.5573 - val_loss: 1.3020 - val_accuracy: 0.4583
Epoch 21/256
6/6 [==============================] - 0s 4ms/step - loss: 1.1849 - accuracy: 0.6146 - val_loss: 1.2856 - val_accuracy: 0.5000
Epoch 22/256
6/6 [==============================] - 0s 3ms/step - loss: 1.1625 - accuracy: 0.6615 - val_loss: 1.2708 - val_accuracy: 0.5000
Epoch 23/256
6/6 [==============================] - 0s 3ms/step - loss: 1.1406 - accuracy: 0.6875 - val_loss: 1.2560 - val_accuracy: 0.5833
Epoch 24/256
6/6 [==============================] - 0s 3ms/step - loss: 1.1210 - accuracy: 0.7292 - val_loss: 1.2413 - val_accuracy: 0.5833
Epoch 25/256
6/6 [==============================] - 0s 3ms/step - loss: 1.1015 - accuracy: 0.7344 - val_loss: 1.2280 - val_accuracy: 0.5833
Epoch 26/256
6/6 [==============================] - 0s 3ms/step - loss: 1.0832 - accuracy: 0.7500 - val_loss: 1.2159 - val_accuracy: 0.6250
Epoch 27/256
6/6 [==============================] - 0s 3ms/step - loss: 1.0649 - accuracy: 0.7552 - val_loss: 1.2024 - val_accuracy: 0.6250
Epoch 28/256
6/6 [==============================] - 0s 3ms/step - loss: 1.0471 - accuracy: 0.7500 - val_loss: 1.1875 - val_accuracy: 0.6250
Epoch 29/256
6/6 [==============================] - 0s 3ms/step - loss: 1.0303 - accuracy: 0.7552 - val_loss: 1.1744 - val_accuracy: 0.6667
Epoch 30/256
6/6 [==============================] - 0s 3ms/step - loss: 1.0138 - accuracy: 0.7552 - val_loss: 1.1602 - val_accuracy: 0.6667
Epoch 31/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9984 - accuracy: 0.7656 - val_loss: 1.1488 - val_accuracy: 0.6667
Epoch 32/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9830 - accuracy: 0.7656 - val_loss: 1.1367 - val_accuracy: 0.6667
Epoch 33/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9679 - accuracy: 0.7656 - val_loss: 1.1259 - val_accuracy: 0.6667
Epoch 34/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9524 - accuracy: 0.7656 - val_loss: 1.1130 - val_accuracy: 0.6667
Epoch 35/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9379 - accuracy: 0.7656 - val_loss: 1.1019 - val_accuracy: 0.7083
Epoch 36/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9235 - accuracy: 0.7760 - val_loss: 1.0898 - val_accuracy: 0.7500
Epoch 37/256
6/6 [==============================] - 0s 3ms/step - loss: 0.9090 - accuracy: 0.7760 - val_loss: 1.0771 - val_accuracy: 0.7500
Epoch 38/256
6/6 [==============================] - 0s 3ms/step - loss: 0.8953 - accuracy: 0.7760 - val_loss: 1.0657 - val_accuracy: 0.7500
Epoch 39/256
6/6 [==============================] - 0s 3ms/step - loss: 0.8818 - accuracy: 0.7760 - val_loss: 1.0553 - val_accuracy: 0.7500
Epoch 40/256
6/6 [==============================] - 0s 3ms/step - loss: 0.8671 - accuracy: 0.7812 - val_loss: 1.0414 - val_accuracy: 0.7917
Epoch 41/256
6/6 [==============================] - 0s 3ms/step - loss: 0.8524 - accuracy: 0.7917 - val_loss: 1.0304 - val_accuracy: 0.7917
Epoch 42/256
6/6 [==============================] - 0s 4ms/step - loss: 0.8385 - accuracy: 0.8073 - val_loss: 1.0192 - val_accuracy: 0.7917
Epoch 43/256
6/6 [==============================] - 0s 3ms/step - loss: 0.8229 - accuracy: 0.7969 - val_loss: 1.0082 - val_accuracy: 0.7917
Epoch 44/256
6/6 [==============================] - 0s 3ms/step - loss: 0.8082 - accuracy: 0.8073 - val_loss: 0.9980 - val_accuracy: 0.7917
Epoch 45/256
6/6 [==============================] - 0s 3ms/step - loss: 0.7920 - accuracy: 0.8125 - val_loss: 0.9847 - val_accuracy: 0.7500
Epoch 46/256
6/6 [==============================] - 0s 3ms/step - loss: 0.7765 - accuracy: 0.8177 - val_loss: 0.9717 - val_accuracy: 0.7917
Epoch 47/256
6/6 [==============================] - 0s 4ms/step - loss: 0.7594 - accuracy: 0.8281 - val_loss: 0.9539 - val_accuracy: 0.7917
Epoch 48/256
6/6 [==============================] - 0s 3ms/step - loss: 0.7403 - accuracy: 0.8594 - val_loss: 0.9299 - val_accuracy: 0.8333
Epoch 49/256
6/6 [==============================] - 0s 3ms/step - loss: 0.7192 - accuracy: 0.8542 - val_loss: 0.9059 - val_accuracy: 0.8333
Epoch 50/256
6/6 [==============================] - 0s 3ms/step - loss: 0.6986 - accuracy: 0.8542 - val_loss: 0.8799 - val_accuracy: 0.8333
Epoch 51/256
6/6 [==============================] - 0s 3ms/step - loss: 0.6798 - accuracy: 0.8646 - val_loss: 0.8550 - val_accuracy: 0.8333
Epoch 52/256
6/6 [==============================] - 0s 3ms/step - loss: 0.6575 - accuracy: 0.8802 - val_loss: 0.8279 - val_accuracy: 0.8750
Epoch 53/256
6/6 [==============================] - 0s 3ms/step - loss: 0.6377 - accuracy: 0.8802 - val_loss: 0.7941 - val_accuracy: 0.8750
Epoch 54/256
6/6 [==============================] - 0s 3ms/step - loss: 0.6121 - accuracy: 0.8802 - val_loss: 0.7632 - val_accuracy: 0.8750
Epoch 55/256
6/6 [==============================] - 0s 3ms/step - loss: 0.5869 - accuracy: 0.8906 - val_loss: 0.7343 - val_accuracy: 0.8750
Epoch 56/256
6/6 [==============================] - 0s 3ms/step - loss: 0.5639 - accuracy: 0.9479 - val_loss: 0.7082 - val_accuracy: 0.8750
Epoch 57/256
6/6 [==============================] - 0s 3ms/step - loss: 0.5391 - accuracy: 0.9583 - val_loss: 0.6806 - val_accuracy: 0.8750
Epoch 58/256
6/6 [==============================] - 0s 3ms/step - loss: 0.5173 - accuracy: 0.9531 - val_loss: 0.6539 - val_accuracy: 0.9167
Epoch 59/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4969 - accuracy: 0.9688 - val_loss: 0.6288 - val_accuracy: 0.9167
Epoch 60/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4769 - accuracy: 0.9688 - val_loss: 0.6094 - val_accuracy: 0.9167
Epoch 61/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4587 - accuracy: 0.9740 - val_loss: 0.5901 - val_accuracy: 0.9167
Epoch 62/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4407 - accuracy: 0.9740 - val_loss: 0.5707 - val_accuracy: 0.9583
Epoch 63/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4254 - accuracy: 0.9792 - val_loss: 0.5528 - val_accuracy: 0.9583
Epoch 64/256
6/6 [==============================] - 0s 3ms/step - loss: 0.4107 - accuracy: 0.9740 - val_loss: 0.5375 - val_accuracy: 0.9583
Epoch 65/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3970 - accuracy: 0.9792 - val_loss: 0.5226 - val_accuracy: 0.9583
Epoch 66/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3830 - accuracy: 0.9792 - val_loss: 0.5090 - val_accuracy: 0.9583
Epoch 67/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3722 - accuracy: 0.9740 - val_loss: 0.4979 - val_accuracy: 0.9583
Epoch 68/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3595 - accuracy: 0.9792 - val_loss: 0.4846 - val_accuracy: 0.9583
Epoch 69/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3494 - accuracy: 0.9844 - val_loss: 0.4736 - val_accuracy: 0.9583
Epoch 70/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3390 - accuracy: 0.9844 - val_loss: 0.4602 - val_accuracy: 0.9583
Epoch 71/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3297 - accuracy: 0.9844 - val_loss: 0.4524 - val_accuracy: 0.9583
Epoch 72/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3199 - accuracy: 0.9896 - val_loss: 0.4427 - val_accuracy: 0.9583
Epoch 73/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3122 - accuracy: 0.9896 - val_loss: 0.4320 - val_accuracy: 0.9583
Epoch 74/256
6/6 [==============================] - 0s 3ms/step - loss: 0.3037 - accuracy: 0.9844 - val_loss: 0.4242 - val_accuracy: 0.9583
Epoch 75/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2960 - accuracy: 0.9896 - val_loss: 0.4170 - val_accuracy: 0.9583
Epoch 76/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2890 - accuracy: 0.9896 - val_loss: 0.4099 - val_accuracy: 0.9583
Epoch 77/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2823 - accuracy: 0.9896 - val_loss: 0.4030 - val_accuracy: 0.9583
Epoch 78/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2757 - accuracy: 0.9896 - val_loss: 0.3963 - val_accuracy: 0.9583
Epoch 79/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2696 - accuracy: 0.9948 - val_loss: 0.3885 - val_accuracy: 0.9583
Epoch 80/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2638 - accuracy: 0.9844 - val_loss: 0.3820 - val_accuracy: 0.9583
Epoch 81/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2584 - accuracy: 0.9844 - val_loss: 0.3764 - val_accuracy: 0.9583
Epoch 82/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2529 - accuracy: 0.9948 - val_loss: 0.3707 - val_accuracy: 0.9583
Epoch 83/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2476 - accuracy: 0.9948 - val_loss: 0.3663 - val_accuracy: 0.9583
Epoch 84/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2430 - accuracy: 0.9896 - val_loss: 0.3627 - val_accuracy: 0.9583
Epoch 85/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2392 - accuracy: 0.9948 - val_loss: 0.3584 - val_accuracy: 0.9583
Epoch 86/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2339 - accuracy: 0.9948 - val_loss: 0.3499 - val_accuracy: 0.9583
Epoch 87/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2292 - accuracy: 0.9896 - val_loss: 0.3443 - val_accuracy: 0.9583
Epoch 88/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2254 - accuracy: 0.9844 - val_loss: 0.3404 - val_accuracy: 0.9583
Epoch 89/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2213 - accuracy: 0.9896 - val_loss: 0.3382 - val_accuracy: 0.9583
Epoch 90/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2168 - accuracy: 0.9948 - val_loss: 0.3347 - val_accuracy: 0.9583
Epoch 91/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2132 - accuracy: 0.9948 - val_loss: 0.3299 - val_accuracy: 0.9583
Epoch 92/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2094 - accuracy: 0.9948 - val_loss: 0.3258 - val_accuracy: 0.9583
Epoch 93/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2060 - accuracy: 0.9948 - val_loss: 0.3226 - val_accuracy: 0.9583
Epoch 94/256
6/6 [==============================] - 0s 3ms/step - loss: 0.2030 - accuracy: 0.9948 - val_loss: 0.3168 - val_accuracy: 0.9583
Epoch 95/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1991 - accuracy: 0.9948 - val_loss: 0.3125 - val_accuracy: 0.9583
Epoch 96/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1961 - accuracy: 0.9948 - val_loss: 0.3103 - val_accuracy: 0.9583
Epoch 97/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1931 - accuracy: 0.9948 - val_loss: 0.3068 - val_accuracy: 0.9583
Epoch 98/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1900 - accuracy: 0.9948 - val_loss: 0.3023 - val_accuracy: 0.9583
Epoch 99/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1867 - accuracy: 0.9948 - val_loss: 0.2974 - val_accuracy: 0.9583
Epoch 100/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1844 - accuracy: 0.9896 - val_loss: 0.2910 - val_accuracy: 0.9583
Epoch 101/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1818 - accuracy: 0.9948 - val_loss: 0.2887 - val_accuracy: 0.9583
Epoch 102/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1779 - accuracy: 0.9948 - val_loss: 0.2854 - val_accuracy: 0.9583
Epoch 103/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1760 - accuracy: 0.9948 - val_loss: 0.2814 - val_accuracy: 0.9583
Epoch 104/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1731 - accuracy: 0.9948 - val_loss: 0.2779 - val_accuracy: 0.9583
Epoch 105/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1704 - accuracy: 0.9896 - val_loss: 0.2737 - val_accuracy: 0.9583
Epoch 106/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1683 - accuracy: 0.9948 - val_loss: 0.2704 - val_accuracy: 0.9583
Epoch 107/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1651 - accuracy: 0.9948 - val_loss: 0.2689 - val_accuracy: 0.9583
Epoch 108/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1645 - accuracy: 0.9948 - val_loss: 0.2687 - val_accuracy: 0.9583
Epoch 109/256
6/6 [==============================] - 0s 4ms/step - loss: 0.1609 - accuracy: 0.9948 - val_loss: 0.2628 - val_accuracy: 0.9583
Epoch 110/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1578 - accuracy: 0.9948 - val_loss: 0.2573 - val_accuracy: 0.9583
Epoch 111/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1562 - accuracy: 0.9896 - val_loss: 0.2530 - val_accuracy: 0.9583
Epoch 112/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1546 - accuracy: 0.9948 - val_loss: 0.2510 - val_accuracy: 0.9583
Epoch 113/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1521 - accuracy: 0.9948 - val_loss: 0.2492 - val_accuracy: 0.9583
Epoch 114/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1497 - accuracy: 0.9948 - val_loss: 0.2451 - val_accuracy: 0.9583
Epoch 115/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1476 - accuracy: 0.9948 - val_loss: 0.2422 - val_accuracy: 0.9583
Epoch 116/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1460 - accuracy: 0.9948 - val_loss: 0.2404 - val_accuracy: 0.9583
Epoch 117/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1440 - accuracy: 0.9948 - val_loss: 0.2359 - val_accuracy: 0.9583
Epoch 118/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1419 - accuracy: 0.9948 - val_loss: 0.2328 - val_accuracy: 0.9583
Epoch 119/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1404 - accuracy: 0.9948 - val_loss: 0.2304 - val_accuracy: 0.9583
Epoch 120/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1384 - accuracy: 0.9948 - val_loss: 0.2260 - val_accuracy: 0.9583
Epoch 121/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1363 - accuracy: 0.9948 - val_loss: 0.2244 - val_accuracy: 0.9583
Epoch 122/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1356 - accuracy: 0.9948 - val_loss: 0.2236 - val_accuracy: 0.9583
Epoch 123/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1330 - accuracy: 0.9948 - val_loss: 0.2182 - val_accuracy: 0.9583
Epoch 124/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1315 - accuracy: 0.9896 - val_loss: 0.2144 - val_accuracy: 0.9583
Epoch 125/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1296 - accuracy: 0.9896 - val_loss: 0.2124 - val_accuracy: 0.9583
Epoch 126/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1276 - accuracy: 0.9948 - val_loss: 0.2108 - val_accuracy: 0.9583
Epoch 127/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1262 - accuracy: 0.9948 - val_loss: 0.2094 - val_accuracy: 0.9583
Epoch 128/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1253 - accuracy: 0.9948 - val_loss: 0.2077 - val_accuracy: 0.9583
Epoch 129/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1231 - accuracy: 0.9948 - val_loss: 0.2020 - val_accuracy: 0.9583
Epoch 130/256
6/6 [==============================] - 0s 4ms/step - loss: 0.1217 - accuracy: 0.9948 - val_loss: 0.1997 - val_accuracy: 0.9583
Epoch 131/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1199 - accuracy: 0.9948 - val_loss: 0.1967 - val_accuracy: 0.9583
Epoch 132/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1184 - accuracy: 0.9948 - val_loss: 0.1947 - val_accuracy: 0.9583
Epoch 133/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1169 - accuracy: 0.9948 - val_loss: 0.1924 - val_accuracy: 0.9583
Epoch 134/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1155 - accuracy: 0.9948 - val_loss: 0.1908 - val_accuracy: 0.9583
Epoch 135/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1143 - accuracy: 0.9948 - val_loss: 0.1870 - val_accuracy: 0.9583
Epoch 136/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1127 - accuracy: 0.9948 - val_loss: 0.1847 - val_accuracy: 0.9583
Epoch 137/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1115 - accuracy: 0.9948 - val_loss: 0.1827 - val_accuracy: 0.9583
Epoch 138/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1099 - accuracy: 0.9948 - val_loss: 0.1810 - val_accuracy: 0.9583
Epoch 139/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1085 - accuracy: 0.9948 - val_loss: 0.1792 - val_accuracy: 0.9583
Epoch 140/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1074 - accuracy: 0.9948 - val_loss: 0.1769 - val_accuracy: 0.9583
Epoch 141/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1062 - accuracy: 0.9948 - val_loss: 0.1733 - val_accuracy: 0.9583
Epoch 142/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1053 - accuracy: 0.9948 - val_loss: 0.1715 - val_accuracy: 0.9583
Epoch 143/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1038 - accuracy: 0.9948 - val_loss: 0.1692 - val_accuracy: 0.9583
Epoch 144/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1026 - accuracy: 0.9948 - val_loss: 0.1676 - val_accuracy: 0.9583
Epoch 145/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1017 - accuracy: 0.9948 - val_loss: 0.1661 - val_accuracy: 0.9583
Epoch 146/256
6/6 [==============================] - 0s 3ms/step - loss: 0.1003 - accuracy: 0.9948 - val_loss: 0.1633 - val_accuracy: 0.9583
Epoch 147/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0997 - accuracy: 0.9948 - val_loss: 0.1595 - val_accuracy: 0.9583
Epoch 148/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0979 - accuracy: 0.9948 - val_loss: 0.1587 - val_accuracy: 0.9583
Epoch 149/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0977 - accuracy: 0.9948 - val_loss: 0.1558 - val_accuracy: 0.9583
Epoch 150/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0955 - accuracy: 0.9948 - val_loss: 0.1555 - val_accuracy: 0.9583
Epoch 151/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0950 - accuracy: 0.9948 - val_loss: 0.1550 - val_accuracy: 0.9583
Epoch 152/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0943 - accuracy: 0.9948 - val_loss: 0.1524 - val_accuracy: 0.9583
Epoch 153/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0928 - accuracy: 0.9948 - val_loss: 0.1483 - val_accuracy: 0.9583
Epoch 154/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0920 - accuracy: 0.9948 - val_loss: 0.1474 - val_accuracy: 0.9583
Epoch 155/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0908 - accuracy: 0.9948 - val_loss: 0.1459 - val_accuracy: 0.9583
Epoch 156/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0900 - accuracy: 0.9948 - val_loss: 0.1445 - val_accuracy: 0.9583
Epoch 157/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0886 - accuracy: 0.9948 - val_loss: 0.1422 - val_accuracy: 0.9583
Epoch 158/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0878 - accuracy: 0.9948 - val_loss: 0.1404 - val_accuracy: 0.9583
Epoch 159/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0870 - accuracy: 0.9948 - val_loss: 0.1391 - val_accuracy: 0.9583
Epoch 160/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0858 - accuracy: 0.9948 - val_loss: 0.1370 - val_accuracy: 0.9583
Epoch 161/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0852 - accuracy: 0.9948 - val_loss: 0.1354 - val_accuracy: 0.9583
Epoch 162/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0844 - accuracy: 0.9948 - val_loss: 0.1344 - val_accuracy: 0.9583
Epoch 163/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0837 - accuracy: 0.9948 - val_loss: 0.1318 - val_accuracy: 0.9583
Epoch 164/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0825 - accuracy: 0.9948 - val_loss: 0.1306 - val_accuracy: 0.9583
Epoch 165/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0822 - accuracy: 0.9948 - val_loss: 0.1302 - val_accuracy: 0.9583
Epoch 166/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0811 - accuracy: 0.9948 - val_loss: 0.1272 - val_accuracy: 0.9583
Epoch 167/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0804 - accuracy: 0.9948 - val_loss: 0.1263 - val_accuracy: 0.9583
Epoch 168/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0795 - accuracy: 0.9948 - val_loss: 0.1244 - val_accuracy: 0.9583
Epoch 169/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0790 - accuracy: 0.9948 - val_loss: 0.1239 - val_accuracy: 0.9583
Epoch 170/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0778 - accuracy: 0.9948 - val_loss: 0.1223 - val_accuracy: 0.9583
Epoch 171/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0779 - accuracy: 0.9948 - val_loss: 0.1196 - val_accuracy: 0.9583
Epoch 172/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0766 - accuracy: 0.9948 - val_loss: 0.1189 - val_accuracy: 0.9583
Epoch 173/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0754 - accuracy: 0.9948 - val_loss: 0.1181 - val_accuracy: 0.9583
Epoch 174/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0751 - accuracy: 0.9948 - val_loss: 0.1172 - val_accuracy: 0.9583
Epoch 175/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0741 - accuracy: 0.9948 - val_loss: 0.1146 - val_accuracy: 1.0000
Epoch 176/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0739 - accuracy: 0.9948 - val_loss: 0.1140 - val_accuracy: 1.0000
Epoch 177/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0730 - accuracy: 0.9948 - val_loss: 0.1116 - val_accuracy: 1.0000
Epoch 178/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0725 - accuracy: 0.9948 - val_loss: 0.1108 - val_accuracy: 1.0000
Epoch 179/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0716 - accuracy: 0.9948 - val_loss: 0.1090 - val_accuracy: 1.0000
Epoch 180/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0708 - accuracy: 0.9948 - val_loss: 0.1086 - val_accuracy: 1.0000
Epoch 181/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0707 - accuracy: 0.9948 - val_loss: 0.1076 - val_accuracy: 1.0000
Epoch 182/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0697 - accuracy: 0.9948 - val_loss: 0.1058 - val_accuracy: 1.0000
Epoch 183/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0689 - accuracy: 0.9948 - val_loss: 0.1041 - val_accuracy: 1.0000
Epoch 184/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0684 - accuracy: 0.9948 - val_loss: 0.1033 - val_accuracy: 1.0000
Epoch 185/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0677 - accuracy: 0.9948 - val_loss: 0.1020 - val_accuracy: 1.0000
Epoch 186/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0672 - accuracy: 0.9948 - val_loss: 0.1004 - val_accuracy: 1.0000
Epoch 187/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0671 - accuracy: 0.9948 - val_loss: 0.0988 - val_accuracy: 1.0000
Epoch 188/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0662 - accuracy: 0.9948 - val_loss: 0.0987 - val_accuracy: 1.0000
Epoch 189/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0662 - accuracy: 0.9948 - val_loss: 0.0987 - val_accuracy: 1.0000
Epoch 190/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0651 - accuracy: 0.9948 - val_loss: 0.0955 - val_accuracy: 1.0000
Epoch 191/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0644 - accuracy: 0.9948 - val_loss: 0.0942 - val_accuracy: 1.0000
Epoch 192/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0644 - accuracy: 0.9948 - val_loss: 0.0941 - val_accuracy: 1.0000
Epoch 193/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0637 - accuracy: 0.9948 - val_loss: 0.0934 - val_accuracy: 1.0000
Epoch 194/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0635 - accuracy: 0.9948 - val_loss: 0.0914 - val_accuracy: 1.0000
Epoch 195/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0626 - accuracy: 0.9948 - val_loss: 0.0903 - val_accuracy: 1.0000
Epoch 196/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0620 - accuracy: 0.9948 - val_loss: 0.0905 - val_accuracy: 1.0000
Epoch 197/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0617 - accuracy: 0.9948 - val_loss: 0.0883 - val_accuracy: 1.0000
Epoch 198/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0608 - accuracy: 0.9948 - val_loss: 0.0874 - val_accuracy: 1.0000
Epoch 199/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0611 - accuracy: 0.9948 - val_loss: 0.0879 - val_accuracy: 1.0000
Epoch 200/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0598 - accuracy: 0.9948 - val_loss: 0.0857 - val_accuracy: 1.0000
Epoch 201/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0594 - accuracy: 0.9948 - val_loss: 0.0841 - val_accuracy: 1.0000
Epoch 202/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0591 - accuracy: 0.9948 - val_loss: 0.0833 - val_accuracy: 1.0000
Epoch 203/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0589 - accuracy: 0.9948 - val_loss: 0.0827 - val_accuracy: 1.0000
Epoch 204/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0578 - accuracy: 0.9948 - val_loss: 0.0829 - val_accuracy: 1.0000
Epoch 205/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0579 - accuracy: 0.9948 - val_loss: 0.0824 - val_accuracy: 1.0000
Epoch 206/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0582 - accuracy: 0.9948 - val_loss: 0.0802 - val_accuracy: 1.0000
Epoch 207/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0566 - accuracy: 0.9948 - val_loss: 0.0797 - val_accuracy: 1.0000
Epoch 208/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0565 - accuracy: 0.9948 - val_loss: 0.0804 - val_accuracy: 1.0000
Epoch 209/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0559 - accuracy: 0.9948 - val_loss: 0.0788 - val_accuracy: 1.0000
Epoch 210/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0557 - accuracy: 0.9948 - val_loss: 0.0774 - val_accuracy: 1.0000
Epoch 211/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0549 - accuracy: 0.9948 - val_loss: 0.0769 - val_accuracy: 1.0000
Epoch 212/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0545 - accuracy: 0.9948 - val_loss: 0.0766 - val_accuracy: 1.0000
Epoch 213/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0542 - accuracy: 0.9948 - val_loss: 0.0760 - val_accuracy: 1.0000
Epoch 214/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0543 - accuracy: 0.9948 - val_loss: 0.0752 - val_accuracy: 1.0000
Epoch 215/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0539 - accuracy: 0.9948 - val_loss: 0.0759 - val_accuracy: 1.0000
Epoch 216/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0531 - accuracy: 0.9948 - val_loss: 0.0744 - val_accuracy: 1.0000
Epoch 217/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0524 - accuracy: 0.9948 - val_loss: 0.0734 - val_accuracy: 1.0000
Epoch 218/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0523 - accuracy: 0.9948 - val_loss: 0.0726 - val_accuracy: 1.0000
Epoch 219/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0519 - accuracy: 0.9948 - val_loss: 0.0725 - val_accuracy: 1.0000
Epoch 220/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0505 - accuracy: 0.9948 - val_loss: 0.0722 - val_accuracy: 1.0000
Epoch 221/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0488 - accuracy: 1.0000 - val_loss: 0.0721 - val_accuracy: 1.0000
Epoch 222/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0484 - accuracy: 1.0000 - val_loss: 0.0710 - val_accuracy: 1.0000
Epoch 223/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0463 - accuracy: 1.0000 - val_loss: 0.0704 - val_accuracy: 1.0000
Epoch 224/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0464 - accuracy: 1.0000 - val_loss: 0.0711 - val_accuracy: 1.0000
Epoch 225/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0455 - accuracy: 1.0000 - val_loss: 0.0696 - val_accuracy: 1.0000
Epoch 226/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0445 - accuracy: 1.0000 - val_loss: 0.0692 - val_accuracy: 1.0000
Epoch 227/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0446 - accuracy: 1.0000 - val_loss: 0.0698 - val_accuracy: 1.0000
Epoch 228/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0436 - accuracy: 1.0000 - val_loss: 0.0692 - val_accuracy: 1.0000
Epoch 229/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0430 - accuracy: 1.0000 - val_loss: 0.0689 - val_accuracy: 1.0000
Epoch 230/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0428 - accuracy: 1.0000 - val_loss: 0.0689 - val_accuracy: 1.0000
Epoch 231/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0430 - accuracy: 1.0000 - val_loss: 0.0692 - val_accuracy: 1.0000
Epoch 232/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0417 - accuracy: 1.0000 - val_loss: 0.0685 - val_accuracy: 1.0000
Epoch 233/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0414 - accuracy: 1.0000 - val_loss: 0.0680 - val_accuracy: 1.0000
Epoch 234/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0414 - accuracy: 1.0000 - val_loss: 0.0679 - val_accuracy: 1.0000
Epoch 235/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0405 - accuracy: 1.0000 - val_loss: 0.0681 - val_accuracy: 1.0000
Epoch 236/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0405 - accuracy: 1.0000 - val_loss: 0.0681 - val_accuracy: 1.0000
Epoch 237/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0399 - accuracy: 1.0000 - val_loss: 0.0680 - val_accuracy: 1.0000
Epoch 238/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0395 - accuracy: 1.0000 - val_loss: 0.0669 - val_accuracy: 1.0000
Epoch 239/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0393 - accuracy: 1.0000 - val_loss: 0.0664 - val_accuracy: 1.0000
Epoch 240/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0398 - accuracy: 1.0000 - val_loss: 0.0660 - val_accuracy: 1.0000
Epoch 241/256
6/6 [==============================] - 0s 4ms/step - loss: 0.0385 - accuracy: 1.0000 - val_loss: 0.0657 - val_accuracy: 1.0000
Epoch 242/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0384 - accuracy: 1.0000 - val_loss: 0.0651 - val_accuracy: 1.0000
Epoch 243/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0376 - accuracy: 1.0000 - val_loss: 0.0646 - val_accuracy: 1.0000
Epoch 244/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0373 - accuracy: 1.0000 - val_loss: 0.0639 - val_accuracy: 1.0000
Epoch 245/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0370 - accuracy: 1.0000 - val_loss: 0.0632 - val_accuracy: 1.0000
Epoch 246/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0367 - accuracy: 1.0000 - val_loss: 0.0624 - val_accuracy: 1.0000
Epoch 247/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0367 - accuracy: 1.0000 - val_loss: 0.0621 - val_accuracy: 1.0000
Epoch 248/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0362 - accuracy: 1.0000 - val_loss: 0.0615 - val_accuracy: 1.0000
Epoch 249/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0358 - accuracy: 1.0000 - val_loss: 0.0611 - val_accuracy: 1.0000
Epoch 250/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0356 - accuracy: 1.0000 - val_loss: 0.0605 - val_accuracy: 1.0000
Epoch 251/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0358 - accuracy: 1.0000 - val_loss: 0.0606 - val_accuracy: 1.0000
Epoch 252/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0349 - accuracy: 1.0000 - val_loss: 0.0597 - val_accuracy: 1.0000
Epoch 253/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0350 - accuracy: 1.0000 - val_loss: 0.0590 - val_accuracy: 1.0000
Epoch 254/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0344 - accuracy: 1.0000 - val_loss: 0.0585 - val_accuracy: 1.0000
Epoch 255/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0341 - accuracy: 1.0000 - val_loss: 0.0584 - val_accuracy: 1.0000
Epoch 256/256
6/6 [==============================] - 0s 3ms/step - loss: 0.0346 - accuracy: 1.0000 - val_loss: 0.0584 - val_accuracy: 1.0000
In [12]:
model.evaluate(Xtest, ytest)
1/1 [==============================] - 0s 861us/step - loss: 0.0413 - accuracy: 1.0000
Out[12]:
[0.041303601115942, 1.0]
In [13]:
cm = confusion_matrix(np.argmax(ytest.values, axis = 1), (np.argmax(model.predict(Xtest), axis = 1)))
cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]    # normalize each row to class-wise proportions

fig = plt.figure(figsize = (10, 10))
ax = fig.add_subplot(111)

for i in range(cm.shape[1]):
    for j in range(cm.shape[0]):
        if cm[i,j] > 0.8:
            clr = "white"
        else:
            clr = "black"
        ax.text(j, i, format(cm[i, j], '.2f'), horizontalalignment="center", color=clr)

_ = ax.imshow(cm, cmap=plt.cm.Blues)
ax.set_xticks(range(6))
ax.set_yticks(range(6))
ax.set_xticklabels(range(6))
ax.set_yticklabels(range(6))
plt.xlabel('Predicted')
plt.ylabel('True')
plt.show()

Plotting the metrics

In [14]:
def plot(history, variable, variable2):
    # plot the training and validation curves for a given metric against epochs
    plt.plot(range(len(history[variable])), history[variable])
    plt.plot(range(len(history[variable2])), history[variable2])
    plt.legend([variable, variable2])
    plt.title(variable)
In [15]:
plot(history.history, "loss", "val_loss")
In [16]:
plot(history.history, "accuracy", "val_accuracy")

Prediction

In [17]:
classes = ['Brown Dwarf', 'Red Dwarf', 'White Dwarf', 'Main Sequence', 'Supergiant', 'Hypergiant']
In [18]:
# pick a random sample from the test set
x = random.randint(0, len(Xtest) - 1)

output = model.predict(Xtest[x].reshape(1, -1))[0]
print("Predicted: ", classes[np.argmax(output)])   
print("Probability: ", output[np.argmax(output)])

print("True: ", classes[np.argmax(ytest.values[x])])
Predicted:  Red Dwarf
Probability:  0.99529725
True:  Red Dwarf

deepC

In [19]:
model.save('star.h5')

!deepCC star.h5
[INFO]
Reading [keras model] 'star.h5'
[SUCCESS]
Saved 'star_deepC/star.onnx'
[INFO]
Reading [onnx model] 'star_deepC/star.onnx'
[INFO]
Model info:
  ir_vesion : 4
  doc       : 
[WARNING]
[ONNX]: terminal (input/output) dense_input's shape is less than 1. Changing it to 1.
[WARNING]
[ONNX]: terminal (input/output) dense_2's shape is less than 1. Changing it to 1.
WARN (GRAPH): found operator node with the same name (dense_2) as io node.
[INFO]
Running DNNC graph sanity check ...
[SUCCESS]
Passed sanity check.
[INFO]
Writing C++ file 'star_deepC/star.cpp'
[INFO]
deepSea model files are ready in 'star_deepC/' 
[RUNNING COMMAND]
g++ -std=c++11 -O3 -fno-rtti -fno-exceptions -I. -I/opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/include -isystem /opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/packages/eigen-eigen-323c052e1731 "star_deepC/star.cpp" -D_AITS_MAIN -o "star_deepC/star.exe"
[RUNNING COMMAND]
size "star_deepC/star.exe"
   text	   data	    bss	    dec	    hex	filename
 122817	   2584	    760	 126161	  1ecd1	star_deepC/star.exe
[SUCCESS]
Saved model as executable "star_deepC/star.exe"
In [20]:
x = random.randint(0, len(Xtest) - 1)
print(x)
np.savetxt('sample.data', Xtest[x])    # xth sample into text file

# run exe with input
!star_deepC/star.exe sample.data

# show predicted output
nn_out = np.loadtxt('deepSea_result_1.out')
print("Model output: ", model.predict(Xtest[x].reshape(1, -1))[0])
print("deepC output:",nn_out)

print("Predicted: ", classes[np.argmax(nn_out)]) 
print("Probability: ", nn_out[np.argmax(nn_out)])

print("True: ", classes[np.argmax(ytest.values[x])])
9
writing file deepSea_result_1.out.
Model output:  [9.4219571e-01 5.6600075e-02 1.1520842e-03 1.9398474e-06 4.8289985e-05
 1.8597848e-06]
deepC output: [0. 0. 0. 0. 1. 0.]
Predicted:  Supergiant
Probability:  1.0
True:  Brown Dwarf