
IoT based Gesture Recognition

Created by Cainvas Scholars:

  1. Ayisha Ryhana D ( GitHub | LinkedIn )
  2. Dheeraj Perumandla ( GitHub | LinkedIn )

iot_gesture

Photo by Marco Coppeto on Dribbble and Matt Harvey on Vimeo

Gestures are mapped to the corresponding accelerometer and gyroscope values recorded during motion. Here, we have accelerometer and gyroscope values along the x, y and z axes, recorded 100 times per gesture, i.e., 600 data points for one gesture.
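
For reference, here is a minimal sketch (using a synthetic sample, not a row from the actual CSV) of how one 600-value gesture row maps back to 100 consecutive readings of the six sensor channels:

import numpy as np

# One gesture = 100 readings x 6 channels (ax, ay, az, gx, gy, gz), stored as a flat row.
sample = np.random.randn(600)       # synthetic placeholder for one recorded gesture
readings = sample.reshape(100, 6)   # back to 100 time steps of (ax, ay, az, gx, gy, gz)
print(readings.shape)               # (100, 6)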

Dataset: The sensor values were recorded using an app (github link here) and consolidated into a CSV file.

Algorithm: A TensorFlow deep learning model with ReLU and softmax activations was used.

gesture

Photo by Adrien King on Unsplash

In [1]:
import pandas as pd
import csv
import os
import tensorflow as tf
import keras
from sklearn.preprocessing import normalize, OneHotEncoder
import numpy as np
import matplotlib.pyplot as plt
import urllib

List of gestures used

In [2]:
gestures = ['down_to_up', 'forward_clockwise', 'left_fall', 'up_clockwise',
            'up_anticlockwise', 'left_to_right', 'right_to_left', 'forward_fall']
In [3]:
df = pd.read_csv('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/gesture60_8_.csv')

Normalize and shuffle

In [4]:
df[df.columns[:-1]] = normalize(df[df.columns[:-1]])
df = df.sample(frac=1, random_state=13).reset_index(drop = True)
df
Out[4]:
1 2 3 4 5 6 7 8 9 10 ... 592 593 594 595 596 597 598 599 600 Gesture
0 0.003022 0.093937 0.011679 -0.000629 0.000282 -0.000210 0.002838 0.093238 0.010267 -0.000629 ... 0.024808 0.002097 -0.001888 0.000016 0.090347 -0.011931 0.024808 0.002097 -0.001888 forward_fall
1 0.000798 0.096556 0.020467 -0.000109 -0.000561 -0.001020 0.000401 0.095487 0.020298 -0.000109 ... 0.000335 -0.000426 -0.000641 0.004812 0.097926 0.013797 0.000335 -0.000426 -0.000641 right_to_left
2 0.002546 0.090785 0.027262 0.000192 0.000018 -0.000177 0.004073 0.091596 0.029648 0.000192 ... 0.000269 0.000499 -0.000634 0.003202 0.099697 -0.008268 -0.000968 -0.000035 -0.000430 down_to_up
3 0.017293 0.095955 0.019115 -0.000139 0.000187 -0.000324 0.017862 0.096802 0.019986 -0.000139 ... -0.001582 0.000178 -0.000643 -0.007194 0.099899 0.022321 -0.001582 0.000178 -0.000643 left_to_right
4 -0.000055 0.095312 0.020812 -0.000233 0.000047 0.000046 -0.000883 0.095468 0.022240 -0.000233 ... 0.000294 -0.001089 -0.000150 -0.000919 0.089743 -0.005857 0.000294 -0.001089 -0.000150 down_to_up
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
472 0.012845 0.098365 0.013678 -0.000165 -0.000197 0.000305 0.012250 0.099106 0.013836 -0.000165 ... 0.000217 0.000280 -0.000531 -0.003093 0.097247 0.024952 0.000217 0.000280 -0.000531 left_to_right
473 -0.002079 0.080596 0.006062 0.000606 0.000420 -0.000667 -0.002528 0.080146 0.005661 0.000606 ... 0.000380 0.000523 0.001311 0.000933 0.080889 0.000381 0.000380 0.000523 0.001311 left_fall
474 0.011863 0.095148 0.028367 -0.000533 -0.000350 -0.000925 0.011118 0.095004 0.028283 -0.000533 ... -0.001097 0.001230 -0.003628 -0.007168 0.095688 0.020857 -0.001097 0.001230 -0.003628 right_to_left
475 0.007047 0.093727 0.008762 0.000426 0.000409 -0.000170 0.006576 0.094048 0.008969 0.000426 ... 0.016682 -0.000295 -0.002204 0.006072 0.094140 -0.005092 0.001702 0.000295 -0.000190 forward_fall
476 0.000861 0.093104 0.010468 -0.000235 -0.000126 0.000363 0.001809 0.093196 0.009577 -0.000235 ... -0.003521 -0.001342 -0.004736 0.002734 0.095672 0.019548 -0.003521 -0.001342 -0.004736 forward_clockwise

477 rows × 601 columns
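
A side note on the normalization step: scikit-learn's normalize defaults to row-wise L2 normalization, so every 600-value gesture sample is scaled to unit length. A tiny sketch with made-up numbers illustrates this:

import numpy as np
from sklearn.preprocessing import normalize

row = np.array([[3.0, 4.0, 0.0]])     # toy row, not real sensor data
print(normalize(row))                 # [[0.6 0.8 0. ]]
print(row / np.linalg.norm(row))      # same result, computed by hand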

Visualization

In [5]:
data = pd.DataFrame(df)
In [6]:
col = ['ax','ay','az','gx','gy','gz']*100
col.append('Gesture')
data.columns = col
In [7]:
def extract(gesture, data_dict):
    # Number of samples recorded for this gesture
    k = data[data["Gesture"]==gesture].shape[0]
    for j in range(0,k):
        # First 6-column chunk (one ax..gz reading) of the j-th sample
        if j == 0:
            data_dict[gesture] = data[data['Gesture']==gesture].iloc[0:1,0:6]
        else:
            data_dict[gesture] = data_dict[gesture].append(data[data['Gesture']==gesture].iloc[j:j+1,0:6])
        # Remaining 99 readings of the j-th sample, taken 6 columns at a time
        for i in range(6,600,6):
            data_dict[gesture] = data_dict[gesture].append(data[data['Gesture']==gesture].iloc[j:j+1,i:i+6])
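
As a possible alternative, the same per-gesture table can be built without the nested loops by reshaping each 600-value row to (100, 6). This is only a sketch that assumes the data frame and imports from the cells above; extract_fast is not part of the original notebook. The original row index is repeated so the outlier-removal step further below still works:

def extract_fast(gesture, data_dict):
    subset = data[data['Gesture'] == gesture]
    # Reshape every matching 600-value row into 100 readings of 6 channels and stack them
    values = subset.iloc[:, :600].to_numpy(dtype='float32').reshape(-1, 6)
    index = np.repeat(subset.index.to_numpy(), 100)   # keep the source row index of each reading
    data_dict[gesture] = pd.DataFrame(values,
                                      columns=['ax', 'ay', 'az', 'gx', 'gy', 'gz'],
                                      index=index)
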
In [8]:
data_dict = {}
for i in gestures:
    extract(i,data_dict)
    #print(data_dict[i].head())
#data_dict
In [9]:
def visualize(type, gesture):
    # Plot the chosen sensor type ('acceleration' or 'gyro') for every gesture;
    # the 'gesture' argument is unused here since all gestures are plotted.
    for i in gestures:
        title = type.upper()+' for \"'+i.upper()+'\" gesture'
        plt.title(title)
        if type == 'acceleration':
            index = range(1, len(data_dict[i]['ax']) + 1)
            plt.plot(index, data_dict[i]['ax'], 'g.', label='x', linestyle='solid', marker=',')
            plt.plot(index, data_dict[i]['ay'], 'b.', label='y', linestyle='solid', marker=',')
            plt.plot(index, data_dict[i]['az'], 'r.', label='z', linestyle='solid', marker=',')

        if type == 'gyro':
            index = range(1, len(data_dict[i]['ax']) + 1)
            plt.plot(index, data_dict[i]['gx'], 'g.', label='x', linestyle='solid', marker=',')
            plt.plot(index, data_dict[i]['gy'], 'b.', label='y', linestyle='solid', marker=',')
            plt.plot(index, data_dict[i]['gz'], 'r.', label='z', linestyle='solid', marker=',')

        plt.legend()
        plt.show()
In [10]:
visualize('gyro', gestures)
In [11]:
visualize('acceleration', gestures)

Acceleration plot of 'left_to_right' has an outlier.

Identifying and removing it.

In [12]:
outlier = data_dict['left_to_right'][data_dict['left_to_right']['ax'] > 0.06]
print(outlier)

outlier_index = outlier.index.unique().values

df = df.drop(outlier_index).reset_index(drop = True)
           ax        ay        az        gx        gy        gz
421  0.061794  0.082124  0.010391  0.000595  0.000707 -0.000647
421  0.073243  0.082884  0.017196  0.000595  0.000707 -0.000647
421  0.086864  0.083204  0.017766  0.000595  0.000707 -0.000647
421  0.085522  0.082052  0.012932  0.000595  0.000707 -0.000647
421  0.085581  0.077136  0.009168  0.000595  0.000707 -0.000647
421  0.085581  0.077136  0.009168 -0.000697 -0.000800  0.006152
421  0.088123  0.084677  0.007743 -0.000697 -0.000800  0.006152
421  0.084608  0.080722  0.012517 -0.000697 -0.000800  0.006152
421  0.080439  0.075770  0.015783 -0.000697 -0.000800  0.006152
421  0.079596  0.072789  0.021483 -0.000697 -0.000800  0.006152
421  0.078682  0.078228  0.029547 -0.000697 -0.000800  0.006152
421  0.067530  0.080188  0.034677 -0.000697 -0.000800  0.006152
In [13]:
df
Out[13]:
ax ay az gx gy gz ax ay az gx ... gx gy gz ax ay az gx gy gz Gesture
0 0.003022 0.093937 0.011679 -0.000629 0.000282 -0.000210 0.002838 0.093238 0.010267 -0.000629 ... 0.024808 0.002097 -0.001888 0.000016 0.090347 -0.011931 0.024808 0.002097 -0.001888 forward_fall
1 0.000798 0.096556 0.020467 -0.000109 -0.000561 -0.001020 0.000401 0.095487 0.020298 -0.000109 ... 0.000335 -0.000426 -0.000641 0.004812 0.097926 0.013797 0.000335 -0.000426 -0.000641 right_to_left
2 0.002546 0.090785 0.027262 0.000192 0.000018 -0.000177 0.004073 0.091596 0.029648 0.000192 ... 0.000269 0.000499 -0.000634 0.003202 0.099697 -0.008268 -0.000968 -0.000035 -0.000430 down_to_up
3 0.017293 0.095955 0.019115 -0.000139 0.000187 -0.000324 0.017862 0.096802 0.019986 -0.000139 ... -0.001582 0.000178 -0.000643 -0.007194 0.099899 0.022321 -0.001582 0.000178 -0.000643 left_to_right
4 -0.000055 0.095312 0.020812 -0.000233 0.000047 0.000046 -0.000883 0.095468 0.022240 -0.000233 ... 0.000294 -0.001089 -0.000150 -0.000919 0.089743 -0.005857 0.000294 -0.001089 -0.000150 down_to_up
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
471 0.012845 0.098365 0.013678 -0.000165 -0.000197 0.000305 0.012250 0.099106 0.013836 -0.000165 ... 0.000217 0.000280 -0.000531 -0.003093 0.097247 0.024952 0.000217 0.000280 -0.000531 left_to_right
472 -0.002079 0.080596 0.006062 0.000606 0.000420 -0.000667 -0.002528 0.080146 0.005661 0.000606 ... 0.000380 0.000523 0.001311 0.000933 0.080889 0.000381 0.000380 0.000523 0.001311 left_fall
473 0.011863 0.095148 0.028367 -0.000533 -0.000350 -0.000925 0.011118 0.095004 0.028283 -0.000533 ... -0.001097 0.001230 -0.003628 -0.007168 0.095688 0.020857 -0.001097 0.001230 -0.003628 right_to_left
474 0.007047 0.093727 0.008762 0.000426 0.000409 -0.000170 0.006576 0.094048 0.008969 0.000426 ... 0.016682 -0.000295 -0.002204 0.006072 0.094140 -0.005092 0.001702 0.000295 -0.000190 forward_fall
475 0.000861 0.093104 0.010468 -0.000235 -0.000126 0.000363 0.001809 0.093196 0.009577 -0.000235 ... -0.003521 -0.001342 -0.004736 0.002734 0.095672 0.019548 -0.003521 -0.001342 -0.004736 forward_clockwise

476 rows × 601 columns

In [14]:
df.to_csv('gesture60_8_normalized.csv', index = False)

One-hot encoding the labels

In [15]:
y_df = pd.get_dummies(df.Gesture)[gestures]
print(y_df)
print(np.array(y_df)[0].shape)
     down_to_up  forward_clockwise  left_fall  up_clockwise  up_anticlockwise  \
0             0                  0          0             0                 0   
1             0                  0          0             0                 0   
2             1                  0          0             0                 0   
3             0                  0          0             0                 0   
4             1                  0          0             0                 0   
..          ...                ...        ...           ...               ...   
471           0                  0          0             0                 0   
472           0                  0          1             0                 0   
473           0                  0          0             0                 0   
474           0                  0          0             0                 0   
475           0                  1          0             0                 0   

     left_to_right  right_to_left  forward_fall  
0                0              0             1  
1                0              1             0  
2                0              0             0  
3                1              0             0  
4                0              0             0  
..             ...            ...           ...  
471              1              0             0  
472              0              0             0  
473              0              1             0  
474              0              0             1  
475              0              0             0  

[476 rows x 8 columns]
(8,)
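
For completeness, the OneHotEncoder imported in the first cell could produce the same 8-column label matrix as pd.get_dummies. A sketch (column order pinned to the gestures list, matching the reindexing above):

enc = OneHotEncoder(categories=[gestures], sparse=False)   # use sparse_output=False on newer scikit-learn
y_onehot = enc.fit_transform(df[['Gesture']])
print(y_onehot.shape)                                      # (476, 8)
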
In [16]:
df = pd.concat([df, y_df], axis=1)
df.head()
Out[16]:
ax ay az gx gy gz ax ay az gx ... gz Gesture down_to_up forward_clockwise left_fall up_clockwise up_anticlockwise left_to_right right_to_left forward_fall
0 0.003022 0.093937 0.011679 -0.000629 0.000282 -0.000210 0.002838 0.093238 0.010267 -0.000629 ... -0.001888 forward_fall 0 0 0 0 0 0 0 1
1 0.000798 0.096556 0.020467 -0.000109 -0.000561 -0.001020 0.000401 0.095487 0.020298 -0.000109 ... -0.000641 right_to_left 0 0 0 0 0 0 1 0
2 0.002546 0.090785 0.027262 0.000192 0.000018 -0.000177 0.004073 0.091596 0.029648 0.000192 ... -0.000430 down_to_up 1 0 0 0 0 0 0 0
3 0.017293 0.095955 0.019115 -0.000139 0.000187 -0.000324 0.017862 0.096802 0.019986 -0.000139 ... -0.000643 left_to_right 0 0 0 0 0 1 0 0
4 -0.000055 0.095312 0.020812 -0.000233 0.000047 0.000046 -0.000883 0.095468 0.022240 -0.000233 ... -0.000150 down_to_up 1 0 0 0 0 0 0 0

5 rows × 609 columns

In [17]:
train_count = int(0.8*len(df))
test_count = len(df) - train_count
print(train_count, test_count)

x_train, y_train = np.array(df)[:train_count, :600].astype('float32'), np.array(df)[:train_count, -(len(gestures)):].astype('int64')
x_test, y_test = np.array(df)[train_count:, :600].astype('float32'), np.array(df)[train_count:, -(len(gestures)):].astype('int64')

print(x_train.shape, y_train.shape, x_test.shape, y_test.shape)
380 96
(380, 600) (380, 8) (96, 600) (96, 8)
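
Note that the split above is a plain 80/20 index split and relies on the earlier shuffle for randomness. An alternative sketch with scikit-learn's train_test_split, stratified by gesture so each class keeps roughly the same proportion in train and test:

from sklearn.model_selection import train_test_split

X = np.array(df)[:, :600].astype('float32')
Y = np.array(df)[:, -len(gestures):].astype('int64')
x_tr, x_te, y_tr, y_te = train_test_split(X, Y, test_size=0.2,
                                          random_state=13,
                                          stratify=df['Gesture'])
print(x_tr.shape, x_te.shape)     # (380, 600) (96, 600)
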
In [18]:
x_train[0].shape
Out[18]:
(600,)

Model

In [19]:
model = tf.keras.models.Sequential()
model.add(tf.keras.layers.Dense(512, activation="relu", input_shape = (600,)))
model.add(tf.keras.layers.Dense(128, activation="relu"))
model.add(tf.keras.layers.Dense(64, activation="relu"))
model.add(tf.keras.layers.Dense(len(gestures), activation="softmax"))
In [20]:
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 512)               307712    
_________________________________________________________________
dense_1 (Dense)              (None, 128)               65664     
_________________________________________________________________
dense_2 (Dense)              (None, 64)                8256      
_________________________________________________________________
dense_3 (Dense)              (None, 8)                 520       
=================================================================
Total params: 382,152
Trainable params: 382,152
Non-trainable params: 0
_________________________________________________________________
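
The parameter counts in the summary follow directly from the layer sizes: a Dense layer has (inputs + 1) × units parameters (weights plus biases). A quick check:

layer_sizes = [(600, 512), (512, 128), (128, 64), (64, 8)]
counts = [(i + 1) * u for i, u in layer_sizes]
print(counts, sum(counts))    # [307712, 65664, 8256, 520] 382152
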
In [21]:
model.compile(optimizer = 'adam', loss = 'categorical_crossentropy', metrics = ['accuracy'])
In [22]:
model.fit(x_train, y_train, epochs = 16, batch_size = 8)
Epoch 1/16
48/48 [==============================] - 0s 2ms/step - loss: 1.5460 - accuracy: 0.5000
Epoch 2/16
48/48 [==============================] - 0s 2ms/step - loss: 0.4585 - accuracy: 0.9158
Epoch 3/16
48/48 [==============================] - 0s 2ms/step - loss: 0.1810 - accuracy: 0.9605
Epoch 4/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0805 - accuracy: 0.9816
Epoch 5/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0594 - accuracy: 0.9895
Epoch 6/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0299 - accuracy: 0.9947
Epoch 7/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0186 - accuracy: 0.9974
Epoch 8/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0181 - accuracy: 0.9974
Epoch 9/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0098 - accuracy: 1.0000
Epoch 10/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0062 - accuracy: 1.0000
Epoch 11/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0042 - accuracy: 1.0000
Epoch 12/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0030 - accuracy: 1.0000
Epoch 13/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0025 - accuracy: 1.0000
Epoch 14/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0023 - accuracy: 1.0000
Epoch 15/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0026 - accuracy: 1.0000
Epoch 16/16
48/48 [==============================] - 0s 2ms/step - loss: 0.0017 - accuracy: 1.0000
Out[22]:
<tensorflow.python.keras.callbacks.History at 0x7f991d26cc88>
In [23]:
model.evaluate(x_test, y_test)
3/3 [==============================] - 0s 1ms/step - loss: 0.1697 - accuracy: 0.9479
Out[23]:
[0.16967159509658813, 0.9479166865348816]

Tiny model

In [24]:
model = tf.keras.models.Sequential()
model.add(tf.keras.layers.Dense(16, activation="relu", input_shape = (600,)))
model.add(tf.keras.layers.Dense(8, activation="relu"))
model.add(tf.keras.layers.Dense(4, activation="relu"))
model.add(tf.keras.layers.Dense(len(gestures), activation="softmax"))

model.compile(optimizer = 'adam', loss = 'categorical_crossentropy', metrics = ['accuracy'])

model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_4 (Dense)              (None, 16)                9616      
_________________________________________________________________
dense_5 (Dense)              (None, 8)                 136       
_________________________________________________________________
dense_6 (Dense)              (None, 4)                 36        
_________________________________________________________________
dense_7 (Dense)              (None, 8)                 40        
=================================================================
Total params: 9,828
Trainable params: 9,828
Non-trainable params: 0
_________________________________________________________________
In [25]:
model.fit(x_train, y_train, epochs = 256, batch_size = 32)
print("\n Test set")
model.evaluate(x_test, y_test)
Epoch 1/256
12/12 [==============================] - 0s 1ms/step - loss: 2.0703 - accuracy: 0.1447
Epoch 2/256
12/12 [==============================] - 0s 1ms/step - loss: 2.0485 - accuracy: 0.1447
Epoch 3/256
12/12 [==============================] - 0s 1ms/step - loss: 2.0189 - accuracy: 0.2947
Epoch 4/256
12/12 [==============================] - 0s 2ms/step - loss: 1.9775 - accuracy: 0.3553
Epoch 5/256
12/12 [==============================] - 0s 1ms/step - loss: 1.9334 - accuracy: 0.3632
Epoch 6/256
12/12 [==============================] - 0s 1ms/step - loss: 1.8841 - accuracy: 0.3684
Epoch 7/256
12/12 [==============================] - 0s 1ms/step - loss: 1.8323 - accuracy: 0.3632
Epoch 8/256
12/12 [==============================] - 0s 1ms/step - loss: 1.7783 - accuracy: 0.3658
Epoch 9/256
12/12 [==============================] - 0s 998us/step - loss: 1.7234 - accuracy: 0.3632
Epoch 10/256
12/12 [==============================] - 0s 1ms/step - loss: 1.6639 - accuracy: 0.3658
Epoch 11/256
12/12 [==============================] - 0s 1ms/step - loss: 1.6045 - accuracy: 0.3658
Epoch 12/256
12/12 [==============================] - 0s 1ms/step - loss: 1.5424 - accuracy: 0.3711
Epoch 13/256
12/12 [==============================] - 0s 1ms/step - loss: 1.4848 - accuracy: 0.4184
Epoch 14/256
12/12 [==============================] - 0s 1ms/step - loss: 1.4298 - accuracy: 0.4474
Epoch 15/256
12/12 [==============================] - 0s 1ms/step - loss: 1.3791 - accuracy: 0.4474
Epoch 16/256
12/12 [==============================] - 0s 1ms/step - loss: 1.3292 - accuracy: 0.4658
Epoch 17/256
12/12 [==============================] - 0s 981us/step - loss: 1.2822 - accuracy: 0.4711
Epoch 18/256
12/12 [==============================] - 0s 989us/step - loss: 1.2405 - accuracy: 0.4789
Epoch 19/256
12/12 [==============================] - 0s 1ms/step - loss: 1.2002 - accuracy: 0.4842
Epoch 20/256
12/12 [==============================] - 0s 999us/step - loss: 1.1642 - accuracy: 0.4868
Epoch 21/256
12/12 [==============================] - 0s 1ms/step - loss: 1.1302 - accuracy: 0.4895
Epoch 22/256
12/12 [==============================] - 0s 1ms/step - loss: 1.0973 - accuracy: 0.4921
Epoch 23/256
12/12 [==============================] - 0s 1ms/step - loss: 1.0699 - accuracy: 0.4921
Epoch 24/256
12/12 [==============================] - 0s 1ms/step - loss: 1.0368 - accuracy: 0.4974
Epoch 25/256
12/12 [==============================] - 0s 1ms/step - loss: 1.0118 - accuracy: 0.4947
Epoch 26/256
12/12 [==============================] - 0s 1ms/step - loss: 0.9845 - accuracy: 0.5132
Epoch 27/256
12/12 [==============================] - 0s 1ms/step - loss: 0.9604 - accuracy: 0.5184
Epoch 28/256
12/12 [==============================] - 0s 1ms/step - loss: 0.9391 - accuracy: 0.5421
Epoch 29/256
12/12 [==============================] - 0s 1ms/step - loss: 0.9126 - accuracy: 0.5474
Epoch 30/256
12/12 [==============================] - 0s 1ms/step - loss: 0.8902 - accuracy: 0.5447
Epoch 31/256
12/12 [==============================] - 0s 1ms/step - loss: 0.8695 - accuracy: 0.5579
Epoch 32/256
12/12 [==============================] - 0s 1ms/step - loss: 0.8506 - accuracy: 0.6132
Epoch 33/256
12/12 [==============================] - 0s 1ms/step - loss: 0.8281 - accuracy: 0.6184
Epoch 34/256
12/12 [==============================] - 0s 1ms/step - loss: 0.8064 - accuracy: 0.6684
Epoch 35/256
12/12 [==============================] - 0s 1ms/step - loss: 0.7877 - accuracy: 0.6868
Epoch 36/256
12/12 [==============================] - 0s 1ms/step - loss: 0.7666 - accuracy: 0.6974
Epoch 37/256
12/12 [==============================] - 0s 1ms/step - loss: 0.7398 - accuracy: 0.7211
Epoch 38/256
12/12 [==============================] - 0s 1ms/step - loss: 0.7138 - accuracy: 0.7316
Epoch 39/256
12/12 [==============================] - 0s 1ms/step - loss: 0.6901 - accuracy: 0.7342
Epoch 40/256
12/12 [==============================] - 0s 1ms/step - loss: 0.6657 - accuracy: 0.7553
Epoch 41/256
12/12 [==============================] - 0s 1ms/step - loss: 0.6407 - accuracy: 0.7658
Epoch 42/256
12/12 [==============================] - 0s 1ms/step - loss: 0.6225 - accuracy: 0.7763
Epoch 43/256
12/12 [==============================] - 0s 1ms/step - loss: 0.5968 - accuracy: 0.8079
Epoch 44/256
12/12 [==============================] - 0s 1ms/step - loss: 0.5726 - accuracy: 0.8342
Epoch 45/256
12/12 [==============================] - 0s 1ms/step - loss: 0.5518 - accuracy: 0.8158
Epoch 46/256
12/12 [==============================] - 0s 1ms/step - loss: 0.5333 - accuracy: 0.8500
Epoch 47/256
12/12 [==============================] - 0s 1ms/step - loss: 0.5080 - accuracy: 0.8711
Epoch 48/256
12/12 [==============================] - 0s 1ms/step - loss: 0.4869 - accuracy: 0.8763
Epoch 49/256
12/12 [==============================] - 0s 1ms/step - loss: 0.4607 - accuracy: 0.8921
Epoch 50/256
12/12 [==============================] - 0s 1ms/step - loss: 0.4337 - accuracy: 0.9053
Epoch 51/256
12/12 [==============================] - 0s 1ms/step - loss: 0.4056 - accuracy: 0.9237
Epoch 52/256
12/12 [==============================] - 0s 1ms/step - loss: 0.3767 - accuracy: 0.9447
Epoch 53/256
12/12 [==============================] - 0s 1ms/step - loss: 0.3486 - accuracy: 0.9474
Epoch 54/256
12/12 [==============================] - 0s 1ms/step - loss: 0.3232 - accuracy: 0.9579
Epoch 55/256
12/12 [==============================] - 0s 1ms/step - loss: 0.3029 - accuracy: 0.9579
Epoch 56/256
12/12 [==============================] - 0s 1ms/step - loss: 0.2799 - accuracy: 0.9579
Epoch 57/256
12/12 [==============================] - 0s 1ms/step - loss: 0.2637 - accuracy: 0.9605
Epoch 58/256
12/12 [==============================] - 0s 1ms/step - loss: 0.2494 - accuracy: 0.9605
Epoch 59/256
12/12 [==============================] - 0s 1ms/step - loss: 0.2304 - accuracy: 0.9605
Epoch 60/256
12/12 [==============================] - 0s 1ms/step - loss: 0.2221 - accuracy: 0.9526
Epoch 61/256
12/12 [==============================] - 0s 1ms/step - loss: 0.2083 - accuracy: 0.9684
Epoch 62/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1957 - accuracy: 0.9684
Epoch 63/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1882 - accuracy: 0.9684
Epoch 64/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1775 - accuracy: 0.9763
Epoch 65/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1789 - accuracy: 0.9553
Epoch 66/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1643 - accuracy: 0.9711
Epoch 67/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1574 - accuracy: 0.9737
Epoch 68/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1526 - accuracy: 0.9737
Epoch 69/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1475 - accuracy: 0.9763
Epoch 70/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1467 - accuracy: 0.9632
Epoch 71/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1377 - accuracy: 0.9763
Epoch 72/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1311 - accuracy: 0.9816
Epoch 73/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1286 - accuracy: 0.9789
Epoch 74/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1232 - accuracy: 0.9816
Epoch 75/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1203 - accuracy: 0.9842
Epoch 76/256
12/12 [==============================] - 0s 965us/step - loss: 0.1178 - accuracy: 0.9868
Epoch 77/256
12/12 [==============================] - 0s 996us/step - loss: 0.1137 - accuracy: 0.9763
Epoch 78/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1128 - accuracy: 0.9842
Epoch 79/256
12/12 [==============================] - 0s 970us/step - loss: 0.1100 - accuracy: 0.9737
Epoch 80/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1024 - accuracy: 0.9868
Epoch 81/256
12/12 [==============================] - 0s 1ms/step - loss: 0.1014 - accuracy: 0.9842
Epoch 82/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0969 - accuracy: 0.9868
Epoch 83/256
12/12 [==============================] - 0s 952us/step - loss: 0.0964 - accuracy: 0.9842
Epoch 84/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0944 - accuracy: 0.9842
Epoch 85/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0891 - accuracy: 0.9921
Epoch 86/256
12/12 [==============================] - 0s 987us/step - loss: 0.0872 - accuracy: 0.9868
Epoch 87/256
12/12 [==============================] - 0s 971us/step - loss: 0.0843 - accuracy: 0.9868
Epoch 88/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0829 - accuracy: 0.9895
Epoch 89/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0833 - accuracy: 0.9921
Epoch 90/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0849 - accuracy: 0.9842
Epoch 91/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0762 - accuracy: 0.9974
Epoch 92/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0755 - accuracy: 0.9921
Epoch 93/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0721 - accuracy: 0.9947
Epoch 94/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0756 - accuracy: 0.9895
Epoch 95/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0686 - accuracy: 0.9947
Epoch 96/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0680 - accuracy: 0.9947
Epoch 97/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0660 - accuracy: 0.9947
Epoch 98/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0665 - accuracy: 0.9921
Epoch 99/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0632 - accuracy: 0.9974
Epoch 100/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0607 - accuracy: 0.9947
Epoch 101/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0611 - accuracy: 0.9921
Epoch 102/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0612 - accuracy: 0.9921
Epoch 103/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0594 - accuracy: 0.9921
Epoch 104/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0570 - accuracy: 0.9947
Epoch 105/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0537 - accuracy: 0.9947
Epoch 106/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0514 - accuracy: 0.9947
Epoch 107/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0497 - accuracy: 0.9947
Epoch 108/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0490 - accuracy: 0.9974
Epoch 109/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0480 - accuracy: 0.9974
Epoch 110/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0465 - accuracy: 0.9947
Epoch 111/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0448 - accuracy: 0.9974
Epoch 112/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0437 - accuracy: 1.0000
Epoch 113/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0432 - accuracy: 1.0000
Epoch 114/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0427 - accuracy: 0.9974
Epoch 115/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0410 - accuracy: 0.9974
Epoch 116/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0398 - accuracy: 1.0000
Epoch 117/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0400 - accuracy: 0.9974
Epoch 118/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0389 - accuracy: 0.9974
Epoch 119/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0373 - accuracy: 1.0000
Epoch 120/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0360 - accuracy: 1.0000
Epoch 121/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0350 - accuracy: 1.0000
Epoch 122/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0344 - accuracy: 0.9974
Epoch 123/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0338 - accuracy: 1.0000
Epoch 124/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0327 - accuracy: 1.0000
Epoch 125/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0323 - accuracy: 1.0000
Epoch 126/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0325 - accuracy: 0.9974
Epoch 127/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0330 - accuracy: 1.0000
Epoch 128/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0305 - accuracy: 1.0000
Epoch 129/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0289 - accuracy: 1.0000
Epoch 130/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0279 - accuracy: 1.0000
Epoch 131/256
12/12 [==============================] - 0s 996us/step - loss: 0.0271 - accuracy: 1.0000
Epoch 132/256
12/12 [==============================] - 0s 995us/step - loss: 0.0265 - accuracy: 1.0000
Epoch 133/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0257 - accuracy: 1.0000
Epoch 134/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0262 - accuracy: 1.0000
Epoch 135/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0251 - accuracy: 1.0000
Epoch 136/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0244 - accuracy: 1.0000
Epoch 137/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0237 - accuracy: 1.0000
Epoch 138/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0233 - accuracy: 1.0000
Epoch 139/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0225 - accuracy: 1.0000
Epoch 140/256
12/12 [==============================] - 0s 1000us/step - loss: 0.0220 - accuracy: 1.0000
Epoch 141/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0217 - accuracy: 1.0000
Epoch 142/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0210 - accuracy: 1.0000
Epoch 143/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0206 - accuracy: 1.0000
Epoch 144/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0201 - accuracy: 1.0000
Epoch 145/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0195 - accuracy: 1.0000
Epoch 146/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0194 - accuracy: 1.0000
Epoch 147/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0190 - accuracy: 1.0000
Epoch 148/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0179 - accuracy: 1.0000
Epoch 149/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0176 - accuracy: 1.0000
Epoch 150/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0171 - accuracy: 1.0000
Epoch 151/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0167 - accuracy: 1.0000
Epoch 152/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0163 - accuracy: 1.0000
Epoch 153/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0159 - accuracy: 1.0000
Epoch 154/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0158 - accuracy: 1.0000
Epoch 155/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0153 - accuracy: 1.0000
Epoch 156/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0153 - accuracy: 1.0000
Epoch 157/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0147 - accuracy: 1.0000
Epoch 158/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0143 - accuracy: 1.0000
Epoch 159/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0139 - accuracy: 1.0000
Epoch 160/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0137 - accuracy: 1.0000
Epoch 161/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0133 - accuracy: 1.0000
Epoch 162/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0128 - accuracy: 1.0000
Epoch 163/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0125 - accuracy: 1.0000
Epoch 164/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0122 - accuracy: 1.0000
Epoch 165/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0122 - accuracy: 1.0000
Epoch 166/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0118 - accuracy: 1.0000
Epoch 167/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0115 - accuracy: 1.0000
Epoch 168/256
12/12 [==============================] - 0s 988us/step - loss: 0.0112 - accuracy: 1.0000
Epoch 169/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0110 - accuracy: 1.0000
Epoch 170/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0107 - accuracy: 1.0000
Epoch 171/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0112 - accuracy: 1.0000
Epoch 172/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0107 - accuracy: 1.0000
Epoch 173/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0102 - accuracy: 1.0000
Epoch 174/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0099 - accuracy: 1.0000
Epoch 175/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0097 - accuracy: 1.0000
Epoch 176/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0096 - accuracy: 1.0000
Epoch 177/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0093 - accuracy: 1.0000
Epoch 178/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0091 - accuracy: 1.0000
Epoch 179/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0089 - accuracy: 1.0000
Epoch 180/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0086 - accuracy: 1.0000
Epoch 181/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0084 - accuracy: 1.0000
Epoch 182/256
12/12 [==============================] - 0s 999us/step - loss: 0.0083 - accuracy: 1.0000
Epoch 183/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0081 - accuracy: 1.0000
Epoch 184/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0081 - accuracy: 1.0000
Epoch 185/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0078 - accuracy: 1.0000
Epoch 186/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0076 - accuracy: 1.0000
Epoch 187/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0078 - accuracy: 1.0000
Epoch 188/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0073 - accuracy: 1.0000
Epoch 189/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0071 - accuracy: 1.0000
Epoch 190/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0071 - accuracy: 1.0000
Epoch 191/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0069 - accuracy: 1.0000
Epoch 192/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0069 - accuracy: 1.0000
Epoch 193/256
12/12 [==============================] - ETA: 0s - loss: 0.0065 - accuracy: 1.00 - 0s 1ms/step - loss: 0.0065 - accuracy: 1.0000
Epoch 194/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0066 - accuracy: 1.0000
Epoch 195/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0064 - accuracy: 1.0000
Epoch 196/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0061 - accuracy: 1.0000
Epoch 197/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0062 - accuracy: 1.0000
Epoch 198/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0062 - accuracy: 1.0000
Epoch 199/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0058 - accuracy: 1.0000
Epoch 200/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0057 - accuracy: 1.0000
Epoch 201/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0056 - accuracy: 1.0000
Epoch 202/256
12/12 [==============================] - 0s 965us/step - loss: 0.0054 - accuracy: 1.0000
Epoch 203/256
12/12 [==============================] - 0s 995us/step - loss: 0.0054 - accuracy: 1.0000
Epoch 204/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0053 - accuracy: 1.0000
Epoch 205/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0052 - accuracy: 1.0000
Epoch 206/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0051 - accuracy: 1.0000
Epoch 207/256
12/12 [==============================] - 0s 968us/step - loss: 0.0050 - accuracy: 1.0000
Epoch 208/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0049 - accuracy: 1.0000
Epoch 209/256
12/12 [==============================] - 0s 958us/step - loss: 0.0049 - accuracy: 1.0000
Epoch 210/256
12/12 [==============================] - 0s 994us/step - loss: 0.0047 - accuracy: 1.0000
Epoch 211/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0046 - accuracy: 1.0000
Epoch 212/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0047 - accuracy: 1.0000
Epoch 213/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0045 - accuracy: 1.0000
Epoch 214/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0044 - accuracy: 1.0000
Epoch 215/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0043 - accuracy: 1.0000
Epoch 216/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0043 - accuracy: 1.0000
Epoch 217/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0043 - accuracy: 1.0000
Epoch 218/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0042 - accuracy: 1.0000
Epoch 219/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0040 - accuracy: 1.0000
Epoch 220/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0040 - accuracy: 1.0000
Epoch 221/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0039 - accuracy: 1.0000
Epoch 222/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0038 - accuracy: 1.0000
Epoch 223/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0037 - accuracy: 1.0000
Epoch 224/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0037 - accuracy: 1.0000
Epoch 225/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0036 - accuracy: 1.0000
Epoch 226/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0035 - accuracy: 1.0000
Epoch 227/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0035 - accuracy: 1.0000
Epoch 228/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0034 - accuracy: 1.0000
Epoch 229/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0034 - accuracy: 1.0000
Epoch 230/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0033 - accuracy: 1.0000
Epoch 231/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0033 - accuracy: 1.0000
Epoch 232/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0032 - accuracy: 1.0000
Epoch 233/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0032 - accuracy: 1.0000
Epoch 234/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0031 - accuracy: 1.0000
Epoch 235/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0030 - accuracy: 1.0000
Epoch 236/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0030 - accuracy: 1.0000
Epoch 237/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0029 - accuracy: 1.0000
Epoch 238/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0029 - accuracy: 1.0000
Epoch 239/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0029 - accuracy: 1.0000
Epoch 240/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0028 - accuracy: 1.0000
Epoch 241/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0028 - accuracy: 1.0000
Epoch 242/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0027 - accuracy: 1.0000
Epoch 243/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0027 - accuracy: 1.0000
Epoch 244/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0026 - accuracy: 1.0000
Epoch 245/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0026 - accuracy: 1.0000
Epoch 246/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0026 - accuracy: 1.0000
Epoch 247/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0025 - accuracy: 1.0000
Epoch 248/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0025 - accuracy: 1.0000
Epoch 249/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0025 - accuracy: 1.0000
Epoch 250/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0024 - accuracy: 1.0000
Epoch 251/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0024 - accuracy: 1.0000
Epoch 252/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0024 - accuracy: 1.0000
Epoch 253/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0023 - accuracy: 1.0000
Epoch 254/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0023 - accuracy: 1.0000
Epoch 255/256
12/12 [==============================] - 0s 1ms/step - loss: 0.0022 - accuracy: 1.0000
Epoch 256/256
12/12 [==============================] - 0s 987us/step - loss: 0.0022 - accuracy: 1.0000

 Test set
3/3 [==============================] - 0s 999us/step - loss: 0.1321 - accuracy: 0.9375
Out[25]:
[0.13213469088077545, 0.9375]
In [26]:
y_pred = model.predict(x_test)

for i in range(len(y_pred)):
    m = max(y_pred[i])
    j = np.where(y_pred[i] == m)[0][0]
    k = np.where(y_test[i] == 1)[0][0]
    if(j!=k):
        print("------- Mismatch -------")
    print("Actual: " + gestures[k] + "\nPredicted: " + gestures[j] + "\nConfidence: " + str(m) + "\n")
Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.9999838

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.99959654

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.99999046

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9996501

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9998752

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.99876475

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.9920249

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.5570747

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.99996173

Actual: left_fall
Predicted: left_fall
Confidence: 0.9999999

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9999913

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.9994374

Actual: left_fall
Predicted: left_fall
Confidence: 0.9999994

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.99477553

Actual: left_fall
Predicted: left_fall
Confidence: 0.999997

Actual: right_to_left
Predicted: right_to_left
Confidence: 0.99992824

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.99983776

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.9999647

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.9999149

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9996562

------- Mismatch -------
Actual: right_to_left
Predicted: left_to_right
Confidence: 0.6083125

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9925015

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9999989

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.9945163

Actual: right_to_left
Predicted: right_to_left
Confidence: 0.9999893

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.99992967

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.7176366

Actual: right_to_left
Predicted: right_to_left
Confidence: 0.9702669

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.99999905

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.9995927

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9998735

------- Mismatch -------
Actual: left_to_right
Predicted: down_to_up
Confidence: 0.93600833

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.999959

------- Mismatch -------
Actual: up_clockwise
Predicted: down_to_up
Confidence: 0.55089766

Actual: right_to_left
Predicted: right_to_left
Confidence: 0.9999479

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.99686664

------- Mismatch -------
Actual: left_to_right
Predicted: down_to_up
Confidence: 0.5871159

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.999936

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.92205405

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.99974054

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9995011

Actual: left_fall
Predicted: left_fall
Confidence: 0.99997866

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.9999962

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.99987376

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.97294545

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.99998844

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.81514555

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.999995

Actual: left_fall
Predicted: left_fall
Confidence: 0.9999989

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.9998957

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.9998722

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.99996936

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.9986534

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.99883634

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.996051

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.9827587

Actual: down_to_up
Predicted: down_to_up
Confidence: 0.9907924

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9998714

Actual: right_to_left
Predicted: right_to_left
Confidence: 0.99992394

Actual: left_fall
Predicted: left_fall
Confidence: 0.9999993

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.98926276

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9999721

Actual: right_to_left
Predicted: right_to_left
Confidence: 0.9997861

Actual: left_fall
Predicted: left_fall
Confidence: 0.9999927

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9991929

Actual: right_to_left
Predicted: right_to_left
Confidence: 0.9903287

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9977597

Actual: left_fall
Predicted: left_fall
Confidence: 0.9999602

------- Mismatch -------
Actual: right_to_left
Predicted: left_to_right
Confidence: 0.7039355

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9993461

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.9889498

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9999063

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9998259

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9999443

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.999998

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.9999002

------- Mismatch -------
Actual: forward_clockwise
Predicted: right_to_left
Confidence: 0.903558

Actual: up_clockwise
Predicted: up_clockwise
Confidence: 0.9999914

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.9891298

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.9987771

Actual: left_fall
Predicted: left_fall
Confidence: 0.999998

Actual: left_fall
Predicted: left_fall
Confidence: 0.99999905

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.5647926

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.9999938

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.994208

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.99980456

Actual: left_fall
Predicted: left_fall
Confidence: 0.9999994

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.9980793

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.99989533

Actual: up_anticlockwise
Predicted: up_anticlockwise
Confidence: 0.99957365

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.99995697

Actual: left_to_right
Predicted: left_to_right
Confidence: 0.9987696

Actual: left_fall
Predicted: left_fall
Confidence: 0.99995935

Actual: right_to_left
Predicted: right_to_left
Confidence: 0.82914716

Actual: forward_fall
Predicted: forward_fall
Confidence: 0.9995034

Actual: forward_clockwise
Predicted: forward_clockwise
Confidence: 0.99973637
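
The per-sample listing above can be condensed: np.argmax gives the predicted and true class indices directly, and a confusion matrix (a sketch using scikit-learn, not part of the original run) shows which gestures get confused with each other:

from sklearn.metrics import confusion_matrix

pred_idx = np.argmax(y_pred, axis=1)
true_idx = np.argmax(y_test, axis=1)
print(confusion_matrix(true_idx, pred_idx))          # rows: actual, columns: predicted, in 'gestures' order
print("Mismatches:", int(np.sum(pred_idx != true_idx)))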

In [27]:
model.save('gestureTinyModel.h5')
In [28]:
model = tf.keras.models.load_model('gestureTinyModel.h5')
In [29]:
model.evaluate(x_test, y_test)
3/3 [==============================] - 0s 1ms/step - loss: 0.1321 - accuracy: 0.9375
Out[29]:
[0.13213469088077545, 0.9375]
In [30]:
!deepCC gestureTinyModel.h5
reading [keras model] from 'gestureTinyModel.h5'
Saved 'gestureTinyModel.onnx'
reading onnx model from file  gestureTinyModel.onnx
Model info:
  ir_vesion :  3 
  doc       : 
WARN (ONNX): terminal (input/output) dense_4_input's shape is less than 1.
             changing it to 1.
WARN (ONNX): terminal (input/output) dense_7's shape is less than 1.
             changing it to 1.
WARN (GRAPH): found operator node with the same name (dense_7) as io node.
running DNNC graph sanity check ... passed.
Writing C++ file  gestureTinyModel_deepC/gestureTinyModel.cpp
INFO (ONNX): model files are ready in dir gestureTinyModel_deepC
g++ -std=c++11 -O3 -I. -I/opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/include -isystem /opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/packages/eigen-eigen-323c052e1731 gestureTinyModel_deepC/gestureTinyModel.cpp -o gestureTinyModel_deepC/gestureTinyModel.exe
Model executable  gestureTinyModel_deepC/gestureTinyModel.exe

Gesture-based home automation

The snippet below is a proof of concept (PoC) for gesture-based home automation: controlling various appliances at home with gestures.

The aim is to embed the sensors in a wearable device.

We move our hands in many different ways, and there is a high chance that one of these everyday movements maps to a recorded gesture and triggers a response. To counter this, we can include a tiny button or other mechanism to indicate the start of a gesture, as sketched below.
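
A minimal sketch of that button idea, where read_button() and read_imu() are hypothetical placeholders for the device API rather than an actual hardware interface:

def capture_and_classify(model):
    # Inference runs only after the button signals the start of a gesture,
    # so everyday hand movement is ignored.
    if not read_button():                        # hypothetical: True when pressed
        return None
    window = [read_imu() for _ in range(100)]    # hypothetical: one (ax..gz) reading each
    x = normalize(np.array(window, dtype='float32').reshape(1, 600))
    probs = model.predict(x)[0]
    return int(np.argmax(probs)), float(probs.max())   # gesture index + confidence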


List of home-automation commands and their categories, for reference; they map positionally to the list of gestures above.

Here, count refers to moving to the next instance of the selected appliance (e.g., countIncrease moves from fan1 to fan2, countDecrease moves from fan2 to fan1), and next refers to choosing a different appliance.

In [31]:
gesture_key = {}
gesture_key["appliance"] = ["light", "fan"]
gesture_key["operation"] = ["switch", "increase", "decrease"]
gesture_key["count"] = ["countIncrease", "countDecrease"]
gesture_key["next"] = ["next"]


gestures_home = []
for x in gesture_key:
    gestures_home.extend(gesture_key[x])

print(gestures_home)
['light', 'fan', 'switch', 'increase', 'decrease', 'countIncrease', 'countDecrease', 'next']
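
The model itself still predicts one of the eight motion gestures; the snippet below simply spells out the positional mapping between those motion labels and the home-automation commands used in the control loop:

# Each predicted motion gesture is interpreted, by position, as a command
for motion, command in zip(gestures, gestures_home):
    print(motion, '->', command)
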
In [32]:
lines = urllib.request.urlopen('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/test7.txt').read()
lines = lines.decode('utf-8').split('\n')[:-1]
 
appliance_bool = True
operation_bool = False
 
# mapping each appliance to the operations that can be done on them
 
appliance_to_operation = {'light' : {'switch'}, 
                          'fan' : {'switch', 'increase', 'decrease'}}
 
# descriptions and current state of the configured (installed) appliances  
configured = {}
configured['light'] = { 1 : ["off"], 
                       2 : ["off"]}
configured['fan'] = { 1 : ["off", 0, 5]}        # 0 - current speed, 5 - maximum
 
# list of operations performed (just for reference) 
performed = []
 
# initializations 
count = 0
appliance = ""
 
for line in lines[1:]:      # first line of text file contains column headings
 
    line = line.split(' ')[:-1]     # last value is gesture name which is blank here
    line = np.array(line).astype('float32').reshape(1,600)
 
    result = model.predict(normalize(line))
    m = max(result[0])
    g = gestures_home[np.where(result[0] == m)[0][0]]
    print("Recognized: " + g + " with " + str(m) + " confidence level.")
    
    # if appliance is to be selected
    if (appliance_bool):
        
        if g in configured:
            print("Selected " + g)
            
            # initializing instance of appliance
            count = 0
 
            # Setting appliance selection to false and operation on appliance to true
            appliance_bool = False
            operation_bool = True
            
            # keeping track of current appliance
            appliance = g
            
            performed.append([g, "Selected"])
        
        else:
            print("Invalid gesture. Try again!")
    
    # if operation is to be done on appliance
    elif (operation_bool):
 
        # gesture to move to next instance of appliance
        if g == 'countIncrease':
            if count < len(configured[appliance]):
                count = count + 1
                print("Instance " + str(count) + " of " + appliance + " selected.")
                performed.append([appliance, g, count])
            else:
                print("Last instance reached.")
 
        # gesture to move to previous instance of appliance
        elif g == 'countDecrease':
            if count > 1:
                count = count - 1
                print("Instance " + str(count) + " of " + appliance + " selected.")
                performed.append([appliance, g, count])
            else:
                print("First instance reached.")
 
        # gesture to change the currently selected appliance
        elif g == 'next':            
            print("Change appliance")
            appliance = ""
 
            # Setting appliance selection to true and operation on appliance to false
            appliance_bool = True
            operation_bool = False  
 
            performed.append(['Changed'])
 
        # gesture to perform an operation on the selected appliance (checking if the operation is valid for the selected appliance)
        elif g in appliance_to_operation[appliance]:
            
            # checking if instance of appliance is selected
            if count !=0:
                
                # toggle switch on/off
                if g == 'switch':
                    configured[appliance][count][0] = 'on' if configured[appliance][count][0]=='off' else 'off'
 
                    performed.append([appliance, g, configured[appliance][count][0]])
 
                    print("Instance " + str(count) + " of " + appliance + " switched " + configured[appliance][count][0])
 
                # increase fan speed    
                elif g == 'increase':
                    if configured[appliance][count][1]<configured[appliance][count][2]:
                        configured[appliance][count][1] = configured[appliance][count][1] + 1
 
                        performed.append([appliance, g, configured[appliance][count][1]])
 
                        print("Instance " + str(count) + " of " + appliance + " - speed increased to " + str(configured[appliance][count][1]))
                    else:
                        print("Max speed reached!")
 
                #decrease fan speed
                elif g == 'decrease':
                    if configured[appliance][count][1] > 0:
                        configured[appliance][count][1] = configured[appliance][count][1] - 1
 
                        performed.append([appliance, g, configured[appliance][count][1]])
 
                        print("Instance " + str(count) + " of " + appliance + " - speed decreased to " + str(configured[appliance][count][1]))
                    else:
                        print("Min speed reached!")
            else:
                print('Select instance of appliance to work with!')
        else:
            print("Invalid operation for appliance!")
    print()
 
print('\nOperations performed')
for x in performed:
    print(x)
 
print('\nFinal state of appliances')
print(configured)
Recognized: light with 0.9836955 confidence level.
Selected light

Recognized: switch with 0.99999905 confidence level.
Select instance of appliance to work with!

Recognized: light with 0.7068602 confidence level.
Invalid operation for appliance!

Recognized: switch with 0.9996886 confidence level.
Select instance of appliance to work with!

Recognized: next with 0.99992144 confidence level.
Change appliance

Recognized: fan with 0.99999964 confidence level.
Selected fan

Recognized: light with 0.9850494 confidence level.
Invalid operation for appliance!

Recognized: switch with 0.99996674 confidence level.
Select instance of appliance to work with!

Recognized: fan with 0.74425936 confidence level.
Invalid operation for appliance!

Recognized: increase with 0.9999747 confidence level.
Select instance of appliance to work with!

Recognized: decrease with 0.99970657 confidence level.
Select instance of appliance to work with!

Recognized: next with 0.99985135 confidence level.
Change appliance


Operations performed
['light', 'Selected']
['Changed']
['fan', 'Selected']
['Changed']

Final state of appliances
{'light': {1: ['off'], 2: ['off']}, 'fan': {1: ['off', 0, 5]}}

Other applications

  • Sign language converter

    Embedded in a wearable, this allows the gestures performed to be mapped to voice output.

  • Gaming

    Move your hands around for a more real-life gaming experience.

  • Music generation