Heartbeat Anomaly Detection

Credit: AITS Cainvas Community

According to the WHO, 17.9 million people die each year from cardiovascular diseases. Many of these deaths could be prevented if the diseases were diagnosed at an early stage, and AI has brought major advances to healthcare for exactly this kind of early diagnosis.

Coupled with a digital stethoscope or a similar IoT device, this model can help detect anomalies in an individual's heartbeat sounds.

Importing the Dataset

In [1]:
!wget -N "https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/heart.zip"
!unzip -qo heart.zip 
!rm heart.zip
--2020-10-27 07:59:09--  https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/heart.zip
Resolving cainvas-static.s3.amazonaws.com (cainvas-static.s3.amazonaws.com)... 52.219.66.92
Connecting to cainvas-static.s3.amazonaws.com (cainvas-static.s3.amazonaws.com)|52.219.66.92|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 72125176 (69M) [application/zip]
Saving to: ‘heart.zip’

heart.zip           100%[===================>]  68.78M  97.7MB/s    in 0.7s    

2020-10-27 07:59:10 (97.7 MB/s) - ‘heart.zip’ saved [72125176/72125176]

Importing Necessary Libraries

In [2]:
# Pandas
import pandas as pd

# Scikit learn
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report, accuracy_score, confusion_matrix
from sklearn.preprocessing import LabelEncoder
from sklearn.utils import shuffle
from sklearn.utils import class_weight

# Keras
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Conv2D, MaxPooling2D, GlobalAveragePooling2D
from keras.utils import to_categorical
from keras.optimizers import Adam

# Audio
import librosa
import librosa.display

# Plot
%matplotlib inline
%pylab inline
import matplotlib.pyplot as plt
%config InlineBackend.figure_format = 'retina'

# Utility
import os
import glob
import numpy as np
from tqdm import tqdm
import itertools

# To ignore any warnings
import warnings                       
warnings.filterwarnings("ignore")


# gather software versions
import tensorflow as tf; print('tensorflow version: ', tf.__version__)
import keras; print('keras version: ',keras.__version__)

# If a warning pops up, just run the cell again. There is nothing to worry about.
Populating the interactive namespace from numpy and matplotlib
tensorflow version:  2.3.1
keras version:  2.4.3
/opt/tljh/user/lib/python3.7/site-packages/IPython/core/magics/pylab.py:160: UserWarning: pylab import has clobbered these variables: ['shuffle']
`%matplotlib` prevents importing * from pylab and numpy
  "\n`%matplotlib` prevents importing * from pylab and numpy"

Build Dataset

In [3]:
dataset = []
for folder in ["heart/set_a/**"]:
    for filename in glob.iglob(folder):
        if os.path.exists(filename):
            label = os.path.basename(filename).split("_")[0]
            # skip audio clips shorter than 4 seconds
            if librosa.get_duration(filename=filename)>=4:
                # skip the unlabelled test files
                if label not in ["Aunlabelledtest"]:
                    dataset.append({
                        "filename": filename,
                        "label": label
                    })
dataset = pd.DataFrame(dataset)

Exploratory Data Analysis

In [4]:
dataset.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 93 entries, 0 to 92
Data columns (total 2 columns):
 #   Column    Non-Null Count  Dtype 
---  ------    --------------  ----- 
 0   filename  93 non-null     object
 1   label     93 non-null     object
dtypes: object(2)
memory usage: 1.6+ KB
In [5]:
plt.figure(figsize=(12,6))
dataset.label.value_counts().plot(kind='bar', title="Dataset distribution")
plt.show()
In [6]:
# parent folder of sound files
INPUT_DIR="heart"
# 16 kHz (defined for reference; the librosa loads below use its default 22,050 Hz)
SAMPLE_RATE = 16000
# maximum clip length in seconds
MAX_SOUND_CLIP_DURATION=12
In [7]:
set_a=pd.read_csv(INPUT_DIR+"/set_a.csv")
set_a.head()
set_a.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 176 entries, 0 to 175
Data columns (total 4 columns):
 #   Column    Non-Null Count  Dtype  
---  ------    --------------  -----  
 0   dataset   176 non-null    object 
 1   fname     176 non-null    object 
 2   label     124 non-null    object 
 3   sublabel  0 non-null      float64
dtypes: float64(1), object(3)
memory usage: 5.6+ KB
In [8]:
train_ab=set_a
train_ab.describe()
Out[8]:
sublabel
count 0.0
mean NaN
std NaN
min NaN
25% NaN
50% NaN
75% NaN
max NaN
In [9]:
# get all unique labels (NaN appears because some files are unlabelled)
unique_labels=train_ab.label.unique()

print("Number of training examples=", train_ab.shape[0], "  Number of classes=", len(unique_labels))
print (unique_labels)
Number of training examples= 176   Number of classes= 5
['artifact' 'extrahls' 'murmur' 'normal' nan]
In [10]:
print('Minimum samples per category = ', min(train_ab.label.value_counts()))
print('Maximum samples per category = ', max(train_ab.label.value_counts()))
Minimum samples per category =  19
Maximum samples per category =  40
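
Since class_weight was imported earlier but never used, one option (a sketch, not part of the original notebook) is to compute balanced weights for this imbalance:

# Sketch: balanced class weights to offset the imbalance shown above; mapped
# through the label encoder, they could be passed to model.fit(..., class_weight=...).
labels = train_ab.label.dropna().values
classes = np.unique(labels)
weights = class_weight.compute_class_weight('balanced', classes=classes, y=labels)
print(dict(zip(classes, weights)))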

Normal Case

In the Normal category there are normal, healthy heart sounds. A normal heart sound has a clear “lub dub, lub dub” pattern, with the time from “lub” to “dub” shorter than the time from “dub” to the next “lub”.

In [11]:
normal_file=INPUT_DIR+"/set_a/normal__201106111136.wav"
In [12]:
# hear it
import IPython.display as ipd
ipd.Audio(normal_file) 
Out[12]:
In [13]:
# Load using the wave module
import wave
wav = wave.open(normal_file)
print("Sampling (frame) rate = ", wav.getframerate())
print("Total samples (frames) = ", wav.getnframes())
print("Duration = ", wav.getnframes()/wav.getframerate())
Sampling (frame) rate =  44100
Total samples (frames) =  218903
Duration =  4.963786848072562
In [14]:
# Load using Librosa
y, sr = librosa.load(normal_file, duration=5)   # librosa's default sampling rate is 22050 Hz
dur=librosa.get_duration(y)
print ("duration:", dur)
print(y.shape, sr)
duration: 4.963809523809524
(109452,) 22050
In [15]:
# librosa plot
plt.figure(figsize=(16, 3))
librosa.display.waveplot(y, sr=sr)
Out[15]:
<matplotlib.collections.PolyCollection at 0x7f55eeaceb00>
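
The alternating short/long spacing described above can be checked numerically. A minimal sketch (not part of the original notebook) using librosa's onset detector on the clip loaded above:

# Sketch: estimate heart-sound event times and the gaps between them.
# On a clean normal recording the gaps should roughly alternate between
# short ("lub" to "dub") and long ("dub" to the next "lub").
onsets = librosa.onset.onset_detect(y=y, sr=sr, units='time')
print("onset times (s):", np.round(onsets, 2))
print("intervals (s):", np.round(np.diff(onsets), 2))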

Murmur Case

Heart murmurs sound as though there is a “whooshing, roaring, rumbling, or turbulent fluid” noise in one of two temporal locations: (1) between “lub” and “dub”, or (2) between “dub” and “lub”. They can be a symptom of many heart disorders, some serious. There will still be a “lub” and a “dub”.

In [16]:
# murmur case
murmur_file=INPUT_DIR+"/set_a/murmur__201108222231.wav"
y2, sr2 = librosa.load(murmur_file,duration=5)
dur=librosa.get_duration(y2)
print ("duration:", dur)
print(y2.shape,sr2)
duration: 5.0
(110250,) 22050
In [17]:
# hear it
import IPython.display as ipd
ipd.Audio(murmur_file) 
Out[17]:
In [18]:
# show it
plt.figure(figsize=(16, 3))
librosa.display.waveplot(y2, sr=sr2)
Out[18]:
<matplotlib.collections.PolyCollection at 0x7f55ee9eeba8>

Artifact

In the Artifact category there is a wide range of different sounds, including feedback squeals and echoes, speech, music and noise. There are usually no discernible heart sounds, and thus little or no temporal periodicity at frequencies below 195 Hz.

In [19]:
# sample file
artifact_file=INPUT_DIR+"/set_a/artifact__201012172012.wav"
y4, sr4 = librosa.load(artifact_file, duration=5)
dur=librosa.get_duration(y4)
print ("duration:", dur)
print(y4.shape,sr4)
duration: 5.0
(110250,) 22050
In [20]:
# hear it
import IPython.display as ipd
ipd.Audio(artifact_file) 
Out[20]:
In [21]:
# show it
plt.figure(figsize=(16, 3))
librosa.display.waveplot(y4, sr=sr4)
Out[21]:
<matplotlib.collections.PolyCollection at 0x7f55ee964128>
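
The "below 195 Hz" remark above suggests a simple numerical check. The sketch below (an illustration, not an original cell) low-passes the artifact clip and looks for periodic peaks in its autocorrelation; heart-sound recordings show clear peaks at the beat period, while artifacts usually do not:

from scipy.signal import butter, filtfilt

# Sketch: keep only content below ~195 Hz, then autocorrelate.
# Heart sounds produce strong periodic peaks; pure artifacts usually don't.
b, a = butter(4, 195 / (sr4 / 2), btype='low')
low = filtfilt(b, a, y4)
ac = librosa.autocorrelate(low, 2 * sr4)      # lags up to 2 seconds
ac = ac / ac[0]                               # normalize: lag 0 == 1
lag = np.argmax(ac[sr4 // 4:]) + sr4 // 4     # strongest peak past 0.25 s
print("strongest low-frequency periodicity at %.2f s (corr %.2f)" % (lag / sr4, ac[lag]))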

Extrahls

Extrahls (extra heart sound) recordings contain a heart sound that is out of rhythm, with extra or skipped beats, e.g. a "lub-lub dub" or a "lub dub-dub". They may appear only occasionally and can be a sign of disease.

In [22]:
# sample file
extrahls_file=INPUT_DIR+"/set_a/extrahls__201101070953.wav"
y5, sr5 = librosa.load(extrahls_file, duration=5)
dur=librosa.get_duration(y5)
print ("duration:", dur)
print(y5.shape,sr5)
duration: 5.0
(110250,) 22050
In [23]:
# hear it
import IPython.display as ipd
ipd.Audio(extrahls_file) 
Out[23]:
In [24]:
# show it
plt.figure(figsize=(16, 3))
librosa.display.waveplot(y5, sr=sr5)
Out[24]:
<matplotlib.collections.PolyCollection at 0x7f55ee949748>

Split dataset into train and test

In [25]:
train, test = train_test_split(dataset, test_size=0.2, random_state=42)

print("Train: %i" % len(train))
print("Test: %i" % len(test))
Train: 74
Test: 19
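
With only 93 clips and visibly imbalanced classes, a stratified split keeps the label proportions similar in train and test. A possible variant (an alternative, not the split used for the results in this notebook):

# Alternative sketch: stratify on the label column so each class keeps
# roughly the same proportion in both splits.
train_s, test_s = train_test_split(dataset, test_size=0.2, random_state=42, stratify=dataset.label)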

Show Audio info

In [26]:
plt.figure(figsize=(20,20))
idx = 0
for label in dataset.label.unique():    
    y, sr = librosa.load(dataset[dataset.label==label].filename.iloc[0], duration=4)
    idx+=1
    plt.subplot(5, 3, idx)
    plt.title("%s wave" % label)
    librosa.display.waveplot(y, sr=sr)
    idx+=1
    plt.subplot(5, 3, idx)
    D = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)
    librosa.display.specshow(D, y_axis='linear')
    plt.title("%s spectogram" % label)
    idx+=1
    mfccs = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40)
    plt.subplot(5, 3, idx)
    librosa.display.specshow(mfccs, x_axis='time')
    plt.title("%s mfcc" % label)
plt.show()

Extract features from audio

In [27]:
def extract_features(audio_path):
    y, sr = librosa.load(audio_path, duration=4)
    mfccs = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40)
    return mfccs
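
All feature matrices here come out with shape (40, 173) only because clips shorter than 4 seconds were filtered out when the dataset was built. A hedged variant (a hypothetical helper, not part of the original pipeline) that pads or truncates the MFCC time axis so clips of any length fit the model input:

def extract_features_padded(audio_path, n_frames=173):
    # Pad (or truncate) the MFCC time axis so every clip, regardless of
    # duration, yields a fixed-size (40, n_frames) matrix.
    y, sr = librosa.load(audio_path, duration=4)
    mfccs = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40)
    if mfccs.shape[1] < n_frames:
        mfccs = np.pad(mfccs, ((0, 0), (0, n_frames - mfccs.shape[1])))
    return mfccs[:, :n_frames]
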
In [28]:
%%time
x_train, x_test = [], []
print("Extract features from TRAIN  and TEST dataset")
for idx in tqdm(range(len(train))):
    x_train.append(extract_features(train.filename.iloc[idx]))

for idx in tqdm(range(len(test))):
    x_test.append(extract_features(test.filename.iloc[idx]))
    
    
x_test = np.asarray(x_test)
x_train = np.asarray(x_train)

print("X train:", x_train.shape)
print("X test:", x_test.shape)
  1%|▏         | 1/74 [00:00<00:13,  5.51it/s]
Extract features from TRAIN and TEST datasets
100%|██████████| 74/74 [00:13<00:00,  5.35it/s]
100%|██████████| 19/19 [00:03<00:00,  5.40it/s]
X train: (74, 40, 173)
X test: (19, 40, 173)
CPU times: user 23.3 s, sys: 28 s, total: 51.2 s
Wall time: 17.3 s

Encode labels

In [29]:
encoder = LabelEncoder()
encoder.fit(train.label)

y_train = encoder.transform(train.label)
y_test = encoder.transform(test.label)
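
To see which integer, and therefore which one-hot column below, each class maps to (a quick check, not an original cell):

# LabelEncoder sorts classes alphabetically; this prints the mapping
# that to_categorical will follow.
print(dict(zip(encoder.classes_, encoder.transform(encoder.classes_))))
# {'artifact': 0, 'extrahls': 1, 'murmur': 2, 'normal': 3}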

Input shapes

In [30]:
# add a channel dimension so the MFCC matrices match Conv2D's (height, width, channels) input
x_train = x_train.reshape(x_train.shape[0], x_train.shape[1], x_train.shape[2], 1)
x_test = x_test.reshape(x_test.shape[0], x_test.shape[1], x_test.shape[2], 1)
# one-hot encode the integer labels
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)



print("X train:", x_train.shape)
print("Y train:", y_train.shape)
print()
print("X test:", x_test.shape)
print("Y test:", y_test.shape)
X train: (74, 40, 173, 1)
Y train: (74, 4)

X test: (19, 40, 173, 1)
Y test: (19, 4)

Build Model

In [31]:
# Model architecture
model = Sequential()
model.add(Conv2D(filters=16, kernel_size=2, input_shape=(x_train.shape[1],x_train.shape[2],x_train.shape[3]), activation='relu'))
model.add(MaxPooling2D(pool_size=2))
model.add(Dropout(0.3))

model.add(Conv2D(filters=32, kernel_size=2, activation='relu'))
model.add(MaxPooling2D(pool_size=2))
model.add(Dropout(0.3))

model.add(Conv2D(filters=64, kernel_size=2, activation='relu'))
model.add(MaxPooling2D(pool_size=2))
model.add(Dropout(0.3))

model.add(Conv2D(filters=128, kernel_size=2, activation='relu'))
model.add(MaxPooling2D(pool_size=2))
model.add(Dropout(0.3))
model.add(GlobalAveragePooling2D())

model.add(Dense(256, activation='relu'))
model.add(Dense(128, activation='relu'))

model.add(Dense(len(encoder.classes_), activation='softmax'))
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 39, 172, 16)       80        
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 19, 86, 16)        0         
_________________________________________________________________
dropout (Dropout)            (None, 19, 86, 16)        0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 18, 85, 32)        2080      
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 9, 42, 32)         0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 9, 42, 32)         0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 8, 41, 64)         8256      
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 4, 20, 64)         0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 4, 20, 64)         0         
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 3, 19, 128)        32896     
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 1, 9, 128)         0         
_________________________________________________________________
dropout_3 (Dropout)          (None, 1, 9, 128)         0         
_________________________________________________________________
global_average_pooling2d (Gl (None, 128)               0         
_________________________________________________________________
dense (Dense)                (None, 256)               33024     
_________________________________________________________________
dense_1 (Dense)              (None, 128)               32896     
_________________________________________________________________
dense_2 (Dense)              (None, 4)                 516       
=================================================================
Total params: 109,748
Trainable params: 109,748
Non-trainable params: 0
_________________________________________________________________

Compile model

In [32]:
model.compile(loss='categorical_crossentropy', metrics=['accuracy'], optimizer=Adam(learning_rate=0.001))

Fit model

In [33]:
history = model.fit(x_train, y_train,
              batch_size=256,
              epochs=200,
              validation_data=(x_test, y_test),
              shuffle=True)
Epoch 1/200
1/1 [==============================] - 0s 208ms/step - loss: 10.6176 - accuracy: 0.2432 - val_loss: 5.8482 - val_accuracy: 0.4211
Epoch 2/200
1/1 [==============================] - 0s 27ms/step - loss: 15.5591 - accuracy: 0.2838 - val_loss: 8.6489 - val_accuracy: 0.4211
Epoch 3/200
1/1 [==============================] - 0s 27ms/step - loss: 21.3881 - accuracy: 0.2703 - val_loss: 6.3063 - val_accuracy: 0.4211
Epoch 4/200
1/1 [==============================] - 0s 27ms/step - loss: 11.0651 - accuracy: 0.2568 - val_loss: 5.2964 - val_accuracy: 0.0526
Epoch 5/200
1/1 [==============================] - 0s 26ms/step - loss: 11.9146 - accuracy: 0.2297 - val_loss: 4.8335 - val_accuracy: 0.0526
Epoch 6/200
1/1 [==============================] - 0s 27ms/step - loss: 11.1389 - accuracy: 0.1892 - val_loss: 3.8074 - val_accuracy: 0.1053
Epoch 7/200
1/1 [==============================] - 0s 32ms/step - loss: 9.2125 - accuracy: 0.2703 - val_loss: 2.5129 - val_accuracy: 0.1053
Epoch 8/200
1/1 [==============================] - 0s 28ms/step - loss: 6.2452 - accuracy: 0.2703 - val_loss: 1.7470 - val_accuracy: 0.2105
Epoch 9/200
1/1 [==============================] - 0s 28ms/step - loss: 3.4916 - accuracy: 0.2432 - val_loss: 1.6612 - val_accuracy: 0.3684
Epoch 10/200
1/1 [==============================] - 0s 27ms/step - loss: 4.3977 - accuracy: 0.2297 - val_loss: 1.3803 - val_accuracy: 0.3684
Epoch 11/200
1/1 [==============================] - 0s 28ms/step - loss: 4.0067 - accuracy: 0.2432 - val_loss: 1.2572 - val_accuracy: 0.5263
Epoch 12/200
1/1 [==============================] - 0s 27ms/step - loss: 3.6727 - accuracy: 0.3514 - val_loss: 1.3045 - val_accuracy: 0.4737
Epoch 13/200
1/1 [==============================] - 0s 27ms/step - loss: 3.6982 - accuracy: 0.3919 - val_loss: 1.2897 - val_accuracy: 0.4737
Epoch 14/200
1/1 [==============================] - 0s 27ms/step - loss: 3.1303 - accuracy: 0.3514 - val_loss: 1.2217 - val_accuracy: 0.4737
Epoch 15/200
1/1 [==============================] - 0s 26ms/step - loss: 2.3552 - accuracy: 0.3649 - val_loss: 1.2310 - val_accuracy: 0.4211
Epoch 16/200
1/1 [==============================] - 0s 27ms/step - loss: 1.8146 - accuracy: 0.3378 - val_loss: 1.3009 - val_accuracy: 0.3158
Epoch 17/200
1/1 [==============================] - 0s 27ms/step - loss: 2.2879 - accuracy: 0.2568 - val_loss: 1.3581 - val_accuracy: 0.3158
Epoch 18/200
1/1 [==============================] - 0s 27ms/step - loss: 2.2991 - accuracy: 0.3514 - val_loss: 1.3917 - val_accuracy: 0.4211
Epoch 19/200
1/1 [==============================] - 0s 27ms/step - loss: 2.2092 - accuracy: 0.3243 - val_loss: 1.4005 - val_accuracy: 0.3684
Epoch 20/200
1/1 [==============================] - 0s 27ms/step - loss: 2.4932 - accuracy: 0.2297 - val_loss: 1.3728 - val_accuracy: 0.3684
Epoch 21/200
1/1 [==============================] - 0s 27ms/step - loss: 2.0781 - accuracy: 0.3243 - val_loss: 1.3249 - val_accuracy: 0.3684
Epoch 22/200
1/1 [==============================] - 0s 27ms/step - loss: 2.0453 - accuracy: 0.2973 - val_loss: 1.2640 - val_accuracy: 0.3684
Epoch 23/200
1/1 [==============================] - 0s 27ms/step - loss: 1.7408 - accuracy: 0.3514 - val_loss: 1.2013 - val_accuracy: 0.3684
Epoch 24/200
1/1 [==============================] - 0s 27ms/step - loss: 1.6139 - accuracy: 0.2838 - val_loss: 1.1510 - val_accuracy: 0.5789
Epoch 25/200
1/1 [==============================] - 0s 27ms/step - loss: 1.3270 - accuracy: 0.4459 - val_loss: 1.1262 - val_accuracy: 0.7368
Epoch 26/200
1/1 [==============================] - 0s 27ms/step - loss: 1.2938 - accuracy: 0.4189 - val_loss: 1.1165 - val_accuracy: 0.6842
Epoch 27/200
1/1 [==============================] - 0s 28ms/step - loss: 1.3384 - accuracy: 0.4595 - val_loss: 1.1179 - val_accuracy: 0.5789
Epoch 28/200
1/1 [==============================] - 0s 27ms/step - loss: 1.4138 - accuracy: 0.3784 - val_loss: 1.1237 - val_accuracy: 0.5789
Epoch 29/200
1/1 [==============================] - 0s 28ms/step - loss: 1.3784 - accuracy: 0.4865 - val_loss: 1.1295 - val_accuracy: 0.6316
Epoch 30/200
1/1 [==============================] - 0s 28ms/step - loss: 1.5555 - accuracy: 0.3514 - val_loss: 1.1301 - val_accuracy: 0.6316
Epoch 31/200
1/1 [==============================] - 0s 27ms/step - loss: 1.3050 - accuracy: 0.4865 - val_loss: 1.1272 - val_accuracy: 0.6316
Epoch 32/200
1/1 [==============================] - 0s 27ms/step - loss: 1.4018 - accuracy: 0.4324 - val_loss: 1.1239 - val_accuracy: 0.7368
Epoch 33/200
1/1 [==============================] - 0s 27ms/step - loss: 1.3378 - accuracy: 0.4865 - val_loss: 1.1234 - val_accuracy: 0.7368
Epoch 34/200
1/1 [==============================] - 0s 27ms/step - loss: 1.2352 - accuracy: 0.5541 - val_loss: 1.1260 - val_accuracy: 0.8421
Epoch 35/200
1/1 [==============================] - 0s 27ms/step - loss: 1.1676 - accuracy: 0.4189 - val_loss: 1.1299 - val_accuracy: 0.7895
Epoch 36/200
1/1 [==============================] - 0s 28ms/step - loss: 1.2028 - accuracy: 0.5405 - val_loss: 1.1346 - val_accuracy: 0.7895
Epoch 37/200
1/1 [==============================] - 0s 27ms/step - loss: 1.1161 - accuracy: 0.5541 - val_loss: 1.1394 - val_accuracy: 0.7895
Epoch 38/200
1/1 [==============================] - 0s 27ms/step - loss: 1.2738 - accuracy: 0.4189 - val_loss: 1.1423 - val_accuracy: 0.7895
Epoch 39/200
1/1 [==============================] - 0s 27ms/step - loss: 1.1717 - accuracy: 0.4865 - val_loss: 1.1421 - val_accuracy: 0.7895
Epoch 40/200
1/1 [==============================] - 0s 29ms/step - loss: 1.2465 - accuracy: 0.4459 - val_loss: 1.1378 - val_accuracy: 0.7895
Epoch 41/200
1/1 [==============================] - 0s 26ms/step - loss: 1.1213 - accuracy: 0.4595 - val_loss: 1.1310 - val_accuracy: 0.7895
Epoch 42/200
1/1 [==============================] - 0s 27ms/step - loss: 1.0994 - accuracy: 0.5270 - val_loss: 1.1205 - val_accuracy: 0.7895
Epoch 43/200
1/1 [==============================] - 0s 27ms/step - loss: 1.0657 - accuracy: 0.4730 - val_loss: 1.1063 - val_accuracy: 0.7895
Epoch 44/200
1/1 [==============================] - 0s 27ms/step - loss: 1.0323 - accuracy: 0.5676 - val_loss: 1.0885 - val_accuracy: 0.8421
Epoch 45/200
1/1 [==============================] - 0s 27ms/step - loss: 1.0781 - accuracy: 0.5405 - val_loss: 1.0678 - val_accuracy: 0.8421
Epoch 46/200
1/1 [==============================] - 0s 27ms/step - loss: 1.0304 - accuracy: 0.6081 - val_loss: 1.0469 - val_accuracy: 0.8421
Epoch 47/200
1/1 [==============================] - 0s 26ms/step - loss: 1.0777 - accuracy: 0.5405 - val_loss: 1.0275 - val_accuracy: 0.8421
Epoch 48/200
1/1 [==============================] - 0s 26ms/step - loss: 1.0383 - accuracy: 0.5811 - val_loss: 1.0102 - val_accuracy: 0.7895
Epoch 49/200
1/1 [==============================] - 0s 26ms/step - loss: 0.9981 - accuracy: 0.5676 - val_loss: 0.9954 - val_accuracy: 0.7895
Epoch 50/200
1/1 [==============================] - 0s 27ms/step - loss: 1.0116 - accuracy: 0.5541 - val_loss: 0.9812 - val_accuracy: 0.7895
Epoch 51/200
1/1 [==============================] - 0s 28ms/step - loss: 0.9892 - accuracy: 0.5135 - val_loss: 0.9679 - val_accuracy: 0.7895
Epoch 52/200
1/1 [==============================] - 0s 28ms/step - loss: 0.8942 - accuracy: 0.6351 - val_loss: 0.9552 - val_accuracy: 0.7895
Epoch 53/200
1/1 [==============================] - 0s 27ms/step - loss: 0.9137 - accuracy: 0.6216 - val_loss: 0.9425 - val_accuracy: 0.7895
Epoch 54/200
1/1 [==============================] - 0s 27ms/step - loss: 0.9805 - accuracy: 0.5946 - val_loss: 0.9296 - val_accuracy: 0.7895
Epoch 55/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8946 - accuracy: 0.5541 - val_loss: 0.9173 - val_accuracy: 0.7895
Epoch 56/200
1/1 [==============================] - 0s 28ms/step - loss: 0.8603 - accuracy: 0.5946 - val_loss: 0.9039 - val_accuracy: 0.7895
Epoch 57/200
1/1 [==============================] - 0s 27ms/step - loss: 0.9809 - accuracy: 0.5405 - val_loss: 0.8893 - val_accuracy: 0.7895
Epoch 58/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8591 - accuracy: 0.6351 - val_loss: 0.8732 - val_accuracy: 0.7895
Epoch 59/200
1/1 [==============================] - 0s 26ms/step - loss: 0.8908 - accuracy: 0.5270 - val_loss: 0.8557 - val_accuracy: 0.7895
Epoch 60/200
1/1 [==============================] - 0s 28ms/step - loss: 0.8473 - accuracy: 0.6216 - val_loss: 0.8386 - val_accuracy: 0.7895
Epoch 61/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8806 - accuracy: 0.6081 - val_loss: 0.8234 - val_accuracy: 0.7895
Epoch 62/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8466 - accuracy: 0.6351 - val_loss: 0.8074 - val_accuracy: 0.7895
Epoch 63/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7787 - accuracy: 0.6351 - val_loss: 0.7931 - val_accuracy: 0.7895
Epoch 64/200
1/1 [==============================] - 0s 26ms/step - loss: 0.8178 - accuracy: 0.6486 - val_loss: 0.7795 - val_accuracy: 0.7895
Epoch 65/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8957 - accuracy: 0.5811 - val_loss: 0.7667 - val_accuracy: 0.7895
Epoch 66/200
1/1 [==============================] - 0s 30ms/step - loss: 0.8603 - accuracy: 0.6216 - val_loss: 0.7559 - val_accuracy: 0.7895
Epoch 67/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8287 - accuracy: 0.6351 - val_loss: 0.7448 - val_accuracy: 0.7895
Epoch 68/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8704 - accuracy: 0.6081 - val_loss: 0.7333 - val_accuracy: 0.7895
Epoch 69/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8959 - accuracy: 0.5541 - val_loss: 0.7210 - val_accuracy: 0.7895
Epoch 70/200
1/1 [==============================] - 0s 26ms/step - loss: 0.7852 - accuracy: 0.6892 - val_loss: 0.7103 - val_accuracy: 0.7895
Epoch 71/200
1/1 [==============================] - 0s 26ms/step - loss: 0.8703 - accuracy: 0.5676 - val_loss: 0.6960 - val_accuracy: 0.7895
Epoch 72/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8493 - accuracy: 0.6216 - val_loss: 0.6818 - val_accuracy: 0.7895
Epoch 73/200
1/1 [==============================] - 0s 28ms/step - loss: 0.8591 - accuracy: 0.6081 - val_loss: 0.6686 - val_accuracy: 0.7895
Epoch 74/200
1/1 [==============================] - 0s 27ms/step - loss: 0.8284 - accuracy: 0.5946 - val_loss: 0.6552 - val_accuracy: 0.7895
Epoch 75/200
1/1 [==============================] - 0s 28ms/step - loss: 0.7763 - accuracy: 0.6757 - val_loss: 0.6428 - val_accuracy: 0.7895
Epoch 76/200
1/1 [==============================] - 0s 26ms/step - loss: 0.7612 - accuracy: 0.6486 - val_loss: 0.6342 - val_accuracy: 0.7895
Epoch 77/200
1/1 [==============================] - 0s 29ms/step - loss: 0.8494 - accuracy: 0.6622 - val_loss: 0.6285 - val_accuracy: 0.7895
Epoch 78/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7784 - accuracy: 0.6757 - val_loss: 0.6243 - val_accuracy: 0.7895
Epoch 79/200
1/1 [==============================] - 0s 26ms/step - loss: 0.7807 - accuracy: 0.6081 - val_loss: 0.6187 - val_accuracy: 0.7895
Epoch 80/200
1/1 [==============================] - 0s 26ms/step - loss: 0.7544 - accuracy: 0.6486 - val_loss: 0.6139 - val_accuracy: 0.7895
Epoch 81/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7195 - accuracy: 0.6757 - val_loss: 0.6063 - val_accuracy: 0.7895
Epoch 82/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7580 - accuracy: 0.5541 - val_loss: 0.5988 - val_accuracy: 0.7895
Epoch 83/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7504 - accuracy: 0.6757 - val_loss: 0.5883 - val_accuracy: 0.7895
Epoch 84/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7535 - accuracy: 0.6351 - val_loss: 0.5765 - val_accuracy: 0.7895
Epoch 85/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6617 - accuracy: 0.7027 - val_loss: 0.5668 - val_accuracy: 0.7895
Epoch 86/200
1/1 [==============================] - 0s 29ms/step - loss: 0.6893 - accuracy: 0.6757 - val_loss: 0.5601 - val_accuracy: 0.7895
Epoch 87/200
1/1 [==============================] - 0s 28ms/step - loss: 0.7766 - accuracy: 0.6622 - val_loss: 0.5539 - val_accuracy: 0.7895
Epoch 88/200
1/1 [==============================] - 0s 33ms/step - loss: 0.8263 - accuracy: 0.5676 - val_loss: 0.5510 - val_accuracy: 0.7895
Epoch 89/200
1/1 [==============================] - 0s 29ms/step - loss: 0.7405 - accuracy: 0.6216 - val_loss: 0.5515 - val_accuracy: 0.7895
Epoch 90/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6805 - accuracy: 0.6892 - val_loss: 0.5522 - val_accuracy: 0.7895
Epoch 91/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7647 - accuracy: 0.5811 - val_loss: 0.5505 - val_accuracy: 0.7895
Epoch 92/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7871 - accuracy: 0.6216 - val_loss: 0.5438 - val_accuracy: 0.7895
Epoch 93/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6979 - accuracy: 0.6892 - val_loss: 0.5355 - val_accuracy: 0.7895
Epoch 94/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6827 - accuracy: 0.6892 - val_loss: 0.5256 - val_accuracy: 0.7895
Epoch 95/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7583 - accuracy: 0.5946 - val_loss: 0.5207 - val_accuracy: 0.7895
Epoch 96/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7661 - accuracy: 0.6622 - val_loss: 0.5149 - val_accuracy: 0.7895
Epoch 97/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6860 - accuracy: 0.6216 - val_loss: 0.5134 - val_accuracy: 0.7895
Epoch 98/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7218 - accuracy: 0.6351 - val_loss: 0.5094 - val_accuracy: 0.7895
Epoch 99/200
1/1 [==============================] - 0s 26ms/step - loss: 0.8124 - accuracy: 0.5946 - val_loss: 0.5064 - val_accuracy: 0.7895
Epoch 100/200
1/1 [==============================] - 0s 26ms/step - loss: 0.6782 - accuracy: 0.7297 - val_loss: 0.5034 - val_accuracy: 0.7895
Epoch 101/200
1/1 [==============================] - 0s 29ms/step - loss: 0.7300 - accuracy: 0.6757 - val_loss: 0.4995 - val_accuracy: 0.7895
Epoch 102/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6353 - accuracy: 0.6757 - val_loss: 0.4924 - val_accuracy: 0.7895
Epoch 103/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6635 - accuracy: 0.6757 - val_loss: 0.4877 - val_accuracy: 0.7895
Epoch 104/200
1/1 [==============================] - 0s 30ms/step - loss: 0.7488 - accuracy: 0.6351 - val_loss: 0.4850 - val_accuracy: 0.7895
Epoch 105/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6803 - accuracy: 0.6351 - val_loss: 0.4814 - val_accuracy: 0.7895
Epoch 106/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7289 - accuracy: 0.6216 - val_loss: 0.4781 - val_accuracy: 0.7895
Epoch 107/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6911 - accuracy: 0.6351 - val_loss: 0.4770 - val_accuracy: 0.7895
Epoch 108/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6763 - accuracy: 0.6757 - val_loss: 0.4760 - val_accuracy: 0.7895
Epoch 109/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7053 - accuracy: 0.7297 - val_loss: 0.4758 - val_accuracy: 0.7895
Epoch 110/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6434 - accuracy: 0.7027 - val_loss: 0.4767 - val_accuracy: 0.7895
Epoch 111/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6825 - accuracy: 0.6486 - val_loss: 0.4781 - val_accuracy: 0.7895
Epoch 112/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6328 - accuracy: 0.7027 - val_loss: 0.4779 - val_accuracy: 0.7895
Epoch 113/200
1/1 [==============================] - 0s 30ms/step - loss: 0.6800 - accuracy: 0.6486 - val_loss: 0.4769 - val_accuracy: 0.7895
Epoch 114/200
1/1 [==============================] - 0s 29ms/step - loss: 0.6569 - accuracy: 0.7027 - val_loss: 0.4735 - val_accuracy: 0.7895
Epoch 115/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6640 - accuracy: 0.6757 - val_loss: 0.4694 - val_accuracy: 0.7895
Epoch 116/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5961 - accuracy: 0.7568 - val_loss: 0.4640 - val_accuracy: 0.7895
Epoch 117/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6602 - accuracy: 0.6351 - val_loss: 0.4611 - val_accuracy: 0.7895
Epoch 118/200
1/1 [==============================] - 0s 28ms/step - loss: 0.7038 - accuracy: 0.6351 - val_loss: 0.4610 - val_accuracy: 0.8421
Epoch 119/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7826 - accuracy: 0.6081 - val_loss: 0.4608 - val_accuracy: 0.7895
Epoch 120/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6046 - accuracy: 0.7162 - val_loss: 0.4606 - val_accuracy: 0.7895
Epoch 121/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7212 - accuracy: 0.6351 - val_loss: 0.4584 - val_accuracy: 0.7895
Epoch 122/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6361 - accuracy: 0.6892 - val_loss: 0.4575 - val_accuracy: 0.7895
Epoch 123/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6885 - accuracy: 0.6622 - val_loss: 0.4584 - val_accuracy: 0.7895
Epoch 124/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6967 - accuracy: 0.6486 - val_loss: 0.4599 - val_accuracy: 0.8421
Epoch 125/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6326 - accuracy: 0.7027 - val_loss: 0.4631 - val_accuracy: 0.8421
Epoch 126/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6964 - accuracy: 0.6622 - val_loss: 0.4664 - val_accuracy: 0.8421
Epoch 127/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5626 - accuracy: 0.8108 - val_loss: 0.4724 - val_accuracy: 0.7895
Epoch 128/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6771 - accuracy: 0.6622 - val_loss: 0.4708 - val_accuracy: 0.7895
Epoch 129/200
1/1 [==============================] - 0s 26ms/step - loss: 0.6819 - accuracy: 0.7027 - val_loss: 0.4654 - val_accuracy: 0.7895
Epoch 130/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5980 - accuracy: 0.7162 - val_loss: 0.4596 - val_accuracy: 0.7895
Epoch 131/200
1/1 [==============================] - 0s 31ms/step - loss: 0.5978 - accuracy: 0.6892 - val_loss: 0.4556 - val_accuracy: 0.7895
Epoch 132/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6286 - accuracy: 0.7027 - val_loss: 0.4548 - val_accuracy: 0.7895
Epoch 133/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5561 - accuracy: 0.7703 - val_loss: 0.4567 - val_accuracy: 0.7895
Epoch 134/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6529 - accuracy: 0.7162 - val_loss: 0.4622 - val_accuracy: 0.7895
Epoch 135/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5997 - accuracy: 0.7027 - val_loss: 0.4692 - val_accuracy: 0.7895
Epoch 136/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6250 - accuracy: 0.7027 - val_loss: 0.4711 - val_accuracy: 0.7895
Epoch 137/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5961 - accuracy: 0.7162 - val_loss: 0.4695 - val_accuracy: 0.7895
Epoch 138/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5883 - accuracy: 0.6892 - val_loss: 0.4647 - val_accuracy: 0.7895
Epoch 139/200
1/1 [==============================] - 0s 29ms/step - loss: 0.6457 - accuracy: 0.7027 - val_loss: 0.4601 - val_accuracy: 0.7895
Epoch 140/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6708 - accuracy: 0.6757 - val_loss: 0.4629 - val_accuracy: 0.7895
Epoch 141/200
1/1 [==============================] - 0s 27ms/step - loss: 0.7040 - accuracy: 0.6622 - val_loss: 0.4806 - val_accuracy: 0.7895
Epoch 142/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6145 - accuracy: 0.7432 - val_loss: 0.5073 - val_accuracy: 0.7895
Epoch 143/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6031 - accuracy: 0.7432 - val_loss: 0.5123 - val_accuracy: 0.7895
Epoch 144/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6750 - accuracy: 0.6757 - val_loss: 0.4918 - val_accuracy: 0.7895
Epoch 145/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6158 - accuracy: 0.6622 - val_loss: 0.4673 - val_accuracy: 0.7895
Epoch 146/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5883 - accuracy: 0.7432 - val_loss: 0.4587 - val_accuracy: 0.7895
Epoch 147/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6341 - accuracy: 0.7432 - val_loss: 0.4622 - val_accuracy: 0.7895
Epoch 148/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6748 - accuracy: 0.6757 - val_loss: 0.4754 - val_accuracy: 0.7895
Epoch 149/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6118 - accuracy: 0.7297 - val_loss: 0.4958 - val_accuracy: 0.7895
Epoch 150/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5697 - accuracy: 0.7568 - val_loss: 0.5049 - val_accuracy: 0.7895
Epoch 151/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5974 - accuracy: 0.7027 - val_loss: 0.4982 - val_accuracy: 0.7895
Epoch 152/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6279 - accuracy: 0.7297 - val_loss: 0.4897 - val_accuracy: 0.7895
Epoch 153/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6029 - accuracy: 0.7432 - val_loss: 0.4893 - val_accuracy: 0.7895
Epoch 154/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5966 - accuracy: 0.6892 - val_loss: 0.4993 - val_accuracy: 0.7895
Epoch 155/200
1/1 [==============================] - 0s 29ms/step - loss: 0.5968 - accuracy: 0.7162 - val_loss: 0.4997 - val_accuracy: 0.7895
Epoch 156/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6125 - accuracy: 0.7297 - val_loss: 0.4955 - val_accuracy: 0.7895
Epoch 157/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6205 - accuracy: 0.7432 - val_loss: 0.4962 - val_accuracy: 0.7895
Epoch 158/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6400 - accuracy: 0.6622 - val_loss: 0.4966 - val_accuracy: 0.7895
Epoch 159/200
1/1 [==============================] - 0s 28ms/step - loss: 0.6432 - accuracy: 0.6892 - val_loss: 0.4917 - val_accuracy: 0.7895
Epoch 160/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5253 - accuracy: 0.7703 - val_loss: 0.4864 - val_accuracy: 0.7895
Epoch 161/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6324 - accuracy: 0.6757 - val_loss: 0.4838 - val_accuracy: 0.7895
Epoch 162/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6549 - accuracy: 0.6892 - val_loss: 0.4918 - val_accuracy: 0.7895
Epoch 163/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5720 - accuracy: 0.7162 - val_loss: 0.5161 - val_accuracy: 0.7895
Epoch 164/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5958 - accuracy: 0.7162 - val_loss: 0.5286 - val_accuracy: 0.7895
Epoch 165/200
1/1 [==============================] - 0s 27ms/step - loss: 0.6366 - accuracy: 0.6351 - val_loss: 0.5247 - val_accuracy: 0.7895
Epoch 166/200
1/1 [==============================] - 0s 26ms/step - loss: 0.5451 - accuracy: 0.7703 - val_loss: 0.5077 - val_accuracy: 0.7895
Epoch 167/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5748 - accuracy: 0.7162 - val_loss: 0.4969 - val_accuracy: 0.7895
Epoch 168/200
1/1 [==============================] - 0s 29ms/step - loss: 0.5710 - accuracy: 0.7432 - val_loss: 0.4874 - val_accuracy: 0.7895
Epoch 169/200
1/1 [==============================] - 0s 29ms/step - loss: 0.6278 - accuracy: 0.7297 - val_loss: 0.4915 - val_accuracy: 0.7895
Epoch 170/200
1/1 [==============================] - 0s 38ms/step - loss: 0.6030 - accuracy: 0.7162 - val_loss: 0.4964 - val_accuracy: 0.7895
Epoch 171/200
1/1 [==============================] - 0s 34ms/step - loss: 0.5251 - accuracy: 0.7703 - val_loss: 0.5068 - val_accuracy: 0.7895
Epoch 172/200
1/1 [==============================] - 0s 29ms/step - loss: 0.5589 - accuracy: 0.7162 - val_loss: 0.5274 - val_accuracy: 0.7895
Epoch 173/200
1/1 [==============================] - 0s 26ms/step - loss: 0.6435 - accuracy: 0.7162 - val_loss: 0.5385 - val_accuracy: 0.7895
Epoch 174/200
1/1 [==============================] - 0s 32ms/step - loss: 0.5588 - accuracy: 0.7162 - val_loss: 0.5348 - val_accuracy: 0.7895
Epoch 175/200
1/1 [==============================] - 0s 35ms/step - loss: 0.5468 - accuracy: 0.7568 - val_loss: 0.5230 - val_accuracy: 0.7895
Epoch 176/200
1/1 [==============================] - 0s 32ms/step - loss: 0.6330 - accuracy: 0.6892 - val_loss: 0.5259 - val_accuracy: 0.7895
Epoch 177/200
1/1 [==============================] - 0s 31ms/step - loss: 0.6258 - accuracy: 0.7027 - val_loss: 0.5343 - val_accuracy: 0.7895
Epoch 178/200
1/1 [==============================] - 0s 29ms/step - loss: 0.5607 - accuracy: 0.7162 - val_loss: 0.5456 - val_accuracy: 0.7895
Epoch 179/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5309 - accuracy: 0.7703 - val_loss: 0.5609 - val_accuracy: 0.7895
Epoch 180/200
1/1 [==============================] - 0s 37ms/step - loss: 0.5459 - accuracy: 0.7838 - val_loss: 0.5681 - val_accuracy: 0.7895
Epoch 181/200
1/1 [==============================] - 0s 37ms/step - loss: 0.6314 - accuracy: 0.7162 - val_loss: 0.5662 - val_accuracy: 0.7895
Epoch 182/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5203 - accuracy: 0.7703 - val_loss: 0.5527 - val_accuracy: 0.7895
Epoch 183/200
1/1 [==============================] - 0s 33ms/step - loss: 0.5764 - accuracy: 0.7432 - val_loss: 0.5244 - val_accuracy: 0.7895
Epoch 184/200
1/1 [==============================] - 0s 29ms/step - loss: 0.5249 - accuracy: 0.7703 - val_loss: 0.5006 - val_accuracy: 0.7895
Epoch 185/200
1/1 [==============================] - 0s 30ms/step - loss: 0.5956 - accuracy: 0.7162 - val_loss: 0.5117 - val_accuracy: 0.7895
Epoch 186/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5632 - accuracy: 0.7297 - val_loss: 0.5399 - val_accuracy: 0.7895
Epoch 187/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5711 - accuracy: 0.7568 - val_loss: 0.5639 - val_accuracy: 0.7895
Epoch 188/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5793 - accuracy: 0.7568 - val_loss: 0.5698 - val_accuracy: 0.7895
Epoch 189/200
1/1 [==============================] - 0s 29ms/step - loss: 0.5031 - accuracy: 0.7703 - val_loss: 0.5574 - val_accuracy: 0.7895
Epoch 190/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5482 - accuracy: 0.6757 - val_loss: 0.5321 - val_accuracy: 0.7895
Epoch 191/200
1/1 [==============================] - 0s 33ms/step - loss: 0.5899 - accuracy: 0.7297 - val_loss: 0.5058 - val_accuracy: 0.7895
Epoch 192/200
1/1 [==============================] - 0s 29ms/step - loss: 0.5548 - accuracy: 0.7162 - val_loss: 0.5085 - val_accuracy: 0.7895
Epoch 193/200
1/1 [==============================] - 0s 29ms/step - loss: 0.5299 - accuracy: 0.7297 - val_loss: 0.5173 - val_accuracy: 0.7895
Epoch 194/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5057 - accuracy: 0.7838 - val_loss: 0.5230 - val_accuracy: 0.7895
Epoch 195/200
1/1 [==============================] - 0s 29ms/step - loss: 0.6135 - accuracy: 0.7703 - val_loss: 0.5327 - val_accuracy: 0.7895
Epoch 196/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5791 - accuracy: 0.7027 - val_loss: 0.5404 - val_accuracy: 0.7895
Epoch 197/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5258 - accuracy: 0.7027 - val_loss: 0.5314 - val_accuracy: 0.7895
Epoch 198/200
1/1 [==============================] - 0s 30ms/step - loss: 0.5717 - accuracy: 0.7162 - val_loss: 0.5192 - val_accuracy: 0.7895
Epoch 199/200
1/1 [==============================] - 0s 28ms/step - loss: 0.5407 - accuracy: 0.7297 - val_loss: 0.5185 - val_accuracy: 0.7895
Epoch 200/200
1/1 [==============================] - 0s 27ms/step - loss: 0.5488 - accuracy: 0.7432 - val_loss: 0.5206 - val_accuracy: 0.7895

Model performance could be improved further by training for more epochs, gathering more training data, or adding custom callbacks such as early stopping and model checkpointing.
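
For example, a hedged sketch of such callbacks (EarlyStopping and ModelCheckpoint are standard Keras callbacks; the patience value and filename here are illustrative choices):

from keras.callbacks import EarlyStopping, ModelCheckpoint

# Stop training once validation loss stops improving and keep the best
# weights seen so far on disk.
callbacks = [
    EarlyStopping(monitor='val_loss', patience=20, restore_best_weights=True),
    ModelCheckpoint('HAD_best.h5', monitor='val_loss', save_best_only=True),
]
# history = model.fit(x_train, y_train, batch_size=256, epochs=200,
#                     validation_data=(x_test, y_test), callbacks=callbacks)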

Training Plot

In [34]:
# Loss Curves
plt.figure(figsize=[14,10])
plt.subplot(211)
plt.plot(history.history['loss'],'r',linewidth=3.0)
plt.plot(history.history['val_loss'],'b',linewidth=3.0)
plt.legend(['Training Loss', 'Validation Loss'],fontsize=18)
plt.xlabel('Epochs',fontsize=16)
plt.ylabel('Loss',fontsize=16)
plt.title('Loss Curves',fontsize=16)

# Accuracy Curves
plt.subplot(212)
plt.plot(history.history['accuracy'],'r',linewidth=3.0)
plt.plot(history.history['val_accuracy'],'b',linewidth=3.0)
plt.legend(['Training Accuracy', 'Validation Accuracy'],fontsize=18)
plt.xlabel('Epochs',fontsize=16)
plt.ylabel('Accuracy',fontsize=16)
plt.title('Accuracy Curves',fontsize=16)
Out[34]:
Text(0.5, 1.0, 'Accuracy Curves')

Save model

In [35]:
# Save model and weights
model_name = "HAD.h5"
model.save(model_name)
print('Saved trained model at %s ' % model_name)
Saved trained model at HAD.h5 

Evaluate model

In [36]:
scores = model.evaluate(x_test, y_test, verbose=1)
print('Test loss:', scores[0])
print('Test accuracy:', scores[1])
1/1 [==============================] - 0s 1ms/step - loss: 0.5206 - accuracy: 0.7895
Test loss: 0.520578145980835
Test accuracy: 0.7894737124443054

Assessing Model Performance

In [37]:
predictions = model.predict(x_test, verbose=1)
1/1 [==============================] - 0s 1ms/step
In [38]:
y_preds = predictions.argmax(axis=-1)
In [39]:
y_test = y_test.argmax(axis=-1)
In [40]:
y_pred = encoder.inverse_transform(y_preds)

y_test = encoder.inverse_transform(y_test)
In [41]:
df = pd.DataFrame(columns=['Predicted Labels', 'Actual Labels'])
df['Predicted Labels'] = y_pred.flatten()
df['Actual Labels'] = y_test.flatten()
df.head(19)
Out[41]:
Predicted Labels Actual Labels
0 murmur normal
1 artifact artifact
2 murmur murmur
3 extrahls normal
4 murmur extrahls
5 artifact normal
6 artifact artifact
7 artifact artifact
8 murmur murmur
9 artifact artifact
10 artifact artifact
11 murmur murmur
12 murmur murmur
13 artifact artifact
14 murmur murmur
15 murmur murmur
16 murmur murmur
17 artifact artifact
18 murmur murmur
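
Since classification_report and confusion_matrix were already imported, a natural follow-up (a sketch, not an original cell) is to summarise these 19 predictions per class:

# Per-class precision/recall and the confusion matrix for the test set;
# both work directly on the string labels produced above.
print(classification_report(y_test, y_pred))
print(confusion_matrix(y_test, y_pred, labels=encoder.classes_))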

Compiling model with DeepC

In [42]:
!deepCC HAD.h5
reading [keras model] from 'HAD.h5'
Saved 'HAD.onnx'
reading onnx model from file  HAD.onnx
Model info:
  ir_vesion :  4 
  doc       : 
WARN (ONNX): terminal (input/output) conv2d_input's shape is less than 1.
             changing it to 1.
WARN (ONNX): terminal (input/output) dense_2's shape is less than 1.
             changing it to 1.
WARN (GRAPH): found operator node with the same name (dense_2) as io node.
running DNNC graph sanity check ... passed.
Writing C++ file  HAD_deepC/HAD.cpp
INFO (ONNX): model files are ready in dir HAD_deepC
g++ -std=c++11 -O3 -I. -I/opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/include -isystem /opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/packages/eigen-eigen-323c052e1731 HAD_deepC/HAD.cpp -o HAD_deepC/HAD.exe
Model executable  HAD_deepC/HAD.exe