Prostate Cancer Detection¶
Credit: AITS Cainvas Community
Photo by Mother Volcano on Dribbble
Importing Libraries¶
In [1]:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from sklearn.metrics import accuracy_score
In [2]:
Cancer = pd.read_csv('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/Prostate_Cancer_Data-_CSV.csv')
In [3]:
Cancer.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 100 entries, 0 to 99
Data columns (total 10 columns):
 #   Column             Non-Null Count  Dtype  
---  ------             --------------  -----  
 0   id                 100 non-null    int64  
 1   diagnosis_result   100 non-null    object 
 2   radius             100 non-null    int64  
 3   texture            100 non-null    int64  
 4   perimeter          100 non-null    int64  
 5   area               100 non-null    int64  
 6   smoothness         100 non-null    float64
 7   compactness        100 non-null    float64
 8   symmetry           100 non-null    float64
 9   fractal_dimension  100 non-null    float64
dtypes: float64(4), int64(5), object(1)
memory usage: 7.9+ KB
In [4]:
Cancer.head(10)
Out[4]:
|   | id | diagnosis_result | radius | texture | perimeter | area | smoothness | compactness | symmetry | fractal_dimension |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | M | 23 | 12 | 151 | 954 | 0.143 | 0.278 | 0.242 | 0.079 |
| 1 | 2 | B | 9 | 13 | 133 | 1326 | 0.143 | 0.079 | 0.181 | 0.057 |
| 2 | 3 | M | 21 | 27 | 130 | 1203 | 0.125 | 0.160 | 0.207 | 0.060 |
| 3 | 4 | M | 14 | 16 | 78 | 386 | 0.070 | 0.284 | 0.260 | 0.097 |
| 4 | 5 | M | 9 | 19 | 135 | 1297 | 0.141 | 0.133 | 0.181 | 0.059 |
| 5 | 6 | B | 25 | 25 | 83 | 477 | 0.128 | 0.170 | 0.209 | 0.076 |
| 6 | 7 | M | 16 | 26 | 120 | 1040 | 0.095 | 0.109 | 0.179 | 0.057 |
| 7 | 8 | M | 15 | 18 | 90 | 578 | 0.119 | 0.165 | 0.220 | 0.075 |
| 8 | 9 | M | 19 | 24 | 88 | 520 | 0.127 | 0.193 | 0.235 | 0.074 |
| 9 | 10 | M | 25 | 11 | 84 | 476 | 0.119 | 0.240 | 0.203 | 0.082 |
In [5]:
Cancer.tail()
Out[5]:
|    | id | diagnosis_result | radius | texture | perimeter | area | smoothness | compactness | symmetry | fractal_dimension |
|----|----|---|---|---|---|---|---|---|---|---|
| 95 | 96 | M | 23 | 16 | 132 | 1264 | 0.091 | 0.131 | 0.210 | 0.056 |
| 96 | 97 | B | 22 | 14 | 78 | 451 | 0.105 | 0.071 | 0.190 | 0.066 |
| 97 | 98 | B | 19 | 27 | 62 | 295 | 0.102 | 0.053 | 0.135 | 0.069 |
| 98 | 99 | B | 21 | 24 | 74 | 413 | 0.090 | 0.075 | 0.162 | 0.066 |
| 99 | 100 | M | 16 | 27 | 94 | 643 | 0.098 | 0.114 | 0.188 | 0.064 |
In [6]:
Cancer.describe()
Out[6]:
|       | id | radius | texture | perimeter | area | smoothness | compactness | symmetry | fractal_dimension |
|-------|----|--------|---------|-----------|------|------------|-------------|----------|-------------------|
| count | 100.000000 | 100.000000 | 100.000000 | 100.000000 | 100.000000 | 100.000000 | 100.000000 | 100.000000 | 100.000000 |
| mean  | 50.500000 | 16.850000 | 18.230000 | 96.780000 | 702.880000 | 0.102730 | 0.126700 | 0.193170 | 0.064690 |
| std   | 29.011492 | 4.879094 | 5.192954 | 23.676089 | 319.710895 | 0.014642 | 0.061144 | 0.030785 | 0.008151 |
| min   | 1.000000 | 9.000000 | 11.000000 | 52.000000 | 202.000000 | 0.070000 | 0.038000 | 0.135000 | 0.053000 |
| 25%   | 25.750000 | 12.000000 | 14.000000 | 82.500000 | 476.750000 | 0.093500 | 0.080500 | 0.172000 | 0.059000 |
| 50%   | 50.500000 | 17.000000 | 17.500000 | 94.000000 | 644.000000 | 0.102000 | 0.118500 | 0.190000 | 0.063000 |
| 75%   | 75.250000 | 21.000000 | 22.250000 | 114.250000 | 917.000000 | 0.112000 | 0.157000 | 0.209000 | 0.069000 |
| max   | 100.000000 | 25.000000 | 27.000000 | 172.000000 | 1878.000000 | 0.143000 | 0.345000 | 0.304000 | 0.097000 |
In [7]:
Cancer.columns
Out[7]:
Index(['id', 'diagnosis_result', 'radius', 'texture', 'perimeter', 'area', 'smoothness', 'compactness', 'symmetry', 'fractal_dimension'], dtype='object')
In [8]:
# The id column carries no predictive information, so we drop it.
Cancer.drop(['id'],axis=1,inplace=True)
In [9]:
Cancer.head()
Out[9]:
|   | diagnosis_result | radius | texture | perimeter | area | smoothness | compactness | symmetry | fractal_dimension |
|---|---|---|---|---|---|---|---|---|---|
| 0 | M | 23 | 12 | 151 | 954 | 0.143 | 0.278 | 0.242 | 0.079 |
| 1 | B | 9 | 13 | 133 | 1326 | 0.143 | 0.079 | 0.181 | 0.057 |
| 2 | M | 21 | 27 | 130 | 1203 | 0.125 | 0.160 | 0.207 | 0.060 |
| 3 | M | 14 | 16 | 78 | 386 | 0.070 | 0.284 | 0.260 | 0.097 |
| 4 | M | 9 | 19 | 135 | 1297 | 0.141 | 0.133 | 0.181 | 0.059 |
Exploratory Data Analysis¶
In [10]:
# diagnosis_result is the target column: we will classify samples based on it.
# Its labels are strings, so we encode them as integers: 'M' (malignant) -> 1, 'B' (benign) -> 0.
Cancer.diagnosis_result = [1 if each == 'M' else 0 for each in Cancer.diagnosis_result]
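The same encoding can also be written with `Series.map`; a minimal sketch on a hypothetical mini-series (the values below are illustrative, not taken from the dataset):

```python
import pandas as pd

# Hypothetical stand-in for Cancer.diagnosis_result
labels = pd.Series(['M', 'B', 'M', 'B', 'B'])

# Equivalent to the list comprehension above: 'M' -> 1, 'B' -> 0
encoded = labels.map({'M': 1, 'B': 0})
```

`map` has the advantage of producing `NaN` for any unexpected label, which makes data-entry errors visible instead of silently encoding them as 0.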
In [11]:
# Let's check it.
Cancer.diagnosis_result.value_counts()
Out[11]:
1    62
0    38
Name: diagnosis_result, dtype: int64
In [12]:
plt.figure(figsize=(10,8))
plt.bar(list(Cancer['diagnosis_result'].value_counts().index), Cancer['diagnosis_result'].value_counts(), color = ['b','r'])
plt.title('Diagnosis Result')
plt.show()
print(Cancer['diagnosis_result'].value_counts())
1    62
0    38
Name: diagnosis_result, dtype: int64
Training and Testing Dataset¶
In [13]:
# Assign the feature matrix x and the target vector y for the train-test split.
y = Cancer.diagnosis_result.values
x_data = Cancer.drop(['diagnosis_result'],axis=1)
In [14]:
# See our values
y
Out[14]:
array([1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 0, 1, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1])
In [15]:
x_data.head()
Out[15]:
|   | radius | texture | perimeter | area | smoothness | compactness | symmetry | fractal_dimension |
|---|---|---|---|---|---|---|---|---|
| 0 | 23 | 12 | 151 | 954 | 0.143 | 0.278 | 0.242 | 0.079 |
| 1 | 9 | 13 | 133 | 1326 | 0.143 | 0.079 | 0.181 | 0.057 |
| 2 | 21 | 27 | 130 | 1203 | 0.125 | 0.160 | 0.207 | 0.060 |
| 3 | 14 | 16 | 78 | 386 | 0.070 | 0.284 | 0.260 | 0.097 |
| 4 | 9 | 19 | 135 | 1297 | 0.141 | 0.133 | 0.181 | 0.059 |
Normalization¶
In [16]:
# Normalization rescales every feature to the [0, 1] range, so no feature dominates by sheer magnitude.
scaler = MinMaxScaler(feature_range=(0,1))
x = scaler.fit_transform(x_data)
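As a sanity check of what `MinMaxScaler` computes, each column is transformed as (x - min) / (max - min). A sketch on a toy two-feature matrix (the toy values are assumptions for illustration):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Toy two-feature matrix standing in for x_data
data = np.array([[1.0, 10.0],
                 [2.0, 40.0],
                 [3.0, 30.0]])

scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(data)

# MinMaxScaler applies (x - col_min) / (col_max - col_min) column-wise
manual = (data - data.min(axis=0)) / (data.max(axis=0) - data.min(axis=0))
```

Both computations agree element-wise; after scaling, each column's minimum is 0 and maximum is 1.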
In [17]:
x
Out[17]:
array([[0.875     , 0.0625    , 0.825     , 0.44868735, 1.        ,
        0.78175896, 0.63313609, 0.59090909],
       [0.        , 0.125     , 0.675     , 0.67064439, 1.        ,
        0.13355049, 0.27218935, 0.09090909],
       [0.75      , 1.        , 0.65      , 0.59725537, 0.75342466,
        0.39739414, 0.4260355 , 0.15909091],
       ...,
       [0.75      , 0.8125    , 0.18333333, 0.12589499, 0.2739726 ,
        0.12052117, 0.15976331, 0.29545455],
       [0.4375    , 1.        , 0.35      , 0.26312649, 0.38356164,
        0.247557  , 0.31360947, 0.25      ]])
Train-Test Split¶
In [18]:
# Split the data into training and test sets.
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=42)
# 20% of the data is held out as the test set.
method_names = []  # Used in the Conclusion to compare which method gave the best result.
method_scores = []
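With only 100 rows and a 62/38 class balance, an ordinary random split can leave the test set with a skewed class ratio. A stratified split keeps the ratio consistent in both halves; a sketch on toy arrays mirroring this dataset's balance (an optional refinement, not what the notebook uses above):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy data mirroring the 62 malignant / 38 benign balance
X_demo = np.arange(100).reshape(-1, 1)
y_demo = np.array([1] * 62 + [0] * 38)

# stratify=y preserves the 62/38 class ratio in both splits
X_tr, X_te, y_tr, y_te = train_test_split(
    X_demo, y_demo, test_size=0.2, random_state=42, stratify=y_demo)
```

With `test_size=0.2` the test set gets 20 samples, of which roughly 62% are positives.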
In [19]:
# Inspect the scaled training features.
x_train
Out[19]:
array([[0.5625    , 0.125     , 0.175     , 0.12350835, 0.34246575,
        0.05537459, 0.33727811, 0.13636364],
       [0.125     , 0.        , 0.23333333, 0.1575179 , 0.24657534,
        0.18241042, 0.34319527, 0.25      ],
       [0.0625    , 0.8125    , 0.375     , 0.26431981, 0.47945205,
        0.48534202, 0.53254438, 0.36363636],
       ...,
       [0.3125    , 0.1875    , 0.275     , 0.20883055, 0.05479452,
        0.04234528, 0.02366864, 0.        ],
       [0.625     , 0.6875    , 0.29166667, 0.22076372, 0.09589041,
        0.07491857, 0.        , 0.15909091]])
Model Using ANN¶
In [35]:
model = Sequential()
model.add(Dense(32, activation='relu', input_dim=x_train.shape[1]))  # 8 input features
model.add(Dense(64, activation='relu'))
model.add(Dense(1, activation='sigmoid'))  # sigmoid output for binary classification
model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_3 (Dense)              (None, 32)                288       
_________________________________________________________________
dense_4 (Dense)              (None, 64)                2112      
_________________________________________________________________
dense_5 (Dense)              (None, 1)                 65        
=================================================================
Total params: 2,465
Trainable params: 2,465
Non-trainable params: 0
_________________________________________________________________
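The parameter counts in the summary can be verified by hand: a Dense layer with n inputs and m units has (n + 1) * m parameters, i.e. an n-by-m weight matrix plus one bias per unit.

```python
# (inputs + 1 bias) * units for each Dense layer
p1 = (8 + 1) * 32    # input layer: 8 features -> 32 units
p2 = (32 + 1) * 64   # hidden layer: 32 -> 64 units
p3 = (64 + 1) * 1    # output layer: 64 -> 1 sigmoid unit
total = p1 + p2 + p3
```

These reproduce the 288, 2112, and 65 per-layer counts and the 2,465 total reported by `model.summary()`.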
In [36]:
model.compile(optimizer = 'adam', loss='binary_crossentropy', metrics=['accuracy'])
history = model.fit(x_train, y_train, epochs = 120, validation_data=(x_test,y_test))
Epoch 1/120
3/3 [==============================] - 1s 299ms/step - loss: 0.7005 - accuracy: 0.3875 - val_loss: 0.6852 - val_accuracy: 0.7000
Epoch 2/120
3/3 [==============================] - 0s 68ms/step - loss: 0.6901 - accuracy: 0.5875 - val_loss: 0.6650 - val_accuracy: 0.8000
Epoch 3/120
3/3 [==============================] - 0s 65ms/step - loss: 0.6805 - accuracy: 0.5750 - val_loss: 0.6497 - val_accuracy: 0.8000
...
Epoch 97/120
3/3 [==============================] - 0s 36ms/step - loss: 0.2391 - accuracy: 0.9250 - val_loss: 0.5166 - val_accuracy: 0.8500
Epoch 98/120
3/3 [==============================] - 0s 98ms/step - loss: 0.2355 - accuracy: 0.9250 - val_loss: 0.5182 - val_accuracy: 0.8500
Epoch 99/120
3/3
[==============================] - 0s 62ms/step - loss: 0.2347 - accuracy: 0.9250 - val_loss: 0.5194 - val_accuracy: 0.8500 Epoch 100/120 3/3 [==============================] - 0s 62ms/step - loss: 0.2329 - accuracy: 0.9250 - val_loss: 0.5240 - val_accuracy: 0.8500 Epoch 101/120 3/3 [==============================] - 0s 35ms/step - loss: 0.2321 - accuracy: 0.9250 - val_loss: 0.5303 - val_accuracy: 0.8500 Epoch 102/120 3/3 [==============================] - 0s 33ms/step - loss: 0.2314 - accuracy: 0.9250 - val_loss: 0.5317 - val_accuracy: 0.8500 Epoch 103/120 3/3 [==============================] - 0s 98ms/step - loss: 0.2302 - accuracy: 0.9250 - val_loss: 0.5354 - val_accuracy: 0.8500 Epoch 104/120 3/3 [==============================] - 0s 60ms/step - loss: 0.2297 - accuracy: 0.9250 - val_loss: 0.5370 - val_accuracy: 0.8500 Epoch 105/120 3/3 [==============================] - 0s 32ms/step - loss: 0.2290 - accuracy: 0.9250 - val_loss: 0.5336 - val_accuracy: 0.8500 Epoch 106/120 3/3 [==============================] - 0s 65ms/step - loss: 0.2274 - accuracy: 0.9250 - val_loss: 0.5333 - val_accuracy: 0.8500 Epoch 107/120 3/3 [==============================] - 0s 35ms/step - loss: 0.2266 - accuracy: 0.9375 - val_loss: 0.5348 - val_accuracy: 0.8500 Epoch 108/120 3/3 [==============================] - 0s 59ms/step - loss: 0.2262 - accuracy: 0.9250 - val_loss: 0.5379 - val_accuracy: 0.8500 Epoch 109/120 3/3 [==============================] - 0s 33ms/step - loss: 0.2252 - accuracy: 0.9250 - val_loss: 0.5395 - val_accuracy: 0.8500 Epoch 110/120 3/3 [==============================] - 0s 32ms/step - loss: 0.2256 - accuracy: 0.9250 - val_loss: 0.5383 - val_accuracy: 0.8500 Epoch 111/120 3/3 [==============================] - 0s 9ms/step - loss: 0.2260 - accuracy: 0.9250 - val_loss: 0.5427 - val_accuracy: 0.8500 Epoch 112/120 3/3 [==============================] - 0s 63ms/step - loss: 0.2233 - accuracy: 0.9250 - val_loss: 0.5409 - val_accuracy: 0.8500 Epoch 113/120 3/3 
[==============================] - 0s 63ms/step - loss: 0.2226 - accuracy: 0.9375 - val_loss: 0.5362 - val_accuracy: 0.8500 Epoch 114/120 3/3 [==============================] - 0s 10ms/step - loss: 0.2234 - accuracy: 0.9250 - val_loss: 0.5366 - val_accuracy: 0.8500 Epoch 115/120 3/3 [==============================] - 0s 11ms/step - loss: 0.2227 - accuracy: 0.9250 - val_loss: 0.5393 - val_accuracy: 0.8500 Epoch 116/120 3/3 [==============================] - 0s 45ms/step - loss: 0.2210 - accuracy: 0.9250 - val_loss: 0.5431 - val_accuracy: 0.8500 Epoch 117/120 3/3 [==============================] - 0s 67ms/step - loss: 0.2187 - accuracy: 0.9375 - val_loss: 0.5509 - val_accuracy: 0.8000 Epoch 118/120 3/3 [==============================] - 0s 41ms/step - loss: 0.2233 - accuracy: 0.9125 - val_loss: 0.5656 - val_accuracy: 0.8000 Epoch 119/120 3/3 [==============================] - 0s 35ms/step - loss: 0.2214 - accuracy: 0.9000 - val_loss: 0.5635 - val_accuracy: 0.8000 Epoch 120/120 3/3 [==============================] - 0s 42ms/step - loss: 0.2222 - accuracy: 0.9125 - val_loss: 0.5548 - val_accuracy: 0.8000
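In the log above, `val_loss` bottoms out around epochs 62–76 and then drifts upward while the training loss keeps falling, a typical overfitting pattern. An `EarlyStopping` callback would halt training near the best point instead of running all 120 epochs; a minimal sketch, assuming the same `model.fit` call as in this notebook:

```python
from tensorflow.keras.callbacks import EarlyStopping

# Stop once val_loss has not improved for 10 epochs,
# and roll back to the best weights seen so far
early = EarlyStopping(monitor='val_loss', patience=10,
                      restore_best_weights=True)

# history = model.fit(x_train, y_train, validation_split=0.2,
#                     epochs=120, callbacks=[early])
```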
In [21]:
method_names.append("ANN")
method_scores.append(0.851)
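The two lists collect (name, score) pairs so the models tried in this notebook can be compared at the end; a minimal sketch of such a comparison, where every entry except "ANN" is purely hypothetical:

```python
# Hypothetical scores for earlier classifiers, plus the ANN score recorded above
method_names = ["Logistic Regression", "SVM", "ANN"]
method_scores = [0.80, 0.82, 0.851]

# Rank the models by score, best first
ranking = sorted(zip(method_names, method_scores),
                 key=lambda p: p[1], reverse=True)
for name, score in ranking:
    print(f"{name}: {score:.3f}")
```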
In [22]:
trainX = np.reshape(x_train, (x_train.shape[0], x_train.shape[1],1))
testX = np.reshape(x_test, (x_test.shape[0],x_test.shape[1],1))
# Print and check shapes
print("Shape of trainX is {}".format(trainX.shape))
print("Shape of testX is {}".format(testX.shape))
Shape of trainX is (80, 8, 1)
Shape of testX is (20, 8, 1)
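The reshape only appends a trailing channel axis of length 1, the layout that sequence layers such as `Conv1D` or `LSTM` expect; a self-contained sketch using a dummy array with the same shape as `x_train`:

```python
import numpy as np

x_train = np.zeros((80, 8))  # stand-in for the 80 x 8 scaled training matrix

# Append a trailing axis of size 1: (80, 8) -> (80, 8, 1)
trainX = np.reshape(x_train, (x_train.shape[0], x_train.shape[1], 1))
print(trainX.shape)  # (80, 8, 1)

# np.newaxis indexing is an equivalent, often more idiomatic way
assert trainX.shape == x_train[..., np.newaxis].shape
```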
Plotting Accuracy and Loss Graphs¶
In [24]:
plt.figure(figsize=(10,8))
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'],color='orange')
plt.legend(['train', 'val'], loc='lower right')
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.show()
In [25]:
plt.figure(figsize=(10,8))
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.legend(['train', 'val'], loc='upper right')
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.show()
In [26]:
y_pred = (model.predict(x_test) > 0.5).astype("int32")  # threshold the sigmoid output; replaces the deprecated predict_classes
y_pred
Out[26]:
array([[1], [1], [1], [1], [1], [1], [1], [0], [1], [1], [1], [1], [1], [1], [1], [1], [0], [1], [1], [1]], dtype=int32)
In [27]:
y_test.shape
Out[27]:
(20,)
In [28]:
y_pred = np.squeeze(y_pred)
y_pred.shape
Out[28]:
(20,)
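`np.squeeze` drops every axis of length 1, turning the (20, 1) prediction column into the flat (20,) vector that `accuracy_score` can compare against `y_test`; a small sketch:

```python
import numpy as np

col = np.array([[1], [0], [1]], dtype=np.int32)  # shape (3, 1), like the model output
flat = np.squeeze(col)                           # shape (3,)
print(flat.shape)  # (3,)
```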
In [29]:
print('Test Accuracy : ',accuracy_score(y_test, y_pred))
Test Accuracy : 0.9
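Accuracy alone hides which kind of error the model makes; a confusion matrix and per-class report give a fuller picture. A sketch with hypothetical labels standing in for `y_test`/`y_pred`, assuming B/M were encoded as 0/1:

```python
import numpy as np
from sklearn.metrics import confusion_matrix, classification_report

y_true = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])  # hypothetical ground truth
y_hat  = np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 0])  # hypothetical predictions

cm = confusion_matrix(y_true, y_hat)  # rows: true class, columns: predicted class
print(cm)
print(classification_report(y_true, y_hat, target_names=["B", "M"]))
```

For a screening task like this one, the false-negative count (a malignant case predicted benign) is usually the number to watch.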
Saving the Model¶
In [32]:
model.save('prostatecancer.h5')
DeepC¶
In [33]:
!deepCC prostatecancer.h5
[INFO] Reading [keras model] 'prostatecancer.h5'
[SUCCESS] Saved 'prostatecancer_deepC/prostatecancer.onnx'
[INFO] Reading [onnx model] 'prostatecancer_deepC/prostatecancer.onnx'
[INFO] Model info:
  ir_vesion : 4
  doc       :
[WARNING] [ONNX]: terminal (input/output) dense_input's shape is less than 1. Changing it to 1.
[WARNING] [ONNX]: terminal (input/output) dense_2's shape is less than 1. Changing it to 1.
[INFO] Running DNNC graph sanity check ...
[SUCCESS] Passed sanity check.
[INFO] Writing C++ file 'prostatecancer_deepC/prostatecancer.cpp'
[INFO] deepSea model files are ready in 'prostatecancer_deepC/'
[RUNNING COMMAND] g++ -std=c++11 -O3 -fno-rtti -fno-exceptions -I. -I/opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/include -isystem /opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/packages/eigen-eigen-323c052e1731 "prostatecancer_deepC/prostatecancer.cpp" -D_AITS_MAIN -o "prostatecancer_deepC/prostatecancer.exe"
[RUNNING COMMAND] size "prostatecancer_deepC/prostatecancer.exe"
   text    data     bss     dec     hex filename
 128803    2968     760  132531   205b3 prostatecancer_deepC/prostatecancer.exe
[SUCCESS] Saved model as executable "prostatecancer_deepC/prostatecancer.exe"