Cainvas

Mobile Price Range Classifier

Credit: AITS Cainvas Community

Photo by Eugene Machiavelli for Shakuro on Dribbble

You'll want to evaluate almost every model you build. In most (though not all) applications, the relevant measure of model quality is predictive accuracy: how close are the model's predictions to what actually happens?

A common mistake when measuring predictive accuracy is to make predictions on the training data and compare them to the target values in that same training data. Below you'll see why this is a problem and how to solve it.
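The fix is to hold out data the model never sees during training and score predictions against that held-out set, which is exactly what this notebook does later with train_test_split. A minimal sketch of the idea, assuming a scikit-learn-style estimator (the names model, X, and y here are placeholders, not the notebook's variables):

from sklearn.model_selection import train_test_split

# Hold out 10% of the data; the model never trains on it.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.1, random_state=0)

model.fit(X_train, y_train)          # learn only from the training split
score = model.score(X_val, y_val)    # judge quality on data the model has not seen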

The Internet of Things (IoT) is a network of intelligent devices, ranging from home appliances to industrial equipment, that connect to the Internet, monitor themselves, send contextual information such as pressure, location, and temperature, and communicate anytime, anywhere on the planet. IoT means “connecting anyone, anything, anytime, anyplace, any service and any network”. The proliferation of mobile connectivity and the falling prices of sensors and processors are driving the rapid growth of the IoT and IoE. Smart devices such as smartphones, smartwatches, PDAs, phablets, and tablets will be the primary interaction tools in a connected environment that includes cars, homes, and workplaces. A smartphone can be considered a miniature computer with a virtual store of many applications, such as games, browsers, maps, email clients, and image editors, that make it far smarter than a regular cell phone.

In this project, we focus on classifying mobile phones into price ranges based on their specifications. The technology needed for the IoT and IoE applications described above is already available today: RFID, Bluetooth, NFC, 3G, 4G, 5G, and similar standards can transfer data over the Internet, and battery technology has evolved as well, with wireless, solar-powered, and long-lasting batteries on today’s market.

Importing the Dataset

In [1]:
!wget https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/archive_nMtDHpe.zip
--2021-07-15 10:31:59--  https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/archive_nMtDHpe.zip
Resolving cainvas-static.s3.amazonaws.com (cainvas-static.s3.amazonaws.com)... 52.219.62.44
Connecting to cainvas-static.s3.amazonaws.com (cainvas-static.s3.amazonaws.com)|52.219.62.44|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 72340 (71K) [application/x-zip-compressed]
Saving to: ‘archive_nMtDHpe.zip’

archive_nMtDHpe.zip 100%[===================>]  70.64K  --.-KB/s    in 0.001s  

2021-07-15 10:32:00 (71.2 MB/s) - ‘archive_nMtDHpe.zip’ saved [72340/72340]

In [2]:
!unzip archive_nMtDHpe.zip
Archive:  archive_nMtDHpe.zip
  inflating: test.csv                
  inflating: train.csv               

Importing necessary libraries

In [3]:
# some necessary libraries

import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

Loading the Data

In [4]:
# loading the training dataset

dataset = pd.read_csv('train.csv')
dataset.head(10)
Out[4]:
battery_power blue clock_speed dual_sim fc four_g int_memory m_dep mobile_wt n_cores ... px_height px_width ram sc_h sc_w talk_time three_g touch_screen wifi price_range
0 842 0 2.2 0 1 0 7 0.6 188 2 ... 20 756 2549 9 7 19 0 0 1 1
1 1021 1 0.5 1 0 1 53 0.7 136 3 ... 905 1988 2631 17 3 7 1 1 0 2
2 563 1 0.5 1 2 1 41 0.9 145 5 ... 1263 1716 2603 11 2 9 1 1 0 2
3 615 1 2.5 0 0 0 10 0.8 131 6 ... 1216 1786 2769 16 8 11 1 0 0 2
4 1821 1 1.2 0 13 1 44 0.6 141 2 ... 1208 1212 1411 8 2 15 1 1 0 1
5 1859 0 0.5 1 3 0 22 0.7 164 1 ... 1004 1654 1067 17 1 10 1 0 0 1
6 1821 0 1.7 0 4 1 10 0.8 139 8 ... 381 1018 3220 13 8 18 1 0 1 3
7 1954 0 0.5 1 0 0 24 0.8 187 4 ... 512 1149 700 16 3 5 1 1 1 0
8 1445 1 0.5 0 0 0 53 0.7 174 7 ... 386 836 1099 17 1 20 1 0 0 0
9 509 1 0.6 1 2 1 9 0.1 93 5 ... 1137 1224 513 19 10 12 1 0 0 0

10 rows × 21 columns

Now, we check the column names

In [5]:
dataset.columns
Out[5]:
Index(['battery_power', 'blue', 'clock_speed', 'dual_sim', 'fc', 'four_g',
       'int_memory', 'm_dep', 'mobile_wt', 'n_cores', 'pc', 'px_height',
       'px_width', 'ram', 'sc_h', 'sc_w', 'talk_time', 'three_g',
       'touch_screen', 'wifi', 'price_range'],
      dtype='object')
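Before preprocessing, it can help to confirm how the four price-range classes are distributed; a quick check (not part of the original notebook):

# Count how many phones fall into each price range (0-3)
print(dataset['price_range'].value_counts().sort_index())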

Data Preprocessing

In [6]:
#Changing pandas dataframe to numpy array

X = dataset.iloc[:,:20].values
y = dataset.iloc[:,20:21].values

Now, we standardize the features so that each has zero mean and unit variance

In [7]:
#Normalizing the data

from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X = sc.fit_transform(X)
print('Normalized data:')
print(X[0])
Normalized data:
[-0.90259726 -0.9900495   0.83077942 -1.01918398 -0.76249466 -1.04396559
 -1.38064353  0.34073951  1.34924881 -1.10197128 -1.3057501  -1.40894856
 -1.14678403  0.39170341 -0.78498329  0.2831028   1.46249332 -1.78686097
 -1.00601811  0.98609664]

Here, we one-hot encode the target labels (the four price-range classes)

In [8]:
#One hot encode

from sklearn.preprocessing import OneHotEncoder
ohe = OneHotEncoder()
y = ohe.fit_transform(y).toarray()
print('One hot encoded array:')
print(y[0:5])
One hot encoded array:
[[0. 1. 0. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 1. 0. 0.]]
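An equivalent encoding can also be produced with Keras utilities, assuming the labels are the integers 0–3 as in this dataset; a sketch, not the notebook's code:

from tensorflow.keras.utils import to_categorical

# One-hot encode the integer price ranges directly
y_int = dataset['price_range'].values
y_onehot = to_categorical(y_int, num_classes=4)
print(y_onehot[:5])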

Splitting the data into training and test sets

In [9]:
#Train test split of model

from sklearn.model_selection import train_test_split

X_train,X_test,y_train,y_test = train_test_split(X,y,test_size = 0.1,random_state = 0)
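Note that the scaler above was fit on the full dataset before splitting, so test-set statistics leak into the training features. A leakage-free variant (a sketch, not the code used in this notebook) splits first and fits the scaler on the training portion only:

from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Split the raw features first, then compute scaling statistics from the training split only
X_raw = dataset.iloc[:, :20].values
X_tr, X_te, y_tr, y_te = train_test_split(X_raw, y, test_size=0.1, random_state=0)

scaler = StandardScaler().fit(X_tr)   # mean/std come from training data only
X_tr = scaler.transform(X_tr)
X_te = scaler.transform(X_te)         # test data reuses the training statistics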

Building our Model

In [11]:
# importing libraries

import tensorflow.keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Dropout
In [12]:
# creating the model

model = Sequential()
model.add(Dense(16, input_dim=20, activation='relu'))
model.add(Dropout(0.3))
model.add(Dense(12, activation='relu'))
model.add(Dropout(0.3))
model.add(Dense(4, activation='softmax'))
In [13]:
#To visualize neural network

model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 16)                336       
_________________________________________________________________
dropout (Dropout)            (None, 16)                0         
_________________________________________________________________
dense_1 (Dense)              (None, 12)                204       
_________________________________________________________________
dropout_1 (Dropout)          (None, 12)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 4)                 52        
=================================================================
Total params: 592
Trainable params: 592
Non-trainable params: 0
_________________________________________________________________
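The parameter counts follow directly from the layer shapes: each Dense layer has inputs × units weights plus units biases, so the first layer has 20·16 + 16 = 336 parameters, the second 16·12 + 12 = 204, and the output layer 12·4 + 4 = 52, for 592 trainable parameters in total.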
In [14]:
# compiling the model

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
In [15]:
# fitting the model

history = model.fit(X_train, y_train, epochs=100, batch_size=64)
Epoch 1/100
29/29 [==============================] - 0s 1ms/step - loss: 1.5127 - accuracy: 0.2572
Epoch 2/100
29/29 [==============================] - 0s 1ms/step - loss: 1.4624 - accuracy: 0.2628
Epoch 3/100
29/29 [==============================] - 0s 1ms/step - loss: 1.4317 - accuracy: 0.2939
Epoch 4/100
29/29 [==============================] - 0s 1ms/step - loss: 1.4069 - accuracy: 0.2783
Epoch 5/100
29/29 [==============================] - 0s 1ms/step - loss: 1.3887 - accuracy: 0.2783
Epoch 6/100
29/29 [==============================] - 0s 1ms/step - loss: 1.3735 - accuracy: 0.3061
Epoch 7/100
29/29 [==============================] - 0s 1ms/step - loss: 1.3522 - accuracy: 0.3256
Epoch 8/100
29/29 [==============================] - 0s 1ms/step - loss: 1.3220 - accuracy: 0.3306
Epoch 9/100
29/29 [==============================] - 0s 1ms/step - loss: 1.2976 - accuracy: 0.3567
Epoch 10/100
29/29 [==============================] - 0s 1ms/step - loss: 1.2654 - accuracy: 0.3556
Epoch 11/100
29/29 [==============================] - 0s 1ms/step - loss: 1.2415 - accuracy: 0.3928
Epoch 12/100
29/29 [==============================] - 0s 1ms/step - loss: 1.2085 - accuracy: 0.4133
Epoch 13/100
29/29 [==============================] - 0s 1ms/step - loss: 1.1715 - accuracy: 0.4228
Epoch 14/100
29/29 [==============================] - 0s 1ms/step - loss: 1.1355 - accuracy: 0.4372
Epoch 15/100
29/29 [==============================] - 0s 1ms/step - loss: 1.0834 - accuracy: 0.4639
Epoch 16/100
29/29 [==============================] - 0s 1ms/step - loss: 1.0359 - accuracy: 0.4917
Epoch 17/100
29/29 [==============================] - 0s 1ms/step - loss: 0.9952 - accuracy: 0.5139
Epoch 18/100
29/29 [==============================] - 0s 1ms/step - loss: 0.9619 - accuracy: 0.5183
Epoch 19/100
29/29 [==============================] - 0s 1ms/step - loss: 0.9202 - accuracy: 0.5367
Epoch 20/100
29/29 [==============================] - 0s 2ms/step - loss: 0.9077 - accuracy: 0.5450
Epoch 21/100
29/29 [==============================] - 0s 1ms/step - loss: 0.8606 - accuracy: 0.5750
Epoch 22/100
29/29 [==============================] - 0s 1ms/step - loss: 0.8462 - accuracy: 0.5672
Epoch 23/100
29/29 [==============================] - 0s 1ms/step - loss: 0.8285 - accuracy: 0.5967
Epoch 24/100
29/29 [==============================] - 0s 1ms/step - loss: 0.8196 - accuracy: 0.6000
Epoch 25/100
29/29 [==============================] - 0s 1ms/step - loss: 0.7878 - accuracy: 0.6111
Epoch 26/100
29/29 [==============================] - 0s 1ms/step - loss: 0.7695 - accuracy: 0.6150
Epoch 27/100
29/29 [==============================] - 0s 1ms/step - loss: 0.7511 - accuracy: 0.6344
Epoch 28/100
29/29 [==============================] - 0s 1ms/step - loss: 0.7497 - accuracy: 0.6478
Epoch 29/100
29/29 [==============================] - 0s 1ms/step - loss: 0.7273 - accuracy: 0.6567
Epoch 30/100
29/29 [==============================] - 0s 1ms/step - loss: 0.7168 - accuracy: 0.6528
Epoch 31/100
29/29 [==============================] - 0s 1ms/step - loss: 0.7070 - accuracy: 0.6594
Epoch 32/100
29/29 [==============================] - 0s 1ms/step - loss: 0.6738 - accuracy: 0.6961
Epoch 33/100
29/29 [==============================] - 0s 1ms/step - loss: 0.6601 - accuracy: 0.7039
Epoch 34/100
29/29 [==============================] - 0s 1ms/step - loss: 0.6709 - accuracy: 0.6883
Epoch 35/100
29/29 [==============================] - 0s 1ms/step - loss: 0.6508 - accuracy: 0.6972
Epoch 36/100
29/29 [==============================] - 0s 1ms/step - loss: 0.6353 - accuracy: 0.7194
Epoch 37/100
29/29 [==============================] - 0s 1ms/step - loss: 0.6233 - accuracy: 0.7172
Epoch 38/100
29/29 [==============================] - 0s 1ms/step - loss: 0.6088 - accuracy: 0.7228
Epoch 39/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5993 - accuracy: 0.7178
Epoch 40/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5774 - accuracy: 0.7500
Epoch 41/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5727 - accuracy: 0.7450
Epoch 42/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5599 - accuracy: 0.7467
Epoch 43/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5641 - accuracy: 0.7550
Epoch 44/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5599 - accuracy: 0.7528
Epoch 45/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5314 - accuracy: 0.7717
Epoch 46/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5196 - accuracy: 0.7656
Epoch 47/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5109 - accuracy: 0.7783
Epoch 48/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5127 - accuracy: 0.7711
Epoch 49/100
29/29 [==============================] - 0s 1ms/step - loss: 0.5010 - accuracy: 0.7872
Epoch 50/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4990 - accuracy: 0.7878
Epoch 51/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4913 - accuracy: 0.7900
Epoch 52/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4725 - accuracy: 0.8017
Epoch 53/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4788 - accuracy: 0.7850
Epoch 54/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4831 - accuracy: 0.7900
Epoch 55/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4800 - accuracy: 0.7978
Epoch 56/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4612 - accuracy: 0.8017
Epoch 57/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4602 - accuracy: 0.7994
Epoch 58/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4429 - accuracy: 0.8194
Epoch 59/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4432 - accuracy: 0.8100
Epoch 60/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4339 - accuracy: 0.8161
Epoch 61/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4383 - accuracy: 0.8061
Epoch 62/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4096 - accuracy: 0.8278
Epoch 63/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4160 - accuracy: 0.8222
Epoch 64/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4079 - accuracy: 0.8239
Epoch 65/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4029 - accuracy: 0.8244
Epoch 66/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4027 - accuracy: 0.8350
Epoch 67/100
29/29 [==============================] - 0s 1ms/step - loss: 0.4153 - accuracy: 0.8300
Epoch 68/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3915 - accuracy: 0.8333
Epoch 69/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3867 - accuracy: 0.8439
Epoch 70/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3960 - accuracy: 0.8372
Epoch 71/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3847 - accuracy: 0.8372
Epoch 72/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3908 - accuracy: 0.8244
Epoch 73/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3850 - accuracy: 0.8344
Epoch 74/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3671 - accuracy: 0.8544
Epoch 75/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3695 - accuracy: 0.8394
Epoch 76/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3736 - accuracy: 0.8350
Epoch 77/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3687 - accuracy: 0.8533
Epoch 78/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3639 - accuracy: 0.8456
Epoch 79/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3529 - accuracy: 0.8494
Epoch 80/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3454 - accuracy: 0.8472
Epoch 81/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3596 - accuracy: 0.8483
Epoch 82/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3500 - accuracy: 0.8511
Epoch 83/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3543 - accuracy: 0.8550
Epoch 84/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3507 - accuracy: 0.8561
Epoch 85/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3440 - accuracy: 0.8617
Epoch 86/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3338 - accuracy: 0.8683
Epoch 87/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3369 - accuracy: 0.8550
Epoch 88/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3372 - accuracy: 0.8539
Epoch 89/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3452 - accuracy: 0.8550
Epoch 90/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3225 - accuracy: 0.8667
Epoch 91/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3252 - accuracy: 0.8606
Epoch 92/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3224 - accuracy: 0.8756
Epoch 93/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3273 - accuracy: 0.8639
Epoch 94/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3500 - accuracy: 0.8483
Epoch 95/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3184 - accuracy: 0.8672
Epoch 96/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3406 - accuracy: 0.8606
Epoch 97/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3353 - accuracy: 0.8622
Epoch 98/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3243 - accuracy: 0.8650
Epoch 99/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3209 - accuracy: 0.8717
Epoch 100/100
29/29 [==============================] - 0s 1ms/step - loss: 0.3228 - accuracy: 0.8606

Now, we make predictions on the test set and convert them to class labels.

In [16]:
y_pred = model.predict(X_test)

#Converting predictions to label

pred = list()
for i in range(len(y_pred)):
    pred.append(np.argmax(y_pred[i]))
In [17]:
#Converting one hot encoded test label to label

test = list()
for i in range(len(y_test)):
    test.append(np.argmax(y_test[i]))
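Both conversion loops can be replaced by a single vectorized call; an equivalent sketch:

# Vectorized equivalents of the two loops above
pred = np.argmax(y_pred, axis=1)
test = np.argmax(y_test, axis=1)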

Now, we check the accuracy score on the test set

In [18]:
from sklearn.metrics import accuracy_score
a = accuracy_score(pred,test)
print('Accuracy is:', a*100)
Accuracy is: 90.5
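Accuracy alone does not show which price ranges get confused with one another; a per-class breakdown (a sketch using scikit-learn, not in the original notebook) can help:

from sklearn.metrics import classification_report, confusion_matrix

# 4x4 confusion matrix and per-class precision/recall, with true labels passed first
print(confusion_matrix(test, pred))
print(classification_report(test, pred, digits=3))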
In [19]:
#Using test data as validation data.

history1 = model.fit(X_train, y_train,validation_data = (X_test,y_test), epochs=100, batch_size=64)
Epoch 1/100
29/29 [==============================] - 0s 6ms/step - loss: 0.3117 - accuracy: 0.8628 - val_loss: 0.1930 - val_accuracy: 0.9000
Epoch 2/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3239 - accuracy: 0.8756 - val_loss: 0.1970 - val_accuracy: 0.9100
Epoch 3/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3145 - accuracy: 0.8661 - val_loss: 0.1945 - val_accuracy: 0.9150
Epoch 4/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3185 - accuracy: 0.8661 - val_loss: 0.1944 - val_accuracy: 0.9300
Epoch 5/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3144 - accuracy: 0.8667 - val_loss: 0.1842 - val_accuracy: 0.9100
Epoch 6/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3083 - accuracy: 0.8639 - val_loss: 0.1891 - val_accuracy: 0.9050
Epoch 7/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3121 - accuracy: 0.8744 - val_loss: 0.1878 - val_accuracy: 0.9150
Epoch 8/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3054 - accuracy: 0.8683 - val_loss: 0.1824 - val_accuracy: 0.9200
Epoch 9/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3110 - accuracy: 0.8706 - val_loss: 0.1801 - val_accuracy: 0.9200
Epoch 10/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3036 - accuracy: 0.8761 - val_loss: 0.1842 - val_accuracy: 0.9200
Epoch 11/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3018 - accuracy: 0.8700 - val_loss: 0.1894 - val_accuracy: 0.9150
Epoch 12/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2952 - accuracy: 0.8700 - val_loss: 0.1838 - val_accuracy: 0.9100
Epoch 13/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2954 - accuracy: 0.8806 - val_loss: 0.1900 - val_accuracy: 0.9300
Epoch 14/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2890 - accuracy: 0.8783 - val_loss: 0.1808 - val_accuracy: 0.9250
Epoch 15/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3003 - accuracy: 0.8639 - val_loss: 0.1778 - val_accuracy: 0.9250
Epoch 16/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2919 - accuracy: 0.8911 - val_loss: 0.1749 - val_accuracy: 0.9350
Epoch 17/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3029 - accuracy: 0.8739 - val_loss: 0.1736 - val_accuracy: 0.9300
Epoch 18/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2848 - accuracy: 0.8800 - val_loss: 0.1764 - val_accuracy: 0.9300
Epoch 19/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2843 - accuracy: 0.8750 - val_loss: 0.1743 - val_accuracy: 0.9250
Epoch 20/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2915 - accuracy: 0.8717 - val_loss: 0.1653 - val_accuracy: 0.9400
Epoch 21/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3052 - accuracy: 0.8744 - val_loss: 0.1772 - val_accuracy: 0.9300
Epoch 22/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2915 - accuracy: 0.8706 - val_loss: 0.1773 - val_accuracy: 0.9250
Epoch 23/100
29/29 [==============================] - 0s 2ms/step - loss: 0.3010 - accuracy: 0.8689 - val_loss: 0.1669 - val_accuracy: 0.9300
Epoch 24/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2887 - accuracy: 0.8806 - val_loss: 0.1683 - val_accuracy: 0.9350
Epoch 25/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2882 - accuracy: 0.8889 - val_loss: 0.1771 - val_accuracy: 0.9350
Epoch 26/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2844 - accuracy: 0.8739 - val_loss: 0.1820 - val_accuracy: 0.9300
Epoch 27/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2748 - accuracy: 0.8906 - val_loss: 0.1775 - val_accuracy: 0.9400
Epoch 28/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2946 - accuracy: 0.8756 - val_loss: 0.1698 - val_accuracy: 0.9350
Epoch 29/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2950 - accuracy: 0.8828 - val_loss: 0.1731 - val_accuracy: 0.9400
Epoch 30/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2753 - accuracy: 0.8794 - val_loss: 0.1758 - val_accuracy: 0.9400
Epoch 31/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2727 - accuracy: 0.8844 - val_loss: 0.1767 - val_accuracy: 0.9450
Epoch 32/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2784 - accuracy: 0.8878 - val_loss: 0.1787 - val_accuracy: 0.9250
Epoch 33/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2609 - accuracy: 0.8989 - val_loss: 0.1698 - val_accuracy: 0.9450
Epoch 34/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2746 - accuracy: 0.8861 - val_loss: 0.1705 - val_accuracy: 0.9350
Epoch 35/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2715 - accuracy: 0.8878 - val_loss: 0.1683 - val_accuracy: 0.9450
Epoch 36/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2656 - accuracy: 0.8944 - val_loss: 0.1743 - val_accuracy: 0.9350
Epoch 37/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2562 - accuracy: 0.8928 - val_loss: 0.1717 - val_accuracy: 0.9300
Epoch 38/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2610 - accuracy: 0.8967 - val_loss: 0.1742 - val_accuracy: 0.9350
Epoch 39/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2476 - accuracy: 0.8989 - val_loss: 0.1730 - val_accuracy: 0.9300
Epoch 40/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2669 - accuracy: 0.8889 - val_loss: 0.1654 - val_accuracy: 0.9450
Epoch 41/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2603 - accuracy: 0.8911 - val_loss: 0.1647 - val_accuracy: 0.9450
Epoch 42/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2435 - accuracy: 0.8978 - val_loss: 0.1690 - val_accuracy: 0.9350
Epoch 43/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2535 - accuracy: 0.8956 - val_loss: 0.1748 - val_accuracy: 0.9400
Epoch 44/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2601 - accuracy: 0.8939 - val_loss: 0.1753 - val_accuracy: 0.9400
Epoch 45/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2454 - accuracy: 0.8961 - val_loss: 0.1737 - val_accuracy: 0.9350
Epoch 46/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2470 - accuracy: 0.9022 - val_loss: 0.1666 - val_accuracy: 0.9400
Epoch 47/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2493 - accuracy: 0.8967 - val_loss: 0.1702 - val_accuracy: 0.9500
Epoch 48/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2443 - accuracy: 0.9000 - val_loss: 0.1652 - val_accuracy: 0.9450
Epoch 49/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2439 - accuracy: 0.8972 - val_loss: 0.1606 - val_accuracy: 0.9350
Epoch 50/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2559 - accuracy: 0.8978 - val_loss: 0.1669 - val_accuracy: 0.9400
Epoch 51/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2600 - accuracy: 0.8922 - val_loss: 0.1658 - val_accuracy: 0.9450
Epoch 52/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2431 - accuracy: 0.9017 - val_loss: 0.1598 - val_accuracy: 0.9550
Epoch 53/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2437 - accuracy: 0.8989 - val_loss: 0.1645 - val_accuracy: 0.9350
Epoch 54/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2422 - accuracy: 0.8939 - val_loss: 0.1600 - val_accuracy: 0.9500
Epoch 55/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2437 - accuracy: 0.9044 - val_loss: 0.1649 - val_accuracy: 0.9550
Epoch 56/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2316 - accuracy: 0.8994 - val_loss: 0.1682 - val_accuracy: 0.9500
Epoch 57/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2480 - accuracy: 0.8994 - val_loss: 0.1678 - val_accuracy: 0.9400
Epoch 58/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2560 - accuracy: 0.8911 - val_loss: 0.1757 - val_accuracy: 0.9450
Epoch 59/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2595 - accuracy: 0.8933 - val_loss: 0.1664 - val_accuracy: 0.9450
Epoch 60/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2457 - accuracy: 0.8983 - val_loss: 0.1660 - val_accuracy: 0.9400
Epoch 61/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2374 - accuracy: 0.9039 - val_loss: 0.1613 - val_accuracy: 0.9450
Epoch 62/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2221 - accuracy: 0.9122 - val_loss: 0.1611 - val_accuracy: 0.9450
Epoch 63/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2397 - accuracy: 0.9000 - val_loss: 0.1618 - val_accuracy: 0.9600
Epoch 64/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2345 - accuracy: 0.9078 - val_loss: 0.1654 - val_accuracy: 0.9550
Epoch 65/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2363 - accuracy: 0.9056 - val_loss: 0.1616 - val_accuracy: 0.9550
Epoch 66/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2284 - accuracy: 0.9089 - val_loss: 0.1580 - val_accuracy: 0.9550
Epoch 67/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2280 - accuracy: 0.9022 - val_loss: 0.1619 - val_accuracy: 0.9500
Epoch 68/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2145 - accuracy: 0.9106 - val_loss: 0.1572 - val_accuracy: 0.9500
Epoch 69/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2192 - accuracy: 0.9117 - val_loss: 0.1591 - val_accuracy: 0.9350
Epoch 70/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2318 - accuracy: 0.9011 - val_loss: 0.1647 - val_accuracy: 0.9400
Epoch 71/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2272 - accuracy: 0.9106 - val_loss: 0.1600 - val_accuracy: 0.9450
Epoch 72/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2358 - accuracy: 0.9089 - val_loss: 0.1629 - val_accuracy: 0.9400
Epoch 73/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2416 - accuracy: 0.8978 - val_loss: 0.1685 - val_accuracy: 0.9450
Epoch 74/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2147 - accuracy: 0.9117 - val_loss: 0.1592 - val_accuracy: 0.9550
Epoch 75/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2157 - accuracy: 0.9150 - val_loss: 0.1509 - val_accuracy: 0.9550
Epoch 76/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2278 - accuracy: 0.9033 - val_loss: 0.1586 - val_accuracy: 0.9600
Epoch 77/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2163 - accuracy: 0.9156 - val_loss: 0.1581 - val_accuracy: 0.9500
Epoch 78/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2223 - accuracy: 0.9033 - val_loss: 0.1535 - val_accuracy: 0.9500
Epoch 79/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2082 - accuracy: 0.9161 - val_loss: 0.1534 - val_accuracy: 0.9450
Epoch 80/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2102 - accuracy: 0.9128 - val_loss: 0.1498 - val_accuracy: 0.9550
Epoch 81/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2090 - accuracy: 0.9150 - val_loss: 0.1543 - val_accuracy: 0.9500
Epoch 82/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2387 - accuracy: 0.9028 - val_loss: 0.1588 - val_accuracy: 0.9450
Epoch 83/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2261 - accuracy: 0.9000 - val_loss: 0.1573 - val_accuracy: 0.9350
Epoch 84/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2292 - accuracy: 0.9044 - val_loss: 0.1611 - val_accuracy: 0.9450
Epoch 85/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2067 - accuracy: 0.9156 - val_loss: 0.1508 - val_accuracy: 0.9400
Epoch 86/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2023 - accuracy: 0.9206 - val_loss: 0.1592 - val_accuracy: 0.9450
Epoch 87/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2267 - accuracy: 0.9044 - val_loss: 0.1576 - val_accuracy: 0.9500
Epoch 88/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2179 - accuracy: 0.9011 - val_loss: 0.1560 - val_accuracy: 0.9450
Epoch 89/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2042 - accuracy: 0.9094 - val_loss: 0.1591 - val_accuracy: 0.9450
Epoch 90/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2024 - accuracy: 0.9150 - val_loss: 0.1777 - val_accuracy: 0.9300
Epoch 91/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2039 - accuracy: 0.9167 - val_loss: 0.1665 - val_accuracy: 0.9400
Epoch 92/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2034 - accuracy: 0.9150 - val_loss: 0.1659 - val_accuracy: 0.9450
Epoch 93/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2143 - accuracy: 0.9094 - val_loss: 0.1605 - val_accuracy: 0.9450
Epoch 94/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2124 - accuracy: 0.9106 - val_loss: 0.1655 - val_accuracy: 0.9450
Epoch 95/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2086 - accuracy: 0.9144 - val_loss: 0.1588 - val_accuracy: 0.9400
Epoch 96/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2000 - accuracy: 0.9178 - val_loss: 0.1576 - val_accuracy: 0.9400
Epoch 97/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2122 - accuracy: 0.9106 - val_loss: 0.1960 - val_accuracy: 0.9350
Epoch 98/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2213 - accuracy: 0.9072 - val_loss: 0.1764 - val_accuracy: 0.9250
Epoch 99/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2113 - accuracy: 0.9128 - val_loss: 0.1677 - val_accuracy: 0.9350
Epoch 100/100
29/29 [==============================] - 0s 2ms/step - loss: 0.2086 - accuracy: 0.9028 - val_loss: 0.1659 - val_accuracy: 0.9350
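Note that this second fit call continues training the already-trained model rather than starting from scratch, which is why accuracy begins around 0.86, and because the test split is reused as validation data, the reported test accuracy is no longer a fully independent estimate. If you want training to stop automatically once validation loss stops improving, an early-stopping callback is one option (a sketch, not part of the original run):

from tensorflow.keras.callbacks import EarlyStopping

# Stop when val_loss has not improved for 10 epochs and restore the best weights seen
early_stop = EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)

history1 = model.fit(X_train, y_train,
                     validation_data=(X_test, y_test),
                     epochs=100, batch_size=64,
                     callbacks=[early_stop])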

Plotting the graph for Model Accuracy

In [20]:
# Model Accuracy

plt.plot(history1.history['accuracy'])
plt.plot(history1.history['val_accuracy'])
plt.title('Model accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epoch')
plt.legend(['Train', 'Test'], loc='upper left')
plt.show()

Plotting the graph for Model Loss

In [21]:
# Model Loss

plt.plot(history1.history['loss'])
plt.plot(history1.history['val_loss'])
plt.title('Model loss')
plt.ylabel('Loss')
plt.xlabel('Epoch')
plt.legend(['Train', 'Test'], loc='upper left')
plt.show()

Saving the Model

Now, we save our model

In [22]:
model.save('mobile_price_range.h5')

Making predictions with the saved model

In [23]:
from tensorflow.keras.models import load_model
In [24]:
# loading the model for predictions

m = load_model('mobile_price_range.h5')
In [25]:
# predicting the values

m.predict_classes(X_test)
WARNING:tensorflow:From <ipython-input-25-ee424f36794e>:3: Sequential.predict_classes (from tensorflow.python.keras.engine.sequential) is deprecated and will be removed after 2021-01-01.
Instructions for updating:
Please use instead:
* `np.argmax(model.predict(x), axis=-1)`, if your model does multi-class classification (e.g. if it uses a `softmax` last-layer activation).
* `(model.predict(x) > 0.5).astype("int32")`, if your model does binary classification (e.g. if it uses a `sigmoid` last-layer activation).
Out[25]:
array([3, 0, 2, 2, 3, 0, 0, 2, 3, 1, 0, 3, 0, 2, 3, 0, 3, 2, 2, 1, 0, 0,
       3, 1, 2, 2, 3, 1, 3, 1, 1, 0, 2, 0, 2, 3, 0, 0, 3, 3, 3, 1, 3, 3,
       1, 3, 0, 1, 3, 1, 1, 3, 0, 3, 0, 2, 2, 2, 0, 3, 3, 1, 3, 2, 1, 2,
       3, 3, 2, 2, 3, 2, 1, 0, 1, 3, 2, 2, 1, 2, 3, 3, 3, 0, 0, 0, 2, 1,
       2, 3, 1, 2, 2, 1, 0, 3, 3, 3, 0, 3, 1, 1, 3, 1, 3, 2, 2, 3, 2, 3,
       3, 0, 0, 1, 2, 3, 0, 0, 1, 0, 0, 3, 2, 2, 1, 2, 1, 1, 0, 2, 1, 3,
       3, 3, 3, 3, 3, 2, 0, 1, 1, 2, 1, 3, 0, 3, 0, 0, 2, 0, 1, 1, 1, 1,
       3, 0, 0, 3, 1, 3, 2, 1, 3, 1, 2, 3, 3, 2, 1, 0, 3, 1, 2, 3, 3, 0,
       2, 2, 3, 0, 2, 1, 0, 1, 2, 1, 2, 0, 2, 3, 1, 1, 0, 2, 2, 0, 1, 2,
       2, 0])
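As the deprecation warning suggests, the same class predictions can be obtained without predict_classes:

# Recommended replacement for the deprecated predict_classes()
pred_classes = np.argmax(m.predict(X_test), axis=-1)
print(pred_classes)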

DeepCC

In [ ]:
!deepCC mobile_price_range.h5
[INFO]
Reading [keras model] 'mobile_price_range.h5'
In [ ]: