Classifying Clothing Items from the Fashion MNIST Dataset
Credit: AITS Cainvas Community
Photo by Ofspace Digital Agency on Dribbble
This notebook uses a neural network to classify the clothing items in the Fashion MNIST dataset. TensorFlow and Keras are used to build and train the network that assigns each clothing image to its appropriate class.
In [1]:
import numpy as np
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense, Dropout, BatchNormalization
import matplotlib.pyplot as plt
%matplotlib inline
In [2]:
import tensorflow as tf
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        # Currently, memory growth needs to be the same across GPUs
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        logical_gpus = tf.config.experimental.list_logical_devices('GPU')
        print(len(gpus), "Physical GPUs,", len(logical_gpus), "Logical GPUs")
    except RuntimeError as e:
        # Memory growth must be set before GPUs have been initialized
        print(e)
Fashion MNIST Dataset
Fashion-MNIST is a dataset of Zalando's article images, consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes.
- Class 0: T-shirt/top
- Class 1: Trouser
- Class 2: Pullover
- Class 3: Dress
- Class 4: Coat
- Class 5: Sandal
- Class 6: Shirt
- Class 7: Sneaker
- Class 8: Bag
- Class 9: Ankle Boot
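For readable plot titles later on, the integer labels above can be mapped to names with a small lookup list (the list below simply restates the class table; `class_names` is a helper name introduced here, not part of the dataset API):

```python
# Human-readable names for the 10 Fashion-MNIST classes, indexed by label
class_names = [
    "T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
    "Sandal", "Shirt", "Sneaker", "Bag", "Ankle Boot",
]

print(class_names[9])  # label 9 -> "Ankle Boot"
```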
In [3]:
from tensorflow.keras.datasets import fashion_mnist
# note: x_lab holds the training labels; (y_test, y_lab) are the test images and labels
(x_train, x_lab), (y_test, y_lab) = fashion_mnist.load_data()
print(x_train.shape)
print(y_test.shape)
plt.imshow(x_train[5])
plt.title('Class: {}'.format(x_lab[5]))
plt.show()
In [4]:
# keras.utils.normalize applies L2 normalization along the given axis
x_train = keras.utils.normalize(x_train, axis=1)
y_test = keras.utils.normalize(y_test, axis=1)
plt.imshow(x_train[5])
plt.title('Class: {}'.format(x_lab[5]))
plt.show()
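Note that `keras.utils.normalize` performs L2 normalization along the chosen axis (each slice is divided by its Euclidean norm), which is different from the more common `x / 255.0` pixel rescaling. A NumPy sketch of the same operation on a toy array:

```python
import numpy as np

x = np.array([[3.0, 4.0],
              [6.0, 8.0]])

# L2-normalize along axis 1: divide each row by its Euclidean norm
norms = np.linalg.norm(x, axis=1, keepdims=True)
x_l2 = x / norms
print(x_l2)  # each row becomes [0.6, 0.8]

# the common alternative for image data: rescale pixel values into [0, 1]
x_scaled = x / 255.0
```

Either choice gives the network inputs in a small, consistent range, which generally helps training converge.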
In [5]:
from tensorflow.keras import regularizers
model = Sequential()
model.add(Flatten(input_shape=(28, 28)))
model.add(Dropout(0.05))
model.add(Dense(128, activation = "relu"))
model.add(Dense(64, activation = "relu"))
model.add(Dropout(0.05))
model.add(Dense(10,activation="softmax"))
model.summary()
In [6]:
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
history = model.fit(x_train, x_lab, validation_split = 0.33, epochs = 35)
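The `sparse_categorical_crossentropy` loss is used because the labels are plain integers (0-9) rather than one-hot vectors; for each sample it is the negative log of the probability the model assigns to the true class. A minimal NumPy sketch of that computation, using made-up softmax outputs:

```python
import numpy as np

# hypothetical softmax outputs for 3 samples over 4 classes
probs = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.10, 0.80, 0.05, 0.05],
    [0.25, 0.25, 0.25, 0.25],
])
labels = np.array([0, 1, 3])  # integer labels, no one-hot encoding needed

# per-sample loss: -log(probability assigned to the true class)
losses = -np.log(probs[np.arange(len(labels)), labels])
print(losses.mean())  # ~0.655
```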
In [7]:
model.evaluate(y_test,y_lab)
Out[7]:
In [8]:
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
In [9]:
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
In [10]:
p = model.predict(y_test[:10])
print(p)
In [11]:
pred = np.argmax(p, axis = 1)
print(pred)
print(y_lab[:10])
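The same `argmax` step also gives a quick accuracy figure when compared against the true labels; a self-contained sketch with hypothetical prediction arrays:

```python
import numpy as np

# hypothetical model outputs (softmax probabilities) and true labels
p = np.array([
    [0.9, 0.05, 0.05],
    [0.2, 0.7,  0.1],
    [0.3, 0.3,  0.4],
    [0.6, 0.3,  0.1],
])
y_true = np.array([0, 1, 1, 0])

pred = np.argmax(p, axis=1)          # predicted class per sample
accuracy = np.mean(pred == y_true)   # fraction of correct predictions
print(pred, accuracy)                # 3 of 4 correct -> 0.75
```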
In [12]:
plt.figure(figsize=(10, 10))
for i in range(10):
    plt.subplot(5, 5, i + 1)
    plt.imshow(y_test[i])
    plt.title('Original: {}, Predicted: {}'.format(y_lab[i], pred[i]))
    plt.axis('off')
plt.tight_layout()  # subplots_adjust with out-of-range values replaced
plt.show()
In [13]:
model.save("fashion_mnist.h5")
deepCC
In [14]:
!deepCC fashion_mnist.h5