
Road Crack Detection

Credit: AITS Cainvas Community

Photo by Teodor Hristov for Lobster on Dribbble

Using this model we can assess the quality of a road and mark the locations where damage has occurred, enabling faster repair planning.

In [1]:
from matplotlib import pyplot as plt
import numpy as np
from PIL import Image
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers, callbacks, optimizers
from sklearn.metrics import confusion_matrix, f1_score
import os
import random

Getting the Dataset

In [2]:
!wget "https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/Road_Crack_DCSV9HG.zip"
!unzip -qo "Road_Crack_DCSV9HG.zip"
--2021-08-20 11:43:12--  https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/Road_Crack_DCSV9HG.zip
Resolving cainvas-static.s3.amazonaws.com (cainvas-static.s3.amazonaws.com)... 52.219.160.27
Connecting to cainvas-static.s3.amazonaws.com (cainvas-static.s3.amazonaws.com)|52.219.160.27|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 87484192 (83M) [application/x-zip-compressed]
Saving to: ‘Road_Crack_DCSV9HG.zip’

Road_Crack_DCSV9HG. 100%[===================>]  83.43M  75.0MB/s    in 1.1s    

2021-08-20 11:43:13 (75.0 MB/s) - ‘Road_Crack_DCSV9HG.zip’ saved [87484192/87484192]

In [3]:
data_dir = 'Road Crack/'

batch_size = 64
# image_size = (32, 32)
image_size = (28, 28)

print("Training set")
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
                data_dir,
                validation_split=0.2,
                subset="training",
                color_mode="grayscale",
                image_size=image_size, 
                seed=113,
                shuffle=True,
                batch_size=batch_size
            )

print("Validation set")
val_ds = tf.keras.preprocessing.image_dataset_from_directory(
                data_dir,
                validation_split=0.2,
                subset="validation",
                color_mode="grayscale",
                image_size=image_size, 
                seed=113,
                shuffle=True,
                batch_size=batch_size
            )
Training set
Found 40000 files belonging to 2 classes.
Using 32000 files for training.
Validation set
Found 40000 files belonging to 2 classes.
Using 8000 files for validation.

As you can see, the dataset contains 40000 images split into two classes, Positive and Negative: Negative images have no cracks, while Positive images contain a crack.

In [4]:
Xtrain = np.empty((0,*image_size,1))
ytrain = np.empty((0,1))

# Collect the training images and labels from the batched dataset into NumPy arrays
for _, (images, labels) in train_ds.enumerate():
    for img in images:
        Xtrain = np.append(Xtrain, np.expand_dims(np.array(img), 0), axis=0)
    ytrain = np.append(ytrain, np.array(labels))
    
Xtrain.shape, ytrain.shape
Out[4]:
((32000, 28, 28, 1), (32000,))
In [5]:
class_names = train_ds.class_names
print(class_names)
['Negative', 'Positive']
In [6]:
Xval = np.empty((0,*image_size,1))
yval = np.empty((0,1))

# Collect the validation images and labels from the batched dataset into NumPy arrays
for _, (images, labels) in val_ds.enumerate():
    for img in images:
        Xval = np.append(Xval, np.expand_dims(np.array(img), 0), axis=0)
    yval = np.append(yval, np.array(labels))
    
Xval.shape, yval.shape
Out[6]:
((8000, 28, 28, 1), (8000,))
In [7]:
print("Number of samples - ")
for i in range(len(class_names)):
    print(class_names[i], "-", yval.tolist().count(float(i)))
Number of samples - 
Negative - 3999
Positive - 4001
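The per-class tally above can also be written more directly with `collections.Counter`; a minimal sketch with dummy labels standing in for the real `yval`:

```python
from collections import Counter

# Dummy stand-in for yval: 0.0 = Negative, 1.0 = Positive
labels = [0.0, 1.0, 1.0, 0.0, 1.0]
class_names = ['Negative', 'Positive']

counts = Counter(labels)
for i, name in enumerate(class_names):
    print(name, "-", counts[float(i)])
```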

Let's visualize the data

In [8]:
num_samples = 4    # the number of samples to be displayed in each class

for x in class_names:
    plt.figure(figsize=(10, 10))

    filenames = os.listdir(data_dir + x)

    for i in range(num_samples):
        ax = plt.subplot(1, num_samples, i + 1)
        img = Image.open(os.path.join(data_dir, x, filenames[i]))
        plt.imshow(img)
        plt.title(x)
        plt.axis("off")
In [9]:
# Scale pixel values from [0, 255] to [0, 1]
Xtrain = Xtrain/255
Xval = Xval/255
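Dividing by 255 rescales the 8-bit pixel values into [0, 1], a range the network trains well on. A toy illustration with a dummy array (not the real `Xtrain`):

```python
import numpy as np

# Dummy 2x2 grayscale "image" with 8-bit pixel values
pixels = np.array([[0, 128], [200, 255]], dtype=np.float32)
scaled = pixels / 255

print(scaled.min(), scaled.max())  # values now lie in [0, 1]
```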

Model

In [10]:
model = keras.models.Sequential([
    layers.Conv2D(8, 3, activation='relu', input_shape=Xtrain[0].shape),
    layers.MaxPool2D(pool_size=(2, 2)),
    
    layers.Conv2D(16, 3, activation='relu'),
    layers.MaxPool2D(pool_size=(2, 2)),
    
    layers.Conv2D(32, 3, activation='relu'),
    layers.MaxPool2D(pool_size=(2, 2)),
    
    layers.Flatten(),
    layers.Dense(32, activation='relu'),
    layers.Dense(1, activation='sigmoid')
])

cb = [callbacks.EarlyStopping(monitor = 'val_loss', patience = 5, restore_best_weights = True)]
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 26, 26, 8)         80        
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 13, 13, 8)         0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 11, 11, 16)        1168      
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 5, 5, 16)          0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 3, 3, 32)          4640      
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 1, 1, 32)          0         
_________________________________________________________________
flatten (Flatten)            (None, 32)                0         
_________________________________________________________________
dense (Dense)                (None, 32)                1056      
_________________________________________________________________
dense_1 (Dense)              (None, 1)                 33        
=================================================================
Total params: 6,977
Trainable params: 6,977
Non-trainable params: 0
_________________________________________________________________
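The parameter counts in the summary can be verified by hand: a Conv2D layer has (kernel_h * kernel_w * in_channels + 1) * filters parameters, and a Dense layer has (inputs + 1) * units. A quick arithmetic check:

```python
def conv2d_params(k, in_ch, filters):
    # k*k*in_ch weights per filter, plus one bias per filter
    return (k * k * in_ch + 1) * filters

def dense_params(inputs, units):
    # one weight per input per unit, plus one bias per unit
    return (inputs + 1) * units

total = (conv2d_params(3, 1, 8)      # 80
         + conv2d_params(3, 8, 16)   # 1168
         + conv2d_params(3, 16, 32)  # 4640
         + dense_params(32, 32)      # 1056
         + dense_params(32, 1))      # 33
print(total)  # 6977, matching model.summary()
```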
In [11]:
model.compile(loss=keras.losses.BinaryCrossentropy(), optimizer=optimizers.Adam(0.0001), metrics=['accuracy'])

history = model.fit(Xtrain, ytrain, validation_data=(Xval, yval), epochs=300, callbacks=cb)
Epoch 1/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.3346 - accuracy: 0.9195 - val_loss: 0.1028 - val_accuracy: 0.9689
Epoch 2/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0844 - accuracy: 0.9724 - val_loss: 0.0776 - val_accuracy: 0.9744
Epoch 3/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0720 - accuracy: 0.9765 - val_loss: 0.0714 - val_accuracy: 0.9759
Epoch 4/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0674 - accuracy: 0.9774 - val_loss: 0.0669 - val_accuracy: 0.9778
Epoch 5/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0636 - accuracy: 0.9796 - val_loss: 0.0666 - val_accuracy: 0.9789
Epoch 6/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0611 - accuracy: 0.9800 - val_loss: 0.0619 - val_accuracy: 0.9803
Epoch 7/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0590 - accuracy: 0.9813 - val_loss: 0.0595 - val_accuracy: 0.9818
Epoch 8/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0572 - accuracy: 0.9818 - val_loss: 0.0573 - val_accuracy: 0.9830
Epoch 9/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0555 - accuracy: 0.9819 - val_loss: 0.0575 - val_accuracy: 0.9829
Epoch 10/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0541 - accuracy: 0.9829 - val_loss: 0.0544 - val_accuracy: 0.9835
Epoch 11/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0528 - accuracy: 0.9835 - val_loss: 0.0601 - val_accuracy: 0.9811
Epoch 12/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0518 - accuracy: 0.9838 - val_loss: 0.0527 - val_accuracy: 0.9834
Epoch 13/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0506 - accuracy: 0.9840 - val_loss: 0.0520 - val_accuracy: 0.9850
Epoch 14/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0498 - accuracy: 0.9841 - val_loss: 0.0541 - val_accuracy: 0.9822
Epoch 15/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0489 - accuracy: 0.9846 - val_loss: 0.0504 - val_accuracy: 0.9839
Epoch 16/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0482 - accuracy: 0.9845 - val_loss: 0.0491 - val_accuracy: 0.9844
Epoch 17/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0475 - accuracy: 0.9849 - val_loss: 0.0477 - val_accuracy: 0.9854
Epoch 18/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0470 - accuracy: 0.9854 - val_loss: 0.0476 - val_accuracy: 0.9855
Epoch 19/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0461 - accuracy: 0.9855 - val_loss: 0.0473 - val_accuracy: 0.9859
Epoch 20/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0456 - accuracy: 0.9860 - val_loss: 0.0466 - val_accuracy: 0.9852
Epoch 21/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0450 - accuracy: 0.9857 - val_loss: 0.0492 - val_accuracy: 0.9844
Epoch 22/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0447 - accuracy: 0.9863 - val_loss: 0.0453 - val_accuracy: 0.9855
Epoch 23/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0441 - accuracy: 0.9858 - val_loss: 0.0445 - val_accuracy: 0.9860
Epoch 24/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0437 - accuracy: 0.9865 - val_loss: 0.0447 - val_accuracy: 0.9860
Epoch 25/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0438 - accuracy: 0.9861 - val_loss: 0.0454 - val_accuracy: 0.9861
Epoch 26/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0430 - accuracy: 0.9865 - val_loss: 0.0443 - val_accuracy: 0.9864
Epoch 27/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0426 - accuracy: 0.9869 - val_loss: 0.0440 - val_accuracy: 0.9864
Epoch 28/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0421 - accuracy: 0.9866 - val_loss: 0.0428 - val_accuracy: 0.9869
Epoch 29/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0417 - accuracy: 0.9869 - val_loss: 0.0427 - val_accuracy: 0.9874
Epoch 30/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0413 - accuracy: 0.9869 - val_loss: 0.0424 - val_accuracy: 0.9869
Epoch 31/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0410 - accuracy: 0.9869 - val_loss: 0.0419 - val_accuracy: 0.9870
Epoch 32/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0406 - accuracy: 0.9869 - val_loss: 0.0413 - val_accuracy: 0.9870
Epoch 33/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0402 - accuracy: 0.9876 - val_loss: 0.0410 - val_accuracy: 0.9874
Epoch 34/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0397 - accuracy: 0.9873 - val_loss: 0.0418 - val_accuracy: 0.9872
Epoch 35/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0395 - accuracy: 0.9877 - val_loss: 0.0418 - val_accuracy: 0.9877
Epoch 36/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0390 - accuracy: 0.9878 - val_loss: 0.0420 - val_accuracy: 0.9872
Epoch 37/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0389 - accuracy: 0.9876 - val_loss: 0.0397 - val_accuracy: 0.9881
Epoch 38/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0384 - accuracy: 0.9880 - val_loss: 0.0415 - val_accuracy: 0.9875
Epoch 39/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0383 - accuracy: 0.9877 - val_loss: 0.0393 - val_accuracy: 0.9879
Epoch 40/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0377 - accuracy: 0.9883 - val_loss: 0.0392 - val_accuracy: 0.9883
Epoch 41/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0376 - accuracy: 0.9884 - val_loss: 0.0398 - val_accuracy: 0.9876
Epoch 42/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0366 - accuracy: 0.9886 - val_loss: 0.0390 - val_accuracy: 0.9885
Epoch 43/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0367 - accuracy: 0.9885 - val_loss: 0.0380 - val_accuracy: 0.9890
Epoch 44/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0363 - accuracy: 0.9886 - val_loss: 0.0384 - val_accuracy: 0.9889
Epoch 45/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0359 - accuracy: 0.9889 - val_loss: 0.0376 - val_accuracy: 0.9894
Epoch 46/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0360 - accuracy: 0.9887 - val_loss: 0.0387 - val_accuracy: 0.9899
Epoch 47/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0353 - accuracy: 0.9890 - val_loss: 0.0388 - val_accuracy: 0.9877
Epoch 48/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0350 - accuracy: 0.9890 - val_loss: 0.0378 - val_accuracy: 0.9890
Epoch 49/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0349 - accuracy: 0.9891 - val_loss: 0.0426 - val_accuracy: 0.9868
Epoch 50/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0344 - accuracy: 0.9890 - val_loss: 0.0375 - val_accuracy: 0.9884
Epoch 51/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0341 - accuracy: 0.9894 - val_loss: 0.0366 - val_accuracy: 0.9905
Epoch 52/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0338 - accuracy: 0.9893 - val_loss: 0.0370 - val_accuracy: 0.9904
Epoch 53/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0336 - accuracy: 0.9900 - val_loss: 0.0361 - val_accuracy: 0.9891
Epoch 54/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0331 - accuracy: 0.9897 - val_loss: 0.0375 - val_accuracy: 0.9883
Epoch 55/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0330 - accuracy: 0.9896 - val_loss: 0.0355 - val_accuracy: 0.9900
Epoch 56/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0326 - accuracy: 0.9898 - val_loss: 0.0354 - val_accuracy: 0.9899
Epoch 57/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0321 - accuracy: 0.9900 - val_loss: 0.0357 - val_accuracy: 0.9885
Epoch 58/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0320 - accuracy: 0.9902 - val_loss: 0.0361 - val_accuracy: 0.9889
Epoch 59/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0313 - accuracy: 0.9902 - val_loss: 0.0345 - val_accuracy: 0.9905
Epoch 60/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0311 - accuracy: 0.9903 - val_loss: 0.0350 - val_accuracy: 0.9896
Epoch 61/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0310 - accuracy: 0.9901 - val_loss: 0.0334 - val_accuracy: 0.9910
Epoch 62/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0304 - accuracy: 0.9906 - val_loss: 0.0325 - val_accuracy: 0.9911
Epoch 63/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0302 - accuracy: 0.9905 - val_loss: 0.0340 - val_accuracy: 0.9908
Epoch 64/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0296 - accuracy: 0.9910 - val_loss: 0.0324 - val_accuracy: 0.9912
Epoch 65/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0291 - accuracy: 0.9912 - val_loss: 0.0318 - val_accuracy: 0.9912
Epoch 66/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0289 - accuracy: 0.9913 - val_loss: 0.0326 - val_accuracy: 0.9902
Epoch 67/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0285 - accuracy: 0.9914 - val_loss: 0.0326 - val_accuracy: 0.9905
Epoch 68/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0282 - accuracy: 0.9916 - val_loss: 0.0316 - val_accuracy: 0.9914
Epoch 69/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0283 - accuracy: 0.9915 - val_loss: 0.0311 - val_accuracy: 0.9911
Epoch 70/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0273 - accuracy: 0.9917 - val_loss: 0.0308 - val_accuracy: 0.9918
Epoch 71/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0272 - accuracy: 0.9919 - val_loss: 0.0313 - val_accuracy: 0.9912
Epoch 72/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0269 - accuracy: 0.9921 - val_loss: 0.0305 - val_accuracy: 0.9918
Epoch 73/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0267 - accuracy: 0.9922 - val_loss: 0.0335 - val_accuracy: 0.9910
Epoch 74/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0267 - accuracy: 0.9921 - val_loss: 0.0311 - val_accuracy: 0.9908
Epoch 75/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0259 - accuracy: 0.9922 - val_loss: 0.0301 - val_accuracy: 0.9919
Epoch 76/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0256 - accuracy: 0.9928 - val_loss: 0.0297 - val_accuracy: 0.9919
Epoch 77/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0255 - accuracy: 0.9928 - val_loss: 0.0310 - val_accuracy: 0.9915
Epoch 78/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0256 - accuracy: 0.9924 - val_loss: 0.0295 - val_accuracy: 0.9919
Epoch 79/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0250 - accuracy: 0.9926 - val_loss: 0.0289 - val_accuracy: 0.9924
Epoch 80/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0244 - accuracy: 0.9929 - val_loss: 0.0283 - val_accuracy: 0.9921
Epoch 81/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0242 - accuracy: 0.9931 - val_loss: 0.0285 - val_accuracy: 0.9924
Epoch 82/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0243 - accuracy: 0.9930 - val_loss: 0.0292 - val_accuracy: 0.9916
Epoch 83/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0240 - accuracy: 0.9928 - val_loss: 0.0293 - val_accuracy: 0.9918
Epoch 84/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0238 - accuracy: 0.9933 - val_loss: 0.0307 - val_accuracy: 0.9906
Epoch 85/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0235 - accuracy: 0.9935 - val_loss: 0.0278 - val_accuracy: 0.9923
Epoch 86/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0231 - accuracy: 0.9934 - val_loss: 0.0288 - val_accuracy: 0.9918
Epoch 87/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0230 - accuracy: 0.9936 - val_loss: 0.0281 - val_accuracy: 0.9923
Epoch 88/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0229 - accuracy: 0.9936 - val_loss: 0.0269 - val_accuracy: 0.9931
Epoch 89/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0225 - accuracy: 0.9934 - val_loss: 0.0269 - val_accuracy: 0.9927
Epoch 90/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0224 - accuracy: 0.9937 - val_loss: 0.0267 - val_accuracy: 0.9923
Epoch 91/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0226 - accuracy: 0.9935 - val_loss: 0.0284 - val_accuracy: 0.9921
Epoch 92/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0221 - accuracy: 0.9935 - val_loss: 0.0265 - val_accuracy: 0.9927
Epoch 93/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0221 - accuracy: 0.9939 - val_loss: 0.0262 - val_accuracy: 0.9930
Epoch 94/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0215 - accuracy: 0.9940 - val_loss: 0.0260 - val_accuracy: 0.9931
Epoch 95/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0212 - accuracy: 0.9942 - val_loss: 0.0368 - val_accuracy: 0.9901
Epoch 96/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0214 - accuracy: 0.9941 - val_loss: 0.0265 - val_accuracy: 0.9930
Epoch 97/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0214 - accuracy: 0.9940 - val_loss: 0.0269 - val_accuracy: 0.9929
Epoch 98/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0210 - accuracy: 0.9941 - val_loss: 0.0284 - val_accuracy: 0.9918
Epoch 99/300
1000/1000 [==============================] - 2s 2ms/step - loss: 0.0210 - accuracy: 0.9944 - val_loss: 0.0260 - val_accuracy: 0.9930
In [12]:
model.evaluate(Xval, yval)
ypred = (model.predict(Xval)>0.5).astype('int')
cm = confusion_matrix(yval, ypred)

# Row-normalize so that each row (true class) sums to 1
cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]

fig = plt.figure(figsize = (4, 4))
ax = fig.add_subplot(111)

for i in range(cm.shape[1]):
    for j in range(cm.shape[0]):
        if cm[i,j] > 0.8:
            clr = "white"
        else:
            clr = "black"
        ax.text(j, i, format(cm[i, j], '.2f'), horizontalalignment="center", color=clr)

_ = ax.imshow(cm, cmap=plt.cm.Blues)
ax.set_xticks(range(len(class_names)))
ax.set_yticks(range(len(class_names)))
ax.set_xticklabels(class_names, rotation = 90)
ax.set_yticklabels(class_names)
plt.xlabel('Predicted')
plt.ylabel('True')
plt.show()
250/250 [==============================] - 0s 976us/step - loss: 0.0260 - accuracy: 0.9931
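Row-normalizing the confusion matrix, as done above, turns raw counts into per-class recall rates (each row sums to 1). A sketch with a made-up 2x2 count matrix:

```python
import numpy as np

# Made-up counts: rows = true class, columns = predicted class
cm = np.array([[3960,   40],
               [  24, 3976]])

cm_norm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]
print(cm_norm.round(2))  # diagonal entries are the per-class recall
```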

Graphs

In [13]:
def plot(history, variable1, variable2):
    plt.plot(range(len(history[variable1])), history[variable1])
    plt.plot(range(len(history[variable2])), history[variable2])
    plt.legend([variable1, variable2])
    plt.title(variable1)


plot(history.history, "loss", 'val_loss')
In [18]:
plot(history.history, "accuracy", 'val_accuracy')
In [14]:
# pick a random sample index within a batch (batch_size = 64)
x = random.randint(0, batch_size - 1)

for img, label in val_ds.as_numpy_iterator():
    plt.axis('off')   # remove axes
    plt.imshow(img[x])    # shape (64, 28, 28, 1) --> (28, 28, 1)
    # scale to [0, 1] to match the training preprocessing; input shape (28, 28, 1) --> (1, 28, 28, 1)
    output = model.predict(np.expand_dims(img[x] / 255, 0))[0][0]
    pred = (output > 0.5).astype('int')
    print("Predicted: ", class_names[pred], '(', output, '-->', pred, ')')    # picking the label from class_names based on the model output
    print("True: ", class_names[label[x]])
    break
Predicted:  Positive ( 1.0 --> 1 )
True:  Positive
In [15]:
# pick a random sample index within a batch (batch_size = 64)
x = random.randint(0, batch_size - 1)

for img, label in val_ds.as_numpy_iterator():
    plt.axis('off')   # remove axes
    plt.imshow(img[x])    # shape (64, 28, 28, 1) --> (28, 28, 1)
    # scale to [0, 1] to match the training preprocessing; input shape (28, 28, 1) --> (1, 28, 28, 1)
    output = model.predict(np.expand_dims(img[x] / 255, 0))[0][0]
    pred = (output > 0.5).astype('int')
    print("Predicted: ", class_names[pred], '(', output, '-->', pred, ')')    # picking the label from class_names based on the model output
    print("True: ", class_names[label[x]])
    break
Predicted:  Positive ( 1.0 --> 1 )
True:  Positive
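The sigmoid output is a single probability-like score; thresholding it at 0.5 picks the class. The thresholding step above, in isolation:

```python
class_names = ['Negative', 'Positive']

def predict_label(output, threshold=0.5):
    # output: sigmoid activation in [0, 1]; above the threshold means 'Positive'
    pred = int(output > threshold)
    return class_names[pred]

print(predict_label(0.97))  # Positive
print(predict_label(0.12))  # Negative
```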
In [16]:
model.save("road_crack.h5")

DeepCC

In [17]:
!deepCC "road_crack.h5"
[INFO]
Reading [keras model] 'road_crack.h5'
[SUCCESS]
Saved 'road_crack_deepC/road_crack.onnx'
[INFO]
Reading [onnx model] 'road_crack_deepC/road_crack.onnx'
[INFO]
Model info:
  ir_vesion : 5
  doc       : 
[WARNING]
[ONNX]: terminal (input/output) conv2d_input's shape is less than 1. Changing it to 1.
[WARNING]
[ONNX]: terminal (input/output) dense_1's shape is less than 1. Changing it to 1.
WARN (GRAPH): found operator node with the same name (dense_1) as io node.
[INFO]
Running DNNC graph sanity check ...
[SUCCESS]
Passed sanity check.
[INFO]
Writing C++ file 'road_crack_deepC/road_crack.cpp'
[INFO]
deepSea model files are ready in 'road_crack_deepC/' 
[RUNNING COMMAND]
g++ -std=c++11 -O3 -fno-rtti -fno-exceptions -I. -I/opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/include -isystem /opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/packages/eigen-eigen-323c052e1731 "road_crack_deepC/road_crack.cpp" -D_AITS_MAIN -o "road_crack_deepC/road_crack.exe"
[RUNNING COMMAND]
size "road_crack_deepC/road_crack.exe"
   text	   data	    bss	    dec	    hex	filename
 200811	   3768	    760	 205339	  3221b	road_crack_deepC/road_crack.exe
[SUCCESS]
Saved model as executable "road_crack_deepC/road_crack.exe"