
Fuel consumption prediction

Credit: AITS Cainvas Community

Photo by Tim Constantinov on Dribbble

Taking into account multiple factors such as distance, speed, temperatures inside and outside the car, AC usage, and other weather conditions, the goal is to predict the consumption of different types of fuel during drives.

In [1]:
import pandas as pd
import numpy as np
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.model_selection import train_test_split
from keras import models, optimizers, losses, layers, callbacks
from sklearn.metrics import confusion_matrix, auc, roc_auc_score, r2_score
import matplotlib.pyplot as plt
import warnings
warnings.simplefilter("ignore")

The dataset

On Kaggle by Andreas Wagener

The dataset is a CSV file recorded by Andreas on his everyday rides, noting factors such as the weather (raining or warm), the temperatures inside and outside the car, and the average speed, in addition to the distance travelled and the corresponding fuel consumed per 100 km. The values are recorded for two different types of fuel - E10 and SP98.

(A similar dataset could be used for consumption prediction for any type of fuel.)

In [2]:
df = pd.read_csv('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/measurements.csv')
df
Out[2]:
distance consume speed temp_inside temp_outside specials gas_type AC rain sun refill liters refill gas
0 28 5 26 21,5 12 NaN E10 0 0 0 45 E10
1 12 4,2 30 21,5 13 NaN E10 0 0 0 NaN NaN
2 11,2 5,5 38 21,5 15 NaN E10 0 0 0 NaN NaN
3 12,9 3,9 36 21,5 14 NaN E10 0 0 0 NaN NaN
4 18,5 4,5 46 21,5 15 NaN E10 0 0 0 NaN NaN
... ... ... ... ... ... ... ... ... ... ... ... ...
383 16 3,7 39 24,5 18 NaN SP98 0 0 0 NaN NaN
384 16,1 4,3 38 25 31 AC SP98 1 0 0 NaN NaN
385 16 3,8 45 25 19 NaN SP98 0 0 0 NaN NaN
386 15,4 4,6 42 25 31 AC SP98 1 0 0 NaN NaN
387 14,7 5 25 25 30 AC SP98 1 0 0 NaN NaN

388 rows × 12 columns

Data preprocessing

Checking for NA values

In [3]:
df.isna().sum()
Out[3]:
distance           0
consume            0
speed              0
temp_inside       12
temp_outside       0
specials         295
gas_type           0
AC                 0
rain               0
sun                0
refill liters    375
refill gas       375
dtype: int64

Filling NA 'temp_inside' values with the mode of the column within each gas_type category.

In [4]:
df = df.sort_values('gas_type')
temp_inside = []
for t in df['gas_type'].unique():
    temp_inside.extend(df[df['gas_type']==t]['temp_inside'].fillna(df[df['gas_type']==t]['temp_inside'].mode()[0]))

df['temp_inside'] = temp_inside

# df.isna().sum()
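
An equivalent, shorter way to do this per-category fill (a sketch, not part of the original notebook) uses groupby with transform:

# Sketch: fill missing 'temp_inside' with the mode of its gas_type group,
# equivalent to the loop above (assumes df is already loaded).
df['temp_inside'] = df.groupby('gas_type')['temp_inside'].transform(
    lambda s: s.fillna(s.mode()[0])
)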

Filling NA 'refill liters' values with the string '0' (the column holds comma-decimal strings at this point and is converted to numeric later).

In [5]:
df['refill liters'] = df['refill liters'].fillna('0')

# df.isna().sum()

One hot encoding the 'refill gas' and 'gas_type' columns.

With dummy_na = True, rows with NA in 'refill gas' get 0 in each of its dummy columns; the extra 'refill gas_nan' indicator is dropped below.

Dropping unnecessary columns

In [6]:
df = pd.get_dummies(df, columns = ['refill gas'], dummy_na = True)
df = pd.get_dummies(df, columns = ['gas_type'], drop_first=True)

df = df.drop(columns = ['specials','refill gas_nan'], axis = 1)

df
Out[6]:
distance consume speed temp_inside temp_outside AC rain sun refill liters refill gas_E10 refill gas_SP98 gas_type_SP98
0 28 5 26 21,5 12 0 0 0 45 1 0 0
159 39,4 5,3 60 21,5 9 0 1 0 0 0 0 0
160 5,1 8,1 39 21,5 4 0 0 0 0 0 0 0
161 26,6 4,8 38 21,5 7 0 0 0 0 0 0 0
162 53,2 5,1 71 21,5 2 0 0 0 0 0 0 0
... ... ... ... ... ... ... ... ... ... ... ... ...
133 11,8 4,5 43 21,5 3 0 0 0 0 0 0 1
132 16,1 4,5 33 21,5 6 0 0 0 0 0 0 1
131 5,1 6,4 39 21,5 4 0 0 0 0 0 0 1
137 11,8 4,5 38 21,5 5 0 1 0 0 0 0 1
387 14,7 5 25 25 30 1 0 0 0 0 0 1

388 rows × 12 columns

Checking the datatypes

In [7]:
df.dtypes
Out[7]:
distance           object
consume            object
speed               int64
temp_inside        object
temp_outside        int64
AC                  int64
rain                int64
sun                 int64
refill liters      object
refill gas_E10      uint8
refill gas_SP98     uint8
gas_type_SP98       uint8
dtype: object

In certain columns, the decimal point is represented using ',' instead of '.', so those values were read as strings.

Replacing the ',' with '.' and converting the values to float.

In [8]:
numeric_columns = ['distance','consume','temp_inside','refill liters'] # columns with ','

for column in numeric_columns:
    #print(column)
    x = [x.replace(',','.') for x in df[column].values]
    df[column] = list(map(np.float32, x))
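
Alternatively, pandas can treat ',' as the decimal separator at load time, which would avoid this replacement for the purely numeric columns (a sketch using the same CSV URL; not how the notebook proceeds):

# Sketch: read the CSV with ',' as the decimal separator.
df_alt = pd.read_csv('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/measurements.csv', decimal=',')
# With this, the earlier NA fill could use the number 0 instead of the string '0'.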

Extending the numeric_columns list to include the remaining numeric columns.

In [9]:
numeric_columns.extend(['speed','temp_outside'])

Changing the datatypes of the columns.

In [10]:
for column in df.columns:
    if column in numeric_columns:
        df[column] = df[column].astype('float32')
    else:
        df[column] = df[column].astype('int64')
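
A quick check (a sketch added here) confirms that no object-typed columns remain after the conversion:

# Sketch: verify every column is now numeric (float32 or int64).
assert all(dtype != 'object' for dtype in df.dtypes)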

Train val split

In [11]:
# Splitting into train and val set -- 90-10 split

train_df, val_df = train_test_split(df, test_size = 0.1)

print("Number of samples in...")
print("Training set: ", len(train_df))
print("Validation set: ", len(val_df))
Number of samples in...
Training set:  349
Validation set:  39
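
The split above is random on every run; passing a fixed seed would make it reproducible (a sketch with an arbitrary seed value, not used in the notebook):

# Sketch: reproducible 90-10 split; the seed value 42 is arbitrary.
train_df, val_df = train_test_split(df, test_size = 0.1, random_state = 42)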

Scaling the values

The values in the feature columns are not on the same scale, so they are standardized.

In [12]:
numeric_columns.remove('consume')

ss = StandardScaler()
train_df[numeric_columns] = ss.fit_transform(train_df[numeric_columns])
val_df[numeric_columns] = ss.transform(val_df[numeric_columns])
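
The scaler is fit on the training set only and then applied to the validation set, so no validation statistics leak into training. The fitted scaler can also map the scaled features back to their original units (a sketch, assuming the ss object above):

# Sketch: recover the unscaled feature values of the validation set.
val_unscaled = ss.inverse_transform(val_df[numeric_columns])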

Defining the input and output columns

In [13]:
input_columns = df.columns.tolist()
input_columns.remove('consume')

output_columns = ['consume']
In [14]:
# Splitting into X (input) and y (output)

Xtrain, ytrain = np.array(train_df[input_columns]), np.array(train_df[output_columns])

Xval, yval = np.array(val_df[input_columns]), np.array(val_df[output_columns])
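
A quick shape check (a sketch) confirms the arrays line up with the 90-10 split and the 11 input features:

# Sketch: expected shapes are (349, 11)/(349, 1) and (39, 11)/(39, 1).
print(Xtrain.shape, ytrain.shape)
print(Xval.shape, yval.shape)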

The model

In [15]:
model = models.Sequential([
    layers.Dense(1024, activation = 'relu', input_shape = Xtrain[0].shape),
    layers.Dense(512, activation = 'relu'),
    layers.Dense(256, activation = 'relu'),
    layers.Dense(64, activation = 'relu'),
    layers.Dense(16, activation = 'relu'),
    layers.Dense(1)
])

cb = callbacks.EarlyStopping(patience = 20, restore_best_weights = True)
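
The EarlyStopping callback monitors val_loss by default, stops training once it fails to improve for 20 epochs, and restores the weights from the best epoch. A summary call (a sketch, not run in the notebook) shows the layer sizes and parameter counts before training:

# Sketch: inspect the architecture; with 11 input features the first Dense
# layer alone has (11 + 1) * 1024 = 12288 parameters.
model.summary()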
In [16]:
model.compile(optimizer = optimizers.Adam(0.00001), loss = losses.MeanSquaredError(), metrics = ['mae'])

history = model.fit(Xtrain, ytrain, validation_data = (Xval, yval), epochs = 512, callbacks = cb)
Epoch 1/512
11/11 [==============================] - 0s 13ms/step - loss: 24.6851 - mae: 4.8533 - val_loss: 25.6297 - val_mae: 5.0164
Epoch 2/512
11/11 [==============================] - 0s 3ms/step - loss: 24.2673 - mae: 4.8105 - val_loss: 25.2043 - val_mae: 4.9734
Epoch 3/512
11/11 [==============================] - 0s 3ms/step - loss: 23.8519 - mae: 4.7672 - val_loss: 24.7735 - val_mae: 4.9296
Epoch 4/512
11/11 [==============================] - 0s 3ms/step - loss: 23.4383 - mae: 4.7237 - val_loss: 24.3134 - val_mae: 4.8819
Epoch 5/512
11/11 [==============================] - 0s 3ms/step - loss: 23.0042 - mae: 4.6772 - val_loss: 23.8304 - val_mae: 4.8311
Epoch 6/512
11/11 [==============================] - 0s 3ms/step - loss: 22.5339 - mae: 4.6257 - val_loss: 23.3020 - val_mae: 4.7745
Epoch 7/512
11/11 [==============================] - 0s 3ms/step - loss: 22.0118 - mae: 4.5684 - val_loss: 22.7364 - val_mae: 4.7128
Epoch 8/512
11/11 [==============================] - 0s 3ms/step - loss: 21.4521 - mae: 4.5054 - val_loss: 22.1283 - val_mae: 4.6453
Epoch 9/512
11/11 [==============================] - 0s 3ms/step - loss: 20.8502 - mae: 4.4358 - val_loss: 21.4783 - val_mae: 4.5717
Epoch 10/512
11/11 [==============================] - 0s 3ms/step - loss: 20.2050 - mae: 4.3602 - val_loss: 20.7879 - val_mae: 4.4917
Epoch 11/512
11/11 [==============================] - 0s 3ms/step - loss: 19.5216 - mae: 4.2787 - val_loss: 20.0688 - val_mae: 4.4064
Epoch 12/512
11/11 [==============================] - 0s 3ms/step - loss: 18.8164 - mae: 4.1915 - val_loss: 19.2918 - val_mae: 4.3114
Epoch 13/512
11/11 [==============================] - 0s 3ms/step - loss: 18.0702 - mae: 4.0960 - val_loss: 18.4831 - val_mae: 4.2097
Epoch 14/512
11/11 [==============================] - 0s 3ms/step - loss: 17.2787 - mae: 3.9929 - val_loss: 17.6605 - val_mae: 4.1023
Epoch 15/512
11/11 [==============================] - 0s 3ms/step - loss: 16.4663 - mae: 3.8836 - val_loss: 16.8033 - val_mae: 3.9864
Epoch 16/512
11/11 [==============================] - 0s 3ms/step - loss: 15.6348 - mae: 3.7658 - val_loss: 15.9115 - val_mae: 3.8844
Epoch 17/512
11/11 [==============================] - 0s 3ms/step - loss: 14.7616 - mae: 3.6395 - val_loss: 15.0028 - val_mae: 3.7763
Epoch 18/512
11/11 [==============================] - 0s 3ms/step - loss: 13.8838 - mae: 3.5078 - val_loss: 14.0698 - val_mae: 3.6591
Epoch 19/512
11/11 [==============================] - 0s 3ms/step - loss: 12.9786 - mae: 3.3740 - val_loss: 13.1470 - val_mae: 3.5360
Epoch 20/512
11/11 [==============================] - 0s 3ms/step - loss: 12.0866 - mae: 3.2355 - val_loss: 12.2299 - val_mae: 3.4051
Epoch 21/512
11/11 [==============================] - 0s 3ms/step - loss: 11.2265 - mae: 3.0979 - val_loss: 11.3254 - val_mae: 3.2666
Epoch 22/512
11/11 [==============================] - 0s 3ms/step - loss: 10.3768 - mae: 2.9558 - val_loss: 10.4665 - val_mae: 3.1243
Epoch 23/512
11/11 [==============================] - 0s 3ms/step - loss: 9.5541 - mae: 2.8252 - val_loss: 9.6622 - val_mae: 2.9797
Epoch 24/512
11/11 [==============================] - 0s 3ms/step - loss: 8.8082 - mae: 2.6951 - val_loss: 8.8857 - val_mae: 2.8284
Epoch 25/512
11/11 [==============================] - 0s 3ms/step - loss: 8.0766 - mae: 2.5707 - val_loss: 8.1810 - val_mae: 2.6784
Epoch 26/512
11/11 [==============================] - 0s 3ms/step - loss: 7.4297 - mae: 2.4525 - val_loss: 7.5198 - val_mae: 2.5244
Epoch 27/512
11/11 [==============================] - 0s 3ms/step - loss: 6.8240 - mae: 2.3366 - val_loss: 6.9238 - val_mae: 2.3803
Epoch 28/512
11/11 [==============================] - 0s 3ms/step - loss: 6.2734 - mae: 2.2244 - val_loss: 6.3841 - val_mae: 2.2521
Epoch 29/512
11/11 [==============================] - 0s 3ms/step - loss: 5.7999 - mae: 2.1186 - val_loss: 5.8902 - val_mae: 2.1248
Epoch 30/512
11/11 [==============================] - 0s 3ms/step - loss: 5.3589 - mae: 2.0199 - val_loss: 5.4523 - val_mae: 2.0153
Epoch 31/512
11/11 [==============================] - 0s 3ms/step - loss: 4.9754 - mae: 1.9302 - val_loss: 5.0682 - val_mae: 1.9123
Epoch 32/512
11/11 [==============================] - 0s 3ms/step - loss: 4.6372 - mae: 1.8447 - val_loss: 4.7238 - val_mae: 1.8200
Epoch 33/512
11/11 [==============================] - 0s 3ms/step - loss: 4.3376 - mae: 1.7733 - val_loss: 4.4194 - val_mae: 1.7299
Epoch 34/512
11/11 [==============================] - 0s 3ms/step - loss: 4.0778 - mae: 1.7052 - val_loss: 4.1590 - val_mae: 1.6470
Epoch 35/512
11/11 [==============================] - 0s 3ms/step - loss: 3.8488 - mae: 1.6437 - val_loss: 3.9035 - val_mae: 1.5745
Epoch 36/512
11/11 [==============================] - 0s 4ms/step - loss: 3.6263 - mae: 1.5891 - val_loss: 3.7000 - val_mae: 1.5150
Epoch 37/512
11/11 [==============================] - 0s 3ms/step - loss: 3.4493 - mae: 1.5424 - val_loss: 3.4902 - val_mae: 1.4576
Epoch 38/512
11/11 [==============================] - 0s 3ms/step - loss: 3.2838 - mae: 1.4982 - val_loss: 3.3286 - val_mae: 1.4015
Epoch 39/512
11/11 [==============================] - 0s 3ms/step - loss: 3.1451 - mae: 1.4592 - val_loss: 3.1681 - val_mae: 1.3617
Epoch 40/512
11/11 [==============================] - 0s 3ms/step - loss: 3.0121 - mae: 1.4237 - val_loss: 3.0116 - val_mae: 1.3330
Epoch 41/512
11/11 [==============================] - 0s 3ms/step - loss: 2.8935 - mae: 1.3936 - val_loss: 2.8848 - val_mae: 1.3055
Epoch 42/512
11/11 [==============================] - 0s 3ms/step - loss: 2.7925 - mae: 1.3654 - val_loss: 2.7811 - val_mae: 1.2801
Epoch 43/512
11/11 [==============================] - 0s 3ms/step - loss: 2.7026 - mae: 1.3408 - val_loss: 2.6707 - val_mae: 1.2569
Epoch 44/512
11/11 [==============================] - 0s 3ms/step - loss: 2.6189 - mae: 1.3189 - val_loss: 2.5754 - val_mae: 1.2360
Epoch 45/512
11/11 [==============================] - 0s 3ms/step - loss: 2.5469 - mae: 1.2998 - val_loss: 2.4998 - val_mae: 1.2187
Epoch 46/512
11/11 [==============================] - 0s 3ms/step - loss: 2.4825 - mae: 1.2808 - val_loss: 2.4041 - val_mae: 1.2036
Epoch 47/512
11/11 [==============================] - 0s 3ms/step - loss: 2.4191 - mae: 1.2642 - val_loss: 2.3368 - val_mae: 1.1881
Epoch 48/512
11/11 [==============================] - 0s 3ms/step - loss: 2.3630 - mae: 1.2488 - val_loss: 2.2749 - val_mae: 1.1753
Epoch 49/512
11/11 [==============================] - 0s 3ms/step - loss: 2.3169 - mae: 1.2351 - val_loss: 2.1983 - val_mae: 1.1626
Epoch 50/512
11/11 [==============================] - 0s 3ms/step - loss: 2.2684 - mae: 1.2230 - val_loss: 2.1408 - val_mae: 1.1511
Epoch 51/512
11/11 [==============================] - 0s 3ms/step - loss: 2.2289 - mae: 1.2117 - val_loss: 2.0864 - val_mae: 1.1396
Epoch 52/512
11/11 [==============================] - 0s 3ms/step - loss: 2.1908 - mae: 1.2008 - val_loss: 2.0290 - val_mae: 1.1281
Epoch 53/512
11/11 [==============================] - 0s 3ms/step - loss: 2.1548 - mae: 1.1899 - val_loss: 1.9817 - val_mae: 1.1171
Epoch 54/512
11/11 [==============================] - 0s 3ms/step - loss: 2.1224 - mae: 1.1806 - val_loss: 1.9498 - val_mae: 1.1066
Epoch 55/512
11/11 [==============================] - 0s 3ms/step - loss: 2.0916 - mae: 1.1717 - val_loss: 1.9096 - val_mae: 1.0965
Epoch 56/512
11/11 [==============================] - 0s 3ms/step - loss: 2.0623 - mae: 1.1618 - val_loss: 1.8739 - val_mae: 1.0873
Epoch 57/512
11/11 [==============================] - 0s 3ms/step - loss: 2.0383 - mae: 1.1541 - val_loss: 1.8364 - val_mae: 1.0765
Epoch 58/512
11/11 [==============================] - 0s 3ms/step - loss: 2.0101 - mae: 1.1461 - val_loss: 1.7968 - val_mae: 1.0676
Epoch 59/512
11/11 [==============================] - 0s 3ms/step - loss: 1.9838 - mae: 1.1391 - val_loss: 1.7640 - val_mae: 1.0594
Epoch 60/512
11/11 [==============================] - 0s 3ms/step - loss: 1.9606 - mae: 1.1330 - val_loss: 1.7375 - val_mae: 1.0507
Epoch 61/512
11/11 [==============================] - 0s 3ms/step - loss: 1.9356 - mae: 1.1265 - val_loss: 1.7178 - val_mae: 1.0432
Epoch 62/512
11/11 [==============================] - 0s 3ms/step - loss: 1.9133 - mae: 1.1208 - val_loss: 1.6915 - val_mae: 1.0354
Epoch 63/512
11/11 [==============================] - 0s 3ms/step - loss: 1.8909 - mae: 1.1146 - val_loss: 1.6683 - val_mae: 1.0280
Epoch 64/512
11/11 [==============================] - 0s 3ms/step - loss: 1.8702 - mae: 1.1088 - val_loss: 1.6422 - val_mae: 1.0200
Epoch 65/512
11/11 [==============================] - 0s 3ms/step - loss: 1.8503 - mae: 1.1031 - val_loss: 1.6226 - val_mae: 1.0138
Epoch 66/512
11/11 [==============================] - 0s 3ms/step - loss: 1.8314 - mae: 1.0981 - val_loss: 1.5963 - val_mae: 1.0069
Epoch 67/512
11/11 [==============================] - 0s 3ms/step - loss: 1.8114 - mae: 1.0929 - val_loss: 1.5819 - val_mae: 1.0008
Epoch 68/512
11/11 [==============================] - 0s 3ms/step - loss: 1.7922 - mae: 1.0873 - val_loss: 1.5557 - val_mae: 0.9939
Epoch 69/512
11/11 [==============================] - 0s 3ms/step - loss: 1.7736 - mae: 1.0818 - val_loss: 1.5322 - val_mae: 0.9871
Epoch 70/512
11/11 [==============================] - 0s 3ms/step - loss: 1.7556 - mae: 1.0768 - val_loss: 1.5185 - val_mae: 0.9808
Epoch 71/512
11/11 [==============================] - 0s 3ms/step - loss: 1.7378 - mae: 1.0714 - val_loss: 1.4982 - val_mae: 0.9748
Epoch 72/512
11/11 [==============================] - 0s 3ms/step - loss: 1.7184 - mae: 1.0654 - val_loss: 1.4806 - val_mae: 0.9688
Epoch 73/512
11/11 [==============================] - 0s 3ms/step - loss: 1.7018 - mae: 1.0602 - val_loss: 1.4622 - val_mae: 0.9627
Epoch 74/512
11/11 [==============================] - 0s 3ms/step - loss: 1.6847 - mae: 1.0548 - val_loss: 1.4495 - val_mae: 0.9582
Epoch 75/512
11/11 [==============================] - 0s 3ms/step - loss: 1.6676 - mae: 1.0489 - val_loss: 1.4337 - val_mae: 0.9533
Epoch 76/512
11/11 [==============================] - 0s 3ms/step - loss: 1.6524 - mae: 1.0438 - val_loss: 1.4128 - val_mae: 0.9478
Epoch 77/512
11/11 [==============================] - 0s 3ms/step - loss: 1.6356 - mae: 1.0385 - val_loss: 1.3991 - val_mae: 0.9435
Epoch 78/512
11/11 [==============================] - 0s 3ms/step - loss: 1.6202 - mae: 1.0335 - val_loss: 1.3807 - val_mae: 0.9388
Epoch 79/512
11/11 [==============================] - 0s 3ms/step - loss: 1.6036 - mae: 1.0282 - val_loss: 1.3685 - val_mae: 0.9347
Epoch 80/512
11/11 [==============================] - 0s 3ms/step - loss: 1.5890 - mae: 1.0228 - val_loss: 1.3521 - val_mae: 0.9299
Epoch 81/512
11/11 [==============================] - 0s 3ms/step - loss: 1.5736 - mae: 1.0173 - val_loss: 1.3448 - val_mae: 0.9283
Epoch 82/512
11/11 [==============================] - 0s 3ms/step - loss: 1.5576 - mae: 1.0118 - val_loss: 1.3263 - val_mae: 0.9230
Epoch 83/512
11/11 [==============================] - 0s 3ms/step - loss: 1.5437 - mae: 1.0065 - val_loss: 1.3091 - val_mae: 0.9183
Epoch 84/512
11/11 [==============================] - 0s 3ms/step - loss: 1.5289 - mae: 1.0014 - val_loss: 1.2960 - val_mae: 0.9149
Epoch 85/512
11/11 [==============================] - 0s 3ms/step - loss: 1.5153 - mae: 0.9970 - val_loss: 1.2880 - val_mae: 0.9126
Epoch 86/512
11/11 [==============================] - 0s 3ms/step - loss: 1.5007 - mae: 0.9916 - val_loss: 1.2689 - val_mae: 0.9073
Epoch 87/512
11/11 [==============================] - 0s 3ms/step - loss: 1.4860 - mae: 0.9858 - val_loss: 1.2583 - val_mae: 0.9039
Epoch 88/512
11/11 [==============================] - 0s 3ms/step - loss: 1.4715 - mae: 0.9809 - val_loss: 1.2511 - val_mae: 0.9015
Epoch 89/512
11/11 [==============================] - 0s 3ms/step - loss: 1.4588 - mae: 0.9761 - val_loss: 1.2348 - val_mae: 0.8968
Epoch 90/512
11/11 [==============================] - 0s 3ms/step - loss: 1.4450 - mae: 0.9709 - val_loss: 1.2272 - val_mae: 0.8943
Epoch 91/512
11/11 [==============================] - 0s 3ms/step - loss: 1.4325 - mae: 0.9654 - val_loss: 1.2101 - val_mae: 0.8889
Epoch 92/512
11/11 [==============================] - 0s 3ms/step - loss: 1.4192 - mae: 0.9603 - val_loss: 1.2024 - val_mae: 0.8859
Epoch 93/512
11/11 [==============================] - 0s 3ms/step - loss: 1.4058 - mae: 0.9555 - val_loss: 1.1934 - val_mae: 0.8835
Epoch 94/512
11/11 [==============================] - 0s 3ms/step - loss: 1.3934 - mae: 0.9510 - val_loss: 1.1839 - val_mae: 0.8811
Epoch 95/512
11/11 [==============================] - 0s 3ms/step - loss: 1.3808 - mae: 0.9453 - val_loss: 1.1663 - val_mae: 0.8747
Epoch 96/512
11/11 [==============================] - 0s 3ms/step - loss: 1.3717 - mae: 0.9417 - val_loss: 1.1632 - val_mae: 0.8753
Epoch 97/512
11/11 [==============================] - 0s 3ms/step - loss: 1.3573 - mae: 0.9364 - val_loss: 1.1496 - val_mae: 0.8708
Epoch 98/512
11/11 [==============================] - 0s 3ms/step - loss: 1.3441 - mae: 0.9310 - val_loss: 1.1373 - val_mae: 0.8664
Epoch 99/512
11/11 [==============================] - 0s 3ms/step - loss: 1.3325 - mae: 0.9261 - val_loss: 1.1290 - val_mae: 0.8641
Epoch 100/512
11/11 [==============================] - 0s 3ms/step - loss: 1.3211 - mae: 0.9214 - val_loss: 1.1208 - val_mae: 0.8616
Epoch 101/512
11/11 [==============================] - 0s 3ms/step - loss: 1.3091 - mae: 0.9166 - val_loss: 1.1085 - val_mae: 0.8570
Epoch 102/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2978 - mae: 0.9122 - val_loss: 1.1036 - val_mae: 0.8557
Epoch 103/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2870 - mae: 0.9076 - val_loss: 1.0953 - val_mae: 0.8531
Epoch 104/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2755 - mae: 0.9030 - val_loss: 1.0858 - val_mae: 0.8497
Epoch 105/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2651 - mae: 0.8985 - val_loss: 1.0767 - val_mae: 0.8469
Epoch 106/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2540 - mae: 0.8938 - val_loss: 1.0712 - val_mae: 0.8455
Epoch 107/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2434 - mae: 0.8896 - val_loss: 1.0657 - val_mae: 0.8434
Epoch 108/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2336 - mae: 0.8860 - val_loss: 1.0613 - val_mae: 0.8422
Epoch 109/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2235 - mae: 0.8816 - val_loss: 1.0473 - val_mae: 0.8370
Epoch 110/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2126 - mae: 0.8765 - val_loss: 1.0351 - val_mae: 0.8315
Epoch 111/512
11/11 [==============================] - 0s 3ms/step - loss: 1.2018 - mae: 0.8719 - val_loss: 1.0269 - val_mae: 0.8278
Epoch 112/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1919 - mae: 0.8682 - val_loss: 1.0305 - val_mae: 0.8306
Epoch 113/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1813 - mae: 0.8651 - val_loss: 1.0217 - val_mae: 0.8271
Epoch 114/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1723 - mae: 0.8610 - val_loss: 1.0148 - val_mae: 0.8244
Epoch 115/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1627 - mae: 0.8559 - val_loss: 1.0028 - val_mae: 0.8186
Epoch 116/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1538 - mae: 0.8522 - val_loss: 1.0027 - val_mae: 0.8206
Epoch 117/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1449 - mae: 0.8486 - val_loss: 0.9968 - val_mae: 0.8181
Epoch 118/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1337 - mae: 0.8435 - val_loss: 0.9840 - val_mae: 0.8125
Epoch 119/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1247 - mae: 0.8401 - val_loss: 0.9807 - val_mae: 0.8128
Epoch 120/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1166 - mae: 0.8376 - val_loss: 0.9810 - val_mae: 0.8136
Epoch 121/512
11/11 [==============================] - 0s 3ms/step - loss: 1.1086 - mae: 0.8328 - val_loss: 0.9618 - val_mae: 0.8041
Epoch 122/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0980 - mae: 0.8288 - val_loss: 0.9619 - val_mae: 0.8056
Epoch 123/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0914 - mae: 0.8261 - val_loss: 0.9646 - val_mae: 0.8086
Epoch 124/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0830 - mae: 0.8218 - val_loss: 0.9500 - val_mae: 0.8002
Epoch 125/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0728 - mae: 0.8170 - val_loss: 0.9401 - val_mae: 0.7953
Epoch 126/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0633 - mae: 0.8125 - val_loss: 0.9372 - val_mae: 0.7947
Epoch 127/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0562 - mae: 0.8110 - val_loss: 0.9387 - val_mae: 0.7971
Epoch 128/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0476 - mae: 0.8083 - val_loss: 0.9378 - val_mae: 0.7975
Epoch 129/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0384 - mae: 0.8038 - val_loss: 0.9239 - val_mae: 0.7902
Epoch 130/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0303 - mae: 0.7988 - val_loss: 0.9181 - val_mae: 0.7872
Epoch 131/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0224 - mae: 0.7960 - val_loss: 0.9191 - val_mae: 0.7885
Epoch 132/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0145 - mae: 0.7926 - val_loss: 0.9068 - val_mae: 0.7822
Epoch 133/512
11/11 [==============================] - 0s 3ms/step - loss: 1.0065 - mae: 0.7885 - val_loss: 0.9074 - val_mae: 0.7832
Epoch 134/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9988 - mae: 0.7855 - val_loss: 0.9055 - val_mae: 0.7825
Epoch 135/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9917 - mae: 0.7825 - val_loss: 0.8995 - val_mae: 0.7801
Epoch 136/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9837 - mae: 0.7786 - val_loss: 0.8889 - val_mae: 0.7740
Epoch 137/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9775 - mae: 0.7740 - val_loss: 0.8766 - val_mae: 0.7680
Epoch 138/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9688 - mae: 0.7703 - val_loss: 0.8790 - val_mae: 0.7697
Epoch 139/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9624 - mae: 0.7687 - val_loss: 0.8831 - val_mae: 0.7718
Epoch 140/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9551 - mae: 0.7646 - val_loss: 0.8672 - val_mae: 0.7648
Epoch 141/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9475 - mae: 0.7618 - val_loss: 0.8739 - val_mae: 0.7698
Epoch 142/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9421 - mae: 0.7581 - val_loss: 0.8622 - val_mae: 0.7636
Epoch 143/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9326 - mae: 0.7535 - val_loss: 0.8605 - val_mae: 0.7628
Epoch 144/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9263 - mae: 0.7499 - val_loss: 0.8516 - val_mae: 0.7587
Epoch 145/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9198 - mae: 0.7481 - val_loss: 0.8624 - val_mae: 0.7657
Epoch 146/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9119 - mae: 0.7448 - val_loss: 0.8513 - val_mae: 0.7596
Epoch 147/512
11/11 [==============================] - 0s 3ms/step - loss: 0.9045 - mae: 0.7403 - val_loss: 0.8460 - val_mae: 0.7564
Epoch 148/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8988 - mae: 0.7376 - val_loss: 0.8478 - val_mae: 0.7587
Epoch 149/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8927 - mae: 0.7336 - val_loss: 0.8325 - val_mae: 0.7502
Epoch 150/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8849 - mae: 0.7300 - val_loss: 0.8352 - val_mae: 0.7527
Epoch 151/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8794 - mae: 0.7277 - val_loss: 0.8366 - val_mae: 0.7539
Epoch 152/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8724 - mae: 0.7245 - val_loss: 0.8304 - val_mae: 0.7502
Epoch 153/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8665 - mae: 0.7211 - val_loss: 0.8193 - val_mae: 0.7441
Epoch 154/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8613 - mae: 0.7162 - val_loss: 0.8144 - val_mae: 0.7411
Epoch 155/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8545 - mae: 0.7137 - val_loss: 0.8134 - val_mae: 0.7416
Epoch 156/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8469 - mae: 0.7109 - val_loss: 0.8150 - val_mae: 0.7429
Epoch 157/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8422 - mae: 0.7099 - val_loss: 0.8171 - val_mae: 0.7445
Epoch 158/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8362 - mae: 0.7060 - val_loss: 0.8058 - val_mae: 0.7384
Epoch 159/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8302 - mae: 0.7018 - val_loss: 0.7979 - val_mae: 0.7339
Epoch 160/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8243 - mae: 0.6989 - val_loss: 0.8036 - val_mae: 0.7372
Epoch 161/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8191 - mae: 0.6983 - val_loss: 0.8032 - val_mae: 0.7378
Epoch 162/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8131 - mae: 0.6936 - val_loss: 0.7936 - val_mae: 0.7329
Epoch 163/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8075 - mae: 0.6909 - val_loss: 0.7905 - val_mae: 0.7314
Epoch 164/512
11/11 [==============================] - 0s 3ms/step - loss: 0.8020 - mae: 0.6872 - val_loss: 0.7811 - val_mae: 0.7271
Epoch 165/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7967 - mae: 0.6848 - val_loss: 0.7858 - val_mae: 0.7291
Epoch 166/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7917 - mae: 0.6829 - val_loss: 0.7848 - val_mae: 0.7294
Epoch 167/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7866 - mae: 0.6786 - val_loss: 0.7713 - val_mae: 0.7224
Epoch 168/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7807 - mae: 0.6750 - val_loss: 0.7733 - val_mae: 0.7235
Epoch 169/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7754 - mae: 0.6739 - val_loss: 0.7745 - val_mae: 0.7242
Epoch 170/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7704 - mae: 0.6709 - val_loss: 0.7662 - val_mae: 0.7199
Epoch 171/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7655 - mae: 0.6692 - val_loss: 0.7666 - val_mae: 0.7199
Epoch 172/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7599 - mae: 0.6664 - val_loss: 0.7603 - val_mae: 0.7170
Epoch 173/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7553 - mae: 0.6631 - val_loss: 0.7576 - val_mae: 0.7155
Epoch 174/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7505 - mae: 0.6603 - val_loss: 0.7550 - val_mae: 0.7137
Epoch 175/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7456 - mae: 0.6571 - val_loss: 0.7531 - val_mae: 0.7128
Epoch 176/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7411 - mae: 0.6552 - val_loss: 0.7484 - val_mae: 0.7108
Epoch 177/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7362 - mae: 0.6541 - val_loss: 0.7560 - val_mae: 0.7147
Epoch 178/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7319 - mae: 0.6530 - val_loss: 0.7520 - val_mae: 0.7119
Epoch 179/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7273 - mae: 0.6508 - val_loss: 0.7387 - val_mae: 0.7049
Epoch 180/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7231 - mae: 0.6464 - val_loss: 0.7350 - val_mae: 0.7039
Epoch 181/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7217 - mae: 0.6492 - val_loss: 0.7475 - val_mae: 0.7098
Epoch 182/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7136 - mae: 0.6434 - val_loss: 0.7348 - val_mae: 0.7025
Epoch 183/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7102 - mae: 0.6391 - val_loss: 0.7266 - val_mae: 0.6969
Epoch 184/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7046 - mae: 0.6374 - val_loss: 0.7316 - val_mae: 0.7008
Epoch 185/512
11/11 [==============================] - 0s 3ms/step - loss: 0.7005 - mae: 0.6373 - val_loss: 0.7326 - val_mae: 0.7005
Epoch 186/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6972 - mae: 0.6374 - val_loss: 0.7355 - val_mae: 0.7021
Epoch 187/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6915 - mae: 0.6331 - val_loss: 0.7204 - val_mae: 0.6936
Epoch 188/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6884 - mae: 0.6285 - val_loss: 0.7174 - val_mae: 0.6919
Epoch 189/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6847 - mae: 0.6289 - val_loss: 0.7251 - val_mae: 0.6955
Epoch 190/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6799 - mae: 0.6285 - val_loss: 0.7234 - val_mae: 0.6939
Epoch 191/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6752 - mae: 0.6240 - val_loss: 0.7093 - val_mae: 0.6863
Epoch 192/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6722 - mae: 0.6212 - val_loss: 0.7019 - val_mae: 0.6822
Epoch 193/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6682 - mae: 0.6193 - val_loss: 0.7084 - val_mae: 0.6854
Epoch 194/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6656 - mae: 0.6197 - val_loss: 0.7107 - val_mae: 0.6878
Epoch 195/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6615 - mae: 0.6182 - val_loss: 0.7066 - val_mae: 0.6848
Epoch 196/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6575 - mae: 0.6162 - val_loss: 0.7037 - val_mae: 0.6826
Epoch 197/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6538 - mae: 0.6143 - val_loss: 0.6956 - val_mae: 0.6784
Epoch 198/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6497 - mae: 0.6114 - val_loss: 0.7015 - val_mae: 0.6808
Epoch 199/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6461 - mae: 0.6098 - val_loss: 0.7035 - val_mae: 0.6805
Epoch 200/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6438 - mae: 0.6083 - val_loss: 0.6907 - val_mae: 0.6768
Epoch 201/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6418 - mae: 0.6066 - val_loss: 0.6978 - val_mae: 0.6777
Epoch 202/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6360 - mae: 0.6043 - val_loss: 0.6879 - val_mae: 0.6758
Epoch 203/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6329 - mae: 0.6027 - val_loss: 0.6905 - val_mae: 0.6773
Epoch 204/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6307 - mae: 0.6008 - val_loss: 0.6856 - val_mae: 0.6760
Epoch 205/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6284 - mae: 0.6017 - val_loss: 0.6940 - val_mae: 0.6762
Epoch 206/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6235 - mae: 0.5991 - val_loss: 0.6864 - val_mae: 0.6741
Epoch 207/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6188 - mae: 0.5957 - val_loss: 0.6796 - val_mae: 0.6723
Epoch 208/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6166 - mae: 0.5936 - val_loss: 0.6779 - val_mae: 0.6720
Epoch 209/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6138 - mae: 0.5920 - val_loss: 0.6726 - val_mae: 0.6696
Epoch 210/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6096 - mae: 0.5887 - val_loss: 0.6757 - val_mae: 0.6695
Epoch 211/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6074 - mae: 0.5898 - val_loss: 0.6790 - val_mae: 0.6701
Epoch 212/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6055 - mae: 0.5863 - val_loss: 0.6695 - val_mae: 0.6659
Epoch 213/512
11/11 [==============================] - 0s 3ms/step - loss: 0.6019 - mae: 0.5854 - val_loss: 0.6683 - val_mae: 0.6653
Epoch 214/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5983 - mae: 0.5848 - val_loss: 0.6716 - val_mae: 0.6671
Epoch 215/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5961 - mae: 0.5840 - val_loss: 0.6731 - val_mae: 0.6668
Epoch 216/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5949 - mae: 0.5807 - val_loss: 0.6578 - val_mae: 0.6613
Epoch 217/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5913 - mae: 0.5794 - val_loss: 0.6719 - val_mae: 0.6676
Epoch 218/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5877 - mae: 0.5783 - val_loss: 0.6654 - val_mae: 0.6653
Epoch 219/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5843 - mae: 0.5753 - val_loss: 0.6561 - val_mae: 0.6606
Epoch 220/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5812 - mae: 0.5734 - val_loss: 0.6560 - val_mae: 0.6600
Epoch 221/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5786 - mae: 0.5727 - val_loss: 0.6593 - val_mae: 0.6612
Epoch 222/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5775 - mae: 0.5723 - val_loss: 0.6509 - val_mae: 0.6579
Epoch 223/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5748 - mae: 0.5720 - val_loss: 0.6574 - val_mae: 0.6609
Epoch 224/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5717 - mae: 0.5709 - val_loss: 0.6589 - val_mae: 0.6625
Epoch 225/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5689 - mae: 0.5671 - val_loss: 0.6504 - val_mae: 0.6590
Epoch 226/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5667 - mae: 0.5645 - val_loss: 0.6489 - val_mae: 0.6578
Epoch 227/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5650 - mae: 0.5643 - val_loss: 0.6484 - val_mae: 0.6570
Epoch 228/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5614 - mae: 0.5635 - val_loss: 0.6490 - val_mae: 0.6585
Epoch 229/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5594 - mae: 0.5613 - val_loss: 0.6478 - val_mae: 0.6589
Epoch 230/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5566 - mae: 0.5592 - val_loss: 0.6467 - val_mae: 0.6581
Epoch 231/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5550 - mae: 0.5579 - val_loss: 0.6413 - val_mae: 0.6556
Epoch 232/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5520 - mae: 0.5569 - val_loss: 0.6405 - val_mae: 0.6550
Epoch 233/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5508 - mae: 0.5560 - val_loss: 0.6418 - val_mae: 0.6557
Epoch 234/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5476 - mae: 0.5541 - val_loss: 0.6405 - val_mae: 0.6550
Epoch 235/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5454 - mae: 0.5533 - val_loss: 0.6385 - val_mae: 0.6546
Epoch 236/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5436 - mae: 0.5521 - val_loss: 0.6348 - val_mae: 0.6524
Epoch 237/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5416 - mae: 0.5509 - val_loss: 0.6366 - val_mae: 0.6541
Epoch 238/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5446 - mae: 0.5503 - val_loss: 0.6356 - val_mae: 0.6543
Epoch 239/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5373 - mae: 0.5482 - val_loss: 0.6356 - val_mae: 0.6541
Epoch 240/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5385 - mae: 0.5510 - val_loss: 0.6368 - val_mae: 0.6543
Epoch 241/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5341 - mae: 0.5473 - val_loss: 0.6227 - val_mae: 0.6455
Epoch 242/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5316 - mae: 0.5435 - val_loss: 0.6284 - val_mae: 0.6500
Epoch 243/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5300 - mae: 0.5433 - val_loss: 0.6373 - val_mae: 0.6560
Epoch 244/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5275 - mae: 0.5424 - val_loss: 0.6305 - val_mae: 0.6522
Epoch 245/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5286 - mae: 0.5412 - val_loss: 0.6233 - val_mae: 0.6485
Epoch 246/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5237 - mae: 0.5393 - val_loss: 0.6268 - val_mae: 0.6503
Epoch 247/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5221 - mae: 0.5379 - val_loss: 0.6209 - val_mae: 0.6471
Epoch 248/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5231 - mae: 0.5392 - val_loss: 0.6232 - val_mae: 0.6482
Epoch 249/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5192 - mae: 0.5365 - val_loss: 0.6147 - val_mae: 0.6429
Epoch 250/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5163 - mae: 0.5343 - val_loss: 0.6185 - val_mae: 0.6461
Epoch 251/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5147 - mae: 0.5327 - val_loss: 0.6176 - val_mae: 0.6459
Epoch 252/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5139 - mae: 0.5324 - val_loss: 0.6194 - val_mae: 0.6474
Epoch 253/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5126 - mae: 0.5324 - val_loss: 0.6164 - val_mae: 0.6464
Epoch 254/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5105 - mae: 0.5316 - val_loss: 0.6129 - val_mae: 0.6441
Epoch 255/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5085 - mae: 0.5278 - val_loss: 0.6094 - val_mae: 0.6422
Epoch 256/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5060 - mae: 0.5263 - val_loss: 0.6135 - val_mae: 0.6451
Epoch 257/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5046 - mae: 0.5259 - val_loss: 0.6077 - val_mae: 0.6417
Epoch 258/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5047 - mae: 0.5265 - val_loss: 0.6131 - val_mae: 0.6452
Epoch 259/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5051 - mae: 0.5249 - val_loss: 0.6021 - val_mae: 0.6382
Epoch 260/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5005 - mae: 0.5233 - val_loss: 0.6183 - val_mae: 0.6501
Epoch 261/512
11/11 [==============================] - 0s 3ms/step - loss: 0.5004 - mae: 0.5241 - val_loss: 0.6054 - val_mae: 0.6417
Epoch 262/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4971 - mae: 0.5203 - val_loss: 0.6040 - val_mae: 0.6409
Epoch 263/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4959 - mae: 0.5206 - val_loss: 0.6044 - val_mae: 0.6406
Epoch 264/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4938 - mae: 0.5196 - val_loss: 0.6046 - val_mae: 0.6417
Epoch 265/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4930 - mae: 0.5180 - val_loss: 0.5979 - val_mae: 0.6376
Epoch 266/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4914 - mae: 0.5159 - val_loss: 0.6024 - val_mae: 0.6415
Epoch 267/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4886 - mae: 0.5160 - val_loss: 0.6087 - val_mae: 0.6452
Epoch 268/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4882 - mae: 0.5168 - val_loss: 0.6009 - val_mae: 0.6407
Epoch 269/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4861 - mae: 0.5148 - val_loss: 0.6006 - val_mae: 0.6410
Epoch 270/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4863 - mae: 0.5121 - val_loss: 0.5940 - val_mae: 0.6377
Epoch 271/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4848 - mae: 0.5098 - val_loss: 0.5935 - val_mae: 0.6388
Epoch 272/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4813 - mae: 0.5098 - val_loss: 0.6008 - val_mae: 0.6420
Epoch 273/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4808 - mae: 0.5106 - val_loss: 0.5995 - val_mae: 0.6419
Epoch 274/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4807 - mae: 0.5104 - val_loss: 0.5930 - val_mae: 0.6375
Epoch 275/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4780 - mae: 0.5094 - val_loss: 0.5947 - val_mae: 0.6389
Epoch 276/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4774 - mae: 0.5091 - val_loss: 0.5990 - val_mae: 0.6419
Epoch 277/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4758 - mae: 0.5061 - val_loss: 0.5926 - val_mae: 0.6387
Epoch 278/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4747 - mae: 0.5055 - val_loss: 0.5887 - val_mae: 0.6359
Epoch 279/512
11/11 [==============================] - 0s 4ms/step - loss: 0.4729 - mae: 0.5032 - val_loss: 0.5855 - val_mae: 0.6351
Epoch 280/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4722 - mae: 0.5032 - val_loss: 0.5926 - val_mae: 0.6394
Epoch 281/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4700 - mae: 0.5019 - val_loss: 0.5843 - val_mae: 0.6333
Epoch 282/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4701 - mae: 0.5034 - val_loss: 0.5899 - val_mae: 0.6375
Epoch 283/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4680 - mae: 0.5005 - val_loss: 0.5848 - val_mae: 0.6350
Epoch 284/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4689 - mae: 0.4988 - val_loss: 0.5809 - val_mae: 0.6321
Epoch 285/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4667 - mae: 0.4986 - val_loss: 0.5898 - val_mae: 0.6383
Epoch 286/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4643 - mae: 0.4997 - val_loss: 0.5874 - val_mae: 0.6362
Epoch 287/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4642 - mae: 0.4999 - val_loss: 0.5855 - val_mae: 0.6355
Epoch 288/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4611 - mae: 0.4958 - val_loss: 0.5795 - val_mae: 0.6322
Epoch 289/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4608 - mae: 0.4948 - val_loss: 0.5805 - val_mae: 0.6325
Epoch 290/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4591 - mae: 0.4951 - val_loss: 0.5861 - val_mae: 0.6367
Epoch 291/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4575 - mae: 0.4939 - val_loss: 0.5777 - val_mae: 0.6316
Epoch 292/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4583 - mae: 0.4919 - val_loss: 0.5791 - val_mae: 0.6332
Epoch 293/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4560 - mae: 0.4910 - val_loss: 0.5785 - val_mae: 0.6323
Epoch 294/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4544 - mae: 0.4921 - val_loss: 0.5796 - val_mae: 0.6341
Epoch 295/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4535 - mae: 0.4903 - val_loss: 0.5756 - val_mae: 0.6304
Epoch 296/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4526 - mae: 0.4903 - val_loss: 0.5782 - val_mae: 0.6321
Epoch 297/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4504 - mae: 0.4889 - val_loss: 0.5745 - val_mae: 0.6301
Epoch 298/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4494 - mae: 0.4862 - val_loss: 0.5752 - val_mae: 0.6312
Epoch 299/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4495 - mae: 0.4857 - val_loss: 0.5737 - val_mae: 0.6302
Epoch 300/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4473 - mae: 0.4861 - val_loss: 0.5782 - val_mae: 0.6330
Epoch 301/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4475 - mae: 0.4872 - val_loss: 0.5691 - val_mae: 0.6266
Epoch 302/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4455 - mae: 0.4851 - val_loss: 0.5768 - val_mae: 0.6317
Epoch 303/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4440 - mae: 0.4837 - val_loss: 0.5707 - val_mae: 0.6278
Epoch 304/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4453 - mae: 0.4853 - val_loss: 0.5739 - val_mae: 0.6304
Epoch 305/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4416 - mae: 0.4825 - val_loss: 0.5691 - val_mae: 0.6264
Epoch 306/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4409 - mae: 0.4810 - val_loss: 0.5696 - val_mae: 0.6276
Epoch 307/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4407 - mae: 0.4809 - val_loss: 0.5732 - val_mae: 0.6305
Epoch 308/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4392 - mae: 0.4804 - val_loss: 0.5684 - val_mae: 0.6281
Epoch 309/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4383 - mae: 0.4771 - val_loss: 0.5648 - val_mae: 0.6255
Epoch 310/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4379 - mae: 0.4764 - val_loss: 0.5682 - val_mae: 0.6272
Epoch 311/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4391 - mae: 0.4815 - val_loss: 0.5735 - val_mae: 0.6308
Epoch 312/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4353 - mae: 0.4784 - val_loss: 0.5639 - val_mae: 0.6230
Epoch 313/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4350 - mae: 0.4754 - val_loss: 0.5621 - val_mae: 0.6236
Epoch 314/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4329 - mae: 0.4753 - val_loss: 0.5690 - val_mae: 0.6284
Epoch 315/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4323 - mae: 0.4742 - val_loss: 0.5639 - val_mae: 0.6253
Epoch 316/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4310 - mae: 0.4736 - val_loss: 0.5650 - val_mae: 0.6257
Epoch 317/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4313 - mae: 0.4751 - val_loss: 0.5613 - val_mae: 0.6218
Epoch 318/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4293 - mae: 0.4709 - val_loss: 0.5650 - val_mae: 0.6261
Epoch 319/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4275 - mae: 0.4729 - val_loss: 0.5656 - val_mae: 0.6262
Epoch 320/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4274 - mae: 0.4739 - val_loss: 0.5658 - val_mae: 0.6253
Epoch 321/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4257 - mae: 0.4715 - val_loss: 0.5601 - val_mae: 0.6204
Epoch 322/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4279 - mae: 0.4737 - val_loss: 0.5644 - val_mae: 0.6244
Epoch 323/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4246 - mae: 0.4690 - val_loss: 0.5560 - val_mae: 0.6188
Epoch 324/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4231 - mae: 0.4663 - val_loss: 0.5594 - val_mae: 0.6222
Epoch 325/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4220 - mae: 0.4675 - val_loss: 0.5590 - val_mae: 0.6214
Epoch 326/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4211 - mae: 0.4675 - val_loss: 0.5605 - val_mae: 0.6223
Epoch 327/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4206 - mae: 0.4674 - val_loss: 0.5574 - val_mae: 0.6190
Epoch 328/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4191 - mae: 0.4658 - val_loss: 0.5570 - val_mae: 0.6203
Epoch 329/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4193 - mae: 0.4660 - val_loss: 0.5587 - val_mae: 0.6223
Epoch 330/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4184 - mae: 0.4667 - val_loss: 0.5552 - val_mae: 0.6171
Epoch 331/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4186 - mae: 0.4653 - val_loss: 0.5606 - val_mae: 0.6208
Epoch 332/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4157 - mae: 0.4615 - val_loss: 0.5548 - val_mae: 0.6187
Epoch 333/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4150 - mae: 0.4608 - val_loss: 0.5588 - val_mae: 0.6219
Epoch 334/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4143 - mae: 0.4619 - val_loss: 0.5590 - val_mae: 0.6219
Epoch 335/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4139 - mae: 0.4623 - val_loss: 0.5543 - val_mae: 0.6180
Epoch 336/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4124 - mae: 0.4631 - val_loss: 0.5504 - val_mae: 0.6149
Epoch 337/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4117 - mae: 0.4613 - val_loss: 0.5545 - val_mae: 0.6174
Epoch 338/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4104 - mae: 0.4600 - val_loss: 0.5526 - val_mae: 0.6163
Epoch 339/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4100 - mae: 0.4593 - val_loss: 0.5530 - val_mae: 0.6160
Epoch 340/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4097 - mae: 0.4573 - val_loss: 0.5521 - val_mae: 0.6169
Epoch 341/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4082 - mae: 0.4579 - val_loss: 0.5516 - val_mae: 0.6165
Epoch 342/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4086 - mae: 0.4577 - val_loss: 0.5517 - val_mae: 0.6163
Epoch 343/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4071 - mae: 0.4566 - val_loss: 0.5529 - val_mae: 0.6178
Epoch 344/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4103 - mae: 0.4550 - val_loss: 0.5462 - val_mae: 0.6129
Epoch 345/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4032 - mae: 0.4538 - val_loss: 0.5510 - val_mae: 0.6159
Epoch 346/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4043 - mae: 0.4564 - val_loss: 0.5479 - val_mae: 0.6123
Epoch 347/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4027 - mae: 0.4552 - val_loss: 0.5510 - val_mae: 0.6127
Epoch 348/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4028 - mae: 0.4542 - val_loss: 0.5568 - val_mae: 0.6168
Epoch 349/512
11/11 [==============================] - 0s 3ms/step - loss: 0.4022 - mae: 0.4559 - val_loss: 0.5584 - val_mae: 0.6182
Epoch 350/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3999 - mae: 0.4541 - val_loss: 0.5521 - val_mae: 0.6138
Epoch 351/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3992 - mae: 0.4524 - val_loss: 0.5455 - val_mae: 0.6113
Epoch 352/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3996 - mae: 0.4522 - val_loss: 0.5458 - val_mae: 0.6098
Epoch 353/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3984 - mae: 0.4509 - val_loss: 0.5482 - val_mae: 0.6125
Epoch 354/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3967 - mae: 0.4502 - val_loss: 0.5464 - val_mae: 0.6110
Epoch 355/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3977 - mae: 0.4489 - val_loss: 0.5447 - val_mae: 0.6101
Epoch 356/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3965 - mae: 0.4500 - val_loss: 0.5556 - val_mae: 0.6176
Epoch 357/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3959 - mae: 0.4487 - val_loss: 0.5437 - val_mae: 0.6088
Epoch 358/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3941 - mae: 0.4471 - val_loss: 0.5449 - val_mae: 0.6089
Epoch 359/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3939 - mae: 0.4468 - val_loss: 0.5411 - val_mae: 0.6072
Epoch 360/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3918 - mae: 0.4462 - val_loss: 0.5471 - val_mae: 0.6103
Epoch 361/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3929 - mae: 0.4456 - val_loss: 0.5424 - val_mae: 0.6089
Epoch 362/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3915 - mae: 0.4465 - val_loss: 0.5461 - val_mae: 0.6084
Epoch 363/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3909 - mae: 0.4461 - val_loss: 0.5463 - val_mae: 0.6090
Epoch 364/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3893 - mae: 0.4461 - val_loss: 0.5484 - val_mae: 0.6098
Epoch 365/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3890 - mae: 0.4459 - val_loss: 0.5464 - val_mae: 0.6075
Epoch 366/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3881 - mae: 0.4441 - val_loss: 0.5421 - val_mae: 0.6057
Epoch 367/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3880 - mae: 0.4450 - val_loss: 0.5446 - val_mae: 0.6074
Epoch 368/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3876 - mae: 0.4421 - val_loss: 0.5398 - val_mae: 0.6047
Epoch 369/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3850 - mae: 0.4416 - val_loss: 0.5429 - val_mae: 0.6065
Epoch 370/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3842 - mae: 0.4428 - val_loss: 0.5404 - val_mae: 0.6044
Epoch 371/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3853 - mae: 0.4396 - val_loss: 0.5383 - val_mae: 0.6035
Epoch 372/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3829 - mae: 0.4389 - val_loss: 0.5395 - val_mae: 0.6038
Epoch 373/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3834 - mae: 0.4421 - val_loss: 0.5411 - val_mae: 0.6035
Epoch 374/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3818 - mae: 0.4407 - val_loss: 0.5375 - val_mae: 0.6018
Epoch 375/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3808 - mae: 0.4396 - val_loss: 0.5437 - val_mae: 0.6045
Epoch 376/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3805 - mae: 0.4411 - val_loss: 0.5447 - val_mae: 0.6044
Epoch 377/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3812 - mae: 0.4376 - val_loss: 0.5450 - val_mae: 0.6057
Epoch 378/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3785 - mae: 0.4382 - val_loss: 0.5457 - val_mae: 0.6070
Epoch 379/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3780 - mae: 0.4375 - val_loss: 0.5417 - val_mae: 0.6031
Epoch 380/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3793 - mae: 0.4356 - val_loss: 0.5367 - val_mae: 0.6010
Epoch 381/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3764 - mae: 0.4371 - val_loss: 0.5379 - val_mae: 0.6007
Epoch 382/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3773 - mae: 0.4351 - val_loss: 0.5384 - val_mae: 0.6017
Epoch 383/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3751 - mae: 0.4357 - val_loss: 0.5432 - val_mae: 0.6036
Epoch 384/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3737 - mae: 0.4361 - val_loss: 0.5357 - val_mae: 0.6000
Epoch 385/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3742 - mae: 0.4332 - val_loss: 0.5339 - val_mae: 0.5978
Epoch 386/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3734 - mae: 0.4347 - val_loss: 0.5371 - val_mae: 0.5991
Epoch 387/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3714 - mae: 0.4340 - val_loss: 0.5386 - val_mae: 0.6005
Epoch 388/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3711 - mae: 0.4323 - val_loss: 0.5382 - val_mae: 0.6008
Epoch 389/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3707 - mae: 0.4308 - val_loss: 0.5390 - val_mae: 0.6002
Epoch 390/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3705 - mae: 0.4306 - val_loss: 0.5405 - val_mae: 0.6014
Epoch 391/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3696 - mae: 0.4312 - val_loss: 0.5327 - val_mae: 0.5987
Epoch 392/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3681 - mae: 0.4292 - val_loss: 0.5347 - val_mae: 0.5980
Epoch 393/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3672 - mae: 0.4303 - val_loss: 0.5369 - val_mae: 0.5988
Epoch 394/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3672 - mae: 0.4291 - val_loss: 0.5388 - val_mae: 0.6010
Epoch 395/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3655 - mae: 0.4271 - val_loss: 0.5380 - val_mae: 0.5991
Epoch 396/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3655 - mae: 0.4295 - val_loss: 0.5318 - val_mae: 0.5970
Epoch 397/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3645 - mae: 0.4277 - val_loss: 0.5309 - val_mae: 0.5955
Epoch 398/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3651 - mae: 0.4269 - val_loss: 0.5317 - val_mae: 0.5974
Epoch 399/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3635 - mae: 0.4273 - val_loss: 0.5393 - val_mae: 0.5983
Epoch 400/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3638 - mae: 0.4303 - val_loss: 0.5353 - val_mae: 0.5964
Epoch 401/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3631 - mae: 0.4271 - val_loss: 0.5340 - val_mae: 0.5998
Epoch 402/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3621 - mae: 0.4259 - val_loss: 0.5390 - val_mae: 0.5990
Epoch 403/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3619 - mae: 0.4258 - val_loss: 0.5333 - val_mae: 0.5993
Epoch 404/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3603 - mae: 0.4250 - val_loss: 0.5358 - val_mae: 0.5973
Epoch 405/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3602 - mae: 0.4242 - val_loss: 0.5317 - val_mae: 0.5979
Epoch 406/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3590 - mae: 0.4245 - val_loss: 0.5317 - val_mae: 0.5954
Epoch 407/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3584 - mae: 0.4256 - val_loss: 0.5368 - val_mae: 0.5968
Epoch 408/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3572 - mae: 0.4229 - val_loss: 0.5354 - val_mae: 0.5995
Epoch 409/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3566 - mae: 0.4213 - val_loss: 0.5346 - val_mae: 0.5966
Epoch 410/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3561 - mae: 0.4242 - val_loss: 0.5369 - val_mae: 0.5976
Epoch 411/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3560 - mae: 0.4198 - val_loss: 0.5324 - val_mae: 0.6005
Epoch 412/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3545 - mae: 0.4193 - val_loss: 0.5374 - val_mae: 0.5977
Epoch 413/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3551 - mae: 0.4207 - val_loss: 0.5314 - val_mae: 0.5962
Epoch 414/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3542 - mae: 0.4191 - val_loss: 0.5284 - val_mae: 0.5973
Epoch 415/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3526 - mae: 0.4209 - val_loss: 0.5368 - val_mae: 0.5978
Epoch 416/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3513 - mae: 0.4205 - val_loss: 0.5342 - val_mae: 0.5944
Epoch 417/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3531 - mae: 0.4195 - val_loss: 0.5290 - val_mae: 0.5977
Epoch 418/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3503 - mae: 0.4182 - val_loss: 0.5338 - val_mae: 0.5946
Epoch 419/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3500 - mae: 0.4190 - val_loss: 0.5318 - val_mae: 0.5972
Epoch 420/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3493 - mae: 0.4199 - val_loss: 0.5313 - val_mae: 0.5946
Epoch 421/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3481 - mae: 0.4154 - val_loss: 0.5287 - val_mae: 0.5969
Epoch 422/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3509 - mae: 0.4140 - val_loss: 0.5363 - val_mae: 0.5995
Epoch 423/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3485 - mae: 0.4199 - val_loss: 0.5367 - val_mae: 0.5942
Epoch 424/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3473 - mae: 0.4174 - val_loss: 0.5313 - val_mae: 0.5989
Epoch 425/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3460 - mae: 0.4159 - val_loss: 0.5367 - val_mae: 0.5984
Epoch 426/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3447 - mae: 0.4162 - val_loss: 0.5351 - val_mae: 0.5971
Epoch 427/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3442 - mae: 0.4134 - val_loss: 0.5305 - val_mae: 0.5984
Epoch 428/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3430 - mae: 0.4126 - val_loss: 0.5317 - val_mae: 0.5941
Epoch 429/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3436 - mae: 0.4153 - val_loss: 0.5332 - val_mae: 0.5958
Epoch 430/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3439 - mae: 0.4120 - val_loss: 0.5282 - val_mae: 0.5986
Epoch 431/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3421 - mae: 0.4135 - val_loss: 0.5299 - val_mae: 0.5928
Epoch 432/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3418 - mae: 0.4140 - val_loss: 0.5315 - val_mae: 0.5964
Epoch 433/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3416 - mae: 0.4115 - val_loss: 0.5266 - val_mae: 0.5962
Epoch 434/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3399 - mae: 0.4114 - val_loss: 0.5349 - val_mae: 0.5971
Epoch 435/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3395 - mae: 0.4105 - val_loss: 0.5332 - val_mae: 0.5985
Epoch 436/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3404 - mae: 0.4079 - val_loss: 0.5287 - val_mae: 0.5970
Epoch 437/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3386 - mae: 0.4132 - val_loss: 0.5288 - val_mae: 0.5918
Epoch 438/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3387 - mae: 0.4093 - val_loss: 0.5256 - val_mae: 0.5943
Epoch 439/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3371 - mae: 0.4107 - val_loss: 0.5334 - val_mae: 0.5965
Epoch 440/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3368 - mae: 0.4080 - val_loss: 0.5319 - val_mae: 0.5961
Epoch 441/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3362 - mae: 0.4101 - val_loss: 0.5320 - val_mae: 0.5977
Epoch 442/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3349 - mae: 0.4058 - val_loss: 0.5325 - val_mae: 0.5969
Epoch 443/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3360 - mae: 0.4114 - val_loss: 0.5370 - val_mae: 0.5967
Epoch 444/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3334 - mae: 0.4078 - val_loss: 0.5293 - val_mae: 0.5976
Epoch 445/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3340 - mae: 0.4072 - val_loss: 0.5293 - val_mae: 0.5943
Epoch 446/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3327 - mae: 0.4038 - val_loss: 0.5285 - val_mae: 0.5990
Epoch 447/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3317 - mae: 0.4038 - val_loss: 0.5280 - val_mae: 0.5945
Epoch 448/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3318 - mae: 0.4062 - val_loss: 0.5323 - val_mae: 0.5933
Epoch 449/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3314 - mae: 0.4041 - val_loss: 0.5286 - val_mae: 0.5990
Epoch 450/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3304 - mae: 0.4009 - val_loss: 0.5291 - val_mae: 0.5967
Epoch 451/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3338 - mae: 0.4095 - val_loss: 0.5344 - val_mae: 0.5918
Epoch 452/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3291 - mae: 0.4068 - val_loss: 0.5270 - val_mae: 0.5955
Epoch 453/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3299 - mae: 0.4027 - val_loss: 0.5355 - val_mae: 0.5984
Epoch 454/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3320 - mae: 0.4009 - val_loss: 0.5329 - val_mae: 0.5993
Epoch 455/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3263 - mae: 0.4032 - val_loss: 0.5299 - val_mae: 0.5906
Epoch 456/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3285 - mae: 0.4079 - val_loss: 0.5268 - val_mae: 0.5905
Epoch 457/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3328 - mae: 0.4024 - val_loss: 0.5293 - val_mae: 0.5993
Epoch 458/512
11/11 [==============================] - 0s 3ms/step - loss: 0.3289 - mae: 0.4069 - val_loss: 0.5370 - val_mae: 0.5918
In [17]:
# Predictions on a few validation samples

n = 5    # number of samples to inspect
print("Actual - Predicted")
for i in range(n):
    # predict one sample at a time; expand_dims adds the batch dimension the model expects
    print(yval[i][0], '\t', model.predict(np.expand_dims(Xval[i], 0))[0][0])
Actual - Predicted
4.6 	 3.8444653
5.0 	 6.8193426
4.6 	 5.4087424
5.0 	 4.9384136
4.6 	 4.9501348
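The five spot checks above only sample the validation set; the whole set can be scored in one batched call. A minimal sketch, assuming the Xval/yval arrays and the trained model from the earlier cells (mean_absolute_error and r2_score come from sklearn.metrics):

# Score the full validation set in a single batched forward pass
# (sketch; assumes Xval, yval and the trained `model` defined above)
from sklearn.metrics import mean_absolute_error, r2_score

val_pred = model.predict(Xval).flatten()
print("Validation MAE:", mean_absolute_error(yval, val_pred))
print("Validation R2 :", r2_score(yval, val_pred))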

Plotting the metrics

In [18]:
def plot(history, variable1, variable2):
    # plot the training curve and its validation counterpart for one metric
    plt.plot(range(len(history[variable1])), history[variable1])
    plt.plot(range(len(history[variable2])), history[variable2])
    plt.legend([variable1, variable2])
    plt.title(variable1)
    plt.xlabel('epoch')
In [19]:
plot(history.history, "loss", 'val_loss')
In [20]:
plot(history.history, "mae", 'val_mae')
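A predicted-versus-actual scatter is another quick diagnostic for a regression model: the closer the points sit to the y = x line, the better the fit. A minimal sketch, again assuming Xval, yval and the trained model from the earlier cells:

# Scatter of predicted vs. actual consumption on the validation set
# (sketch; assumes Xval, yval and `model` from the earlier cells)
import matplotlib.pyplot as plt

val_pred = model.predict(Xval).flatten()
plt.scatter(yval, val_pred)
plt.plot([yval.min(), yval.max()], [yval.min(), yval.max()])   # y = x reference line
plt.xlabel('actual consumption')
plt.ylabel('predicted consumption')
plt.title('Validation set: predicted vs actual')
plt.show()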

deepC

In [21]:
model.save('fuel.h5')

!deepCC fuel.h5
[INFO]
Reading [keras model] 'fuel.h5'
[SUCCESS]
Saved 'fuel_deepC/fuel.onnx'
[INFO]
Reading [onnx model] 'fuel_deepC/fuel.onnx'
[INFO]
Model info:
  ir_vesion : 4
  doc       : 
[WARNING]
[ONNX]: terminal (input/output) dense_input's shape is less than 1. Changing it to 1.
[WARNING]
[ONNX]: terminal (input/output) dense_5's shape is less than 1. Changing it to 1.
WARN (GRAPH): found operator node with the same name (dense_5) as io node.
[INFO]
Running DNNC graph sanity check ...
[SUCCESS]
Passed sanity check.
[INFO]
Writing C++ file 'fuel_deepC/fuel.cpp'
[INFO]
deepSea model files are ready in 'fuel_deepC/' 
[RUNNING COMMAND]
g++ -std=c++11 -O3 -fno-rtti -fno-exceptions -I. -I/opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/include -isystem /opt/tljh/user/lib/python3.7/site-packages/deepC-0.13-py3.7-linux-x86_64.egg/deepC/packages/eigen-eigen-323c052e1731 "fuel_deepC/fuel.cpp" -D_AITS_MAIN -o "fuel_deepC/fuel.exe"
[RUNNING COMMAND]
size "fuel_deepC/fuel.exe"
   text	   data	    bss	    dec	    hex	filename
2861881	   2336	    760	2864977	 2bb751	fuel_deepC/fuel.exe
[SUCCESS]
Saved model as executable "fuel_deepC/fuel.exe"
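Before relying on the compiled executable, it can be worth checking that the saved fuel.h5 file round-trips through Keras unchanged, since that file is what deepCC consumed. A minimal sketch, assuming the in-memory model and Xval from the earlier cells:

# Reload the saved Keras model and confirm it reproduces the in-memory predictions
# (sketch; assumes `model` and Xval from the earlier cells)
import numpy as np
from keras.models import load_model

reloaded = load_model('fuel.h5')
assert np.allclose(model.predict(Xval), reloaded.predict(Xval)), "saved model differs"
print("fuel.h5 reproduces the in-memory model's predictions")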