Cainvas

Article category classification

Credit: AITS Cainvas Community

Photo by Mogilev Konstantin on Dribbble

What is this article talking about?

Too many documents, but what are they about? It is a tiring task to go through documents or files and categorise them.

In [1]:
import pandas as pd
import numpy as np
import nltk
import matplotlib.pyplot as plt
import re
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, multilabel_confusion_matrix, f1_score
import random
import tensorflow.keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Embedding, GlobalAveragePooling1D
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.losses import BinaryCrossentropy
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint
from wordcloud import WordCloud
# stopwords
nltk.download('stopwords')
[nltk_data] Downloading package stopwords to /home/jupyter-
[nltk_data]     Rodio346/nltk_data...
[nltk_data]   Package stopwords is already up-to-date!
Out[1]:
True

Dataset

There are two CSV files, train and test with article titles, abstracts and the subjects they talk about.

The articles can belong to more than 1 subject.

In [2]:
df = pd.read_csv('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/researchArticles.csv')
df
Out[2]:
ID TITLE ABSTRACT Computer Science Physics Mathematics Statistics Quantitative Biology Quantitative Finance
0 1 Reconstructing Subject-Specific Effect Maps Predictive models allow subject-specific inf... 1 0 0 0 0 0
1 2 Rotation Invariance Neural Network Rotation invariance and translation invarian... 1 0 0 0 0 0
2 3 Spherical polyharmonics and Poisson kernels fo... We introduce and develop the notion of spher... 0 0 1 0 0 0
3 4 A finite element approximation for the stochas... The stochastic Landau--Lifshitz--Gilbert (LL... 0 0 1 0 0 0
4 5 Comparative study of Discrete Wavelet Transfor... Fourier-transform infra-red (FTIR) spectra o... 1 0 0 1 0 0
... ... ... ... ... ... ... ... ... ...
20967 20968 Contemporary machine learning: a guide for pra... Machine learning is finding increasingly bro... 1 1 0 0 0 0
20968 20969 Uniform diamond coatings on WC-Co hard alloy c... Polycrystalline diamond coatings have been g... 0 1 0 0 0 0
20969 20970 Analysing Soccer Games with Clustering and Con... We present a new approach for identifying si... 1 0 0 0 0 0
20970 20971 On the Efficient Simulation of the Left-Tail o... The sum of Log-normal variates is encountere... 0 0 1 1 0 0
20971 20972 Why optional stopping is a problem for Bayesians Recently, optional stopping has been a subje... 0 0 1 1 0 0

20972 rows × 9 columns

In [3]:
# Columns in the dataset
df.columns
Out[3]:
Index(['ID', 'TITLE', 'ABSTRACT', 'Computer Science', 'Physics', 'Mathematics',
       'Statistics', 'Quantitative Biology', 'Quantitative Finance'],
      dtype='object')
In [4]:
# Defining the list of subjects
 
subjects = ['Computer Science', 'Physics', 'Mathematics', 'Statistics', 'Quantitative Biology', 'Quantitative Finance']
In [5]:
# Distribution of subject values
 
for subject in subjects:
    print(subject, '-', list(df[subject]).count(1))
    print()
Computer Science - 8594

Physics - 6013

Mathematics - 5618

Statistics - 5206

Quantitative Biology - 587

Quantitative Finance - 249

It is an unbalanced dataset.

Data preprocessing

In [6]:
# Remove URLs
def removeURL(sentence):
    regex = re.compile('http[s]?://\S+')
    return re.sub(regex, ' ', sentence)
In [7]:
# remove numbers, punctuation and any special characters (keep only alphabets)
def onlyAlphabets(sentence):
    regex = re.compile('[^a-zA-Z]')
    return re.sub(regex, ' ', sentence)
In [8]:
# Defining stopwords
stop = nltk.corpus.stopwords.words('english')
#stop.remove('not')
 
print(len(stop))
179
In [9]:
sno = nltk.stem.SnowballStemmer('english')    # Initializing stemmer
 
subject_words = [[], [], [], [], [], []]
 
all_text = []
 
#print(len(df))
 
for x in range(len(df)):
    #print(x)
    title = df['TITLE'].values[x]
    abstract = df['ABSTRACT'].values[x]
 
    s = df[subjects].values[x]  
    s_num = np.where(s == 1)[0]
 
    cleaned_text = []
 
    title = removeURL(title) 
    title = onlyAlphabets(title) 
    title = title.lower()
 
    abstract = removeURL(abstract) 
    abstract = onlyAlphabets(abstract) 
    abstract = abstract.lower()    
 
    for word in title.split():
        if word not in stop:
            stemmed = sno.stem(word)
            cleaned_text.append(stemmed)
 
            for si in s_num:
                subject_words[si].append(word)
 
    for word in abstract.split():
        if word not in stop:
            stemmed = sno.stem(word)
            cleaned_text.append(stemmed)  
 
            for si in s_num:
                subject_words[si].append(word)
 
    all_text.append(' '.join(cleaned_text))
 
#pick only required columns 
df = df[subjects]
 
# add as column in dataframe
df['Cleaned_text'] = all_text
/opt/tljh/user/lib/python3.7/site-packages/ipykernel_launcher.py:49: SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
In [10]:
df
Out[10]:
Computer Science Physics Mathematics Statistics Quantitative Biology Quantitative Finance Cleaned_text
0 1 0 0 0 0 0 reconstruct subject specif effect map predict ...
1 1 0 0 0 0 0 rotat invari neural network rotat invari trans...
2 0 0 1 0 0 0 spheric polyharmon poisson kernel polyharmon f...
3 0 0 1 0 0 0 finit element approxim stochast maxwel landau ...
4 1 0 0 1 0 0 compar studi discret wavelet transform wavelet...
... ... ... ... ... ... ... ...
20967 1 1 0 0 0 0 contemporari machin learn guid practition phys...
20968 0 1 0 0 0 0 uniform diamond coat wc co hard alloy cut inse...
20969 1 0 0 0 0 0 analys soccer game cluster conceptor present n...
20970 0 0 1 1 0 0 effici simul left tail sum correl log normal v...
20971 0 0 1 1 0 0 option stop problem bayesian recent option sto...

20972 rows × 7 columns

In [11]:
df.to_csv('cleaned.csv', index=False)

Visualization

In [12]:
plt.figure(figsize=(40,40))

for i in range(len(subjects)):
    ax = plt.subplot(len(subjects), 1, i + 1)
    plt.imshow(WordCloud().generate(' '.join(subject_words[i])))
    plt.title(subjects[i])
    plt.axis("off")

Data preprocessing continued...

In [13]:
df = pd.read_csv('cleaned.csv')
df
Out[13]:
Computer Science Physics Mathematics Statistics Quantitative Biology Quantitative Finance Cleaned_text
0 1 0 0 0 0 0 reconstruct subject specif effect map predict ...
1 1 0 0 0 0 0 rotat invari neural network rotat invari trans...
2 0 0 1 0 0 0 spheric polyharmon poisson kernel polyharmon f...
3 0 0 1 0 0 0 finit element approxim stochast maxwel landau ...
4 1 0 0 1 0 0 compar studi discret wavelet transform wavelet...
... ... ... ... ... ... ... ...
20967 1 1 0 0 0 0 contemporari machin learn guid practition phys...
20968 0 1 0 0 0 0 uniform diamond coat wc co hard alloy cut inse...
20969 1 0 0 0 0 0 analys soccer game cluster conceptor present n...
20970 0 0 1 1 0 0 effici simul left tail sum correl log normal v...
20971 0 0 1 1 0 0 option stop problem bayesian recent option sto...

20972 rows × 7 columns

In [14]:
# check for any null values 
df.count()
Out[14]:
Computer Science        20972
Physics                 20972
Mathematics             20972
Statistics              20972
Quantitative Biology    20972
Quantitative Finance    20972
Cleaned_text            20972
dtype: int64
In [15]:
df = df.dropna()
 
df.count()
Out[15]:
Computer Science        20972
Physics                 20972
Mathematics             20972
Statistics              20972
Quantitative Biology    20972
Quantitative Finance    20972
Cleaned_text            20972
dtype: int64
In [16]:
# Definfing output columns
 
y = np.array(df[subjects])
In [17]:
input = "Cleaned_text"
 
X = df[input]
X
Out[17]:
0        reconstruct subject specif effect map predict ...
1        rotat invari neural network rotat invari trans...
2        spheric polyharmon poisson kernel polyharmon f...
3        finit element approxim stochast maxwel landau ...
4        compar studi discret wavelet transform wavelet...
                               ...                        
20967    contemporari machin learn guid practition phys...
20968    uniform diamond coat wc co hard alloy cut inse...
20969    analys soccer game cluster conceptor present n...
20970    effici simul left tail sum correl log normal v...
20971    option stop problem bayesian recent option sto...
Name: Cleaned_text, Length: 20972, dtype: object
In [18]:
split = int(0.8*len(df))
 
Xtrain, Xtest = X[:split], X[split:]
ytrain, ytest = y[:split], y[split:]
 
print("Train set - ", Xtrain.shape[0])
print("Test set - ", Xtest.shape[0])
Train set -  16777
Test set -  4195
In [19]:
# Tokenization
vocab = 15000
 
tokenizer = Tokenizer(num_words = vocab, oov_token = '<UNK>')
tokenizer.fit_on_texts(Xtrain)
word_index = tokenizer.word_index
 
# Padding
mlen = 600
padding_type = 'post'
trunc_type = 'post'
 
Xtrain = tokenizer.texts_to_sequences(Xtrain)
Xtrain = pad_sequences(Xtrain, maxlen=mlen, padding=padding_type, truncating=trunc_type)
 
Xtest = tokenizer.texts_to_sequences(Xtest)
Xtest = pad_sequences(Xtest, maxlen=mlen, padding=padding_type, truncating=trunc_type)

The model

In [20]:
# Build and train neural network
embedding_dim = 32
 
model = Sequential([
    Embedding(vocab, embedding_dim, input_length = mlen),
    GlobalAveragePooling1D(),
    Dense(32, activation = 'relu'),
    Dense(len(subjects), activation = 'sigmoid')
])
 
cb = [ModelCheckpoint('articles.h5', monitor = 'val_accuracy', save_best_only = True)]
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding (Embedding)        (None, 600, 32)           480000    
_________________________________________________________________
global_average_pooling1d (Gl (None, 32)                0         
_________________________________________________________________
dense (Dense)                (None, 32)                1056      
_________________________________________________________________
dense_1 (Dense)              (None, 6)                 198       
=================================================================
Total params: 481,254
Trainable params: 481,254
Non-trainable params: 0
_________________________________________________________________
In [21]:
model.compile(optimizer = Adam(0.1), loss = BinaryCrossentropy(), metrics = 'Accuracy')
 
history = model.fit(Xtrain, ytrain, batch_size=64, epochs = 256, validation_data=(Xtest, ytest), callbacks = cb)
Epoch 1/256
263/263 [==============================] - 2s 7ms/step - loss: 0.2441 - accuracy: 9.9342e-06 - val_loss: 0.2191 - val_accuracy: 3.9730e-05
Epoch 2/256
263/263 [==============================] - 2s 7ms/step - loss: 0.1664 - accuracy: 5.2651e-04 - val_loss: 0.2079 - val_accuracy: 4.3703e-04
Epoch 3/256
263/263 [==============================] - 2s 7ms/step - loss: 0.1397 - accuracy: 0.0038 - val_loss: 0.2443 - val_accuracy: 0.0058
Epoch 4/256
263/263 [==============================] - 2s 7ms/step - loss: 0.1165 - accuracy: 0.0120 - val_loss: 0.2436 - val_accuracy: 0.0062
Epoch 5/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0981 - accuracy: 0.0164 - val_loss: 0.2800 - val_accuracy: 0.0182
Epoch 6/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0923 - accuracy: 0.0255 - val_loss: 0.2834 - val_accuracy: 0.0207
Epoch 7/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0810 - accuracy: 0.0340 - val_loss: 0.3430 - val_accuracy: 0.0232
Epoch 8/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0670 - accuracy: 0.0376 - val_loss: 0.4061 - val_accuracy: 0.0400
Epoch 9/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0648 - accuracy: 0.0497 - val_loss: 0.3634 - val_accuracy: 0.0346
Epoch 10/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0636 - accuracy: 0.0518 - val_loss: 0.4643 - val_accuracy: 0.0477
Epoch 11/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0573 - accuracy: 0.0591 - val_loss: 0.5192 - val_accuracy: 0.0533
Epoch 12/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0562 - accuracy: 0.0665 - val_loss: 0.4336 - val_accuracy: 0.0431
Epoch 13/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0557 - accuracy: 0.0627 - val_loss: 0.5229 - val_accuracy: 0.0578
Epoch 14/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0492 - accuracy: 0.0836 - val_loss: 0.5854 - val_accuracy: 0.0742
Epoch 15/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0399 - accuracy: 0.1010 - val_loss: 0.6250 - val_accuracy: 0.0901
Epoch 16/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0484 - accuracy: 0.1181 - val_loss: 0.6812 - val_accuracy: 0.1083
Epoch 17/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0426 - accuracy: 0.1329 - val_loss: 0.6301 - val_accuracy: 0.1060
Epoch 18/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0389 - accuracy: 0.1348 - val_loss: 0.7011 - val_accuracy: 0.1159
Epoch 19/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0345 - accuracy: 0.1321 - val_loss: 0.6560 - val_accuracy: 0.1064
Epoch 20/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0508 - accuracy: 0.1512 - val_loss: 0.6148 - val_accuracy: 0.1160
Epoch 21/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0382 - accuracy: 0.1379 - val_loss: 0.7266 - val_accuracy: 0.1285
Epoch 22/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0342 - accuracy: 0.1635 - val_loss: 0.7038 - val_accuracy: 0.1447
Epoch 23/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0401 - accuracy: 0.1656 - val_loss: 0.7682 - val_accuracy: 0.1460
Epoch 24/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0395 - accuracy: 0.1670 - val_loss: 0.7785 - val_accuracy: 0.1372
Epoch 25/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0331 - accuracy: 0.1715 - val_loss: 0.8851 - val_accuracy: 0.1442
Epoch 26/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0389 - accuracy: 0.1875 - val_loss: 0.8340 - val_accuracy: 0.1607
Epoch 27/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0303 - accuracy: 0.1910 - val_loss: 0.9511 - val_accuracy: 0.1610
Epoch 28/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0411 - accuracy: 0.1972 - val_loss: 1.0904 - val_accuracy: 0.2090
Epoch 29/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0507 - accuracy: 0.1916 - val_loss: 0.9424 - val_accuracy: 0.1555
Epoch 30/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0423 - accuracy: 0.1856 - val_loss: 0.8027 - val_accuracy: 0.1427
Epoch 31/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0265 - accuracy: 0.2085 - val_loss: 0.8646 - val_accuracy: 0.1644
Epoch 32/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0331 - accuracy: 0.2389 - val_loss: 1.0752 - val_accuracy: 0.2322
Epoch 33/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0357 - accuracy: 0.2283 - val_loss: 0.9063 - val_accuracy: 0.1724
Epoch 34/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0277 - accuracy: 0.2453 - val_loss: 1.0999 - val_accuracy: 0.2325
Epoch 35/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0258 - accuracy: 0.2637 - val_loss: 1.1805 - val_accuracy: 0.2253
Epoch 36/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0269 - accuracy: 0.2626 - val_loss: 1.1910 - val_accuracy: 0.2538
Epoch 37/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0560 - accuracy: 0.2549 - val_loss: 0.8845 - val_accuracy: 0.1483
Epoch 38/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0354 - accuracy: 0.2266 - val_loss: 1.1195 - val_accuracy: 0.2016
Epoch 39/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0309 - accuracy: 0.2383 - val_loss: 1.0901 - val_accuracy: 0.2102
Epoch 40/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0385 - accuracy: 0.2481 - val_loss: 1.1239 - val_accuracy: 0.2388
Epoch 41/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0265 - accuracy: 0.2722 - val_loss: 1.0720 - val_accuracy: 0.2230
Epoch 42/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0223 - accuracy: 0.2929 - val_loss: 1.1785 - val_accuracy: 0.2593
Epoch 43/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0418 - accuracy: 0.2765 - val_loss: 1.2445 - val_accuracy: 0.2313
Epoch 44/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0321 - accuracy: 0.2639 - val_loss: 1.2189 - val_accuracy: 0.2425
Epoch 45/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0452 - accuracy: 0.2612 - val_loss: 1.2671 - val_accuracy: 0.2350
Epoch 46/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0252 - accuracy: 0.2675 - val_loss: 0.9982 - val_accuracy: 0.1821
Epoch 47/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0198 - accuracy: 0.2733 - val_loss: 1.2325 - val_accuracy: 0.2221
Epoch 48/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0293 - accuracy: 0.2805 - val_loss: 1.3651 - val_accuracy: 0.2586
Epoch 49/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0311 - accuracy: 0.3029 - val_loss: 1.4255 - val_accuracy: 0.2840
Epoch 50/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0347 - accuracy: 0.2985 - val_loss: 1.2612 - val_accuracy: 0.2387
Epoch 51/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0341 - accuracy: 0.2844 - val_loss: 1.2445 - val_accuracy: 0.2533
Epoch 52/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0331 - accuracy: 0.3126 - val_loss: 1.3688 - val_accuracy: 0.2879
Epoch 53/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0297 - accuracy: 0.3289 - val_loss: 1.2322 - val_accuracy: 0.2348
Epoch 54/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0362 - accuracy: 0.3230 - val_loss: 1.4427 - val_accuracy: 0.2900
Epoch 55/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0311 - accuracy: 0.3515 - val_loss: 1.1514 - val_accuracy: 0.2130
Epoch 56/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0414 - accuracy: 0.3140 - val_loss: 1.1394 - val_accuracy: 0.2374
Epoch 57/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0308 - accuracy: 0.3371 - val_loss: 1.4124 - val_accuracy: 0.3008
Epoch 58/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0304 - accuracy: 0.3416 - val_loss: 1.3976 - val_accuracy: 0.3058
Epoch 59/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0259 - accuracy: 0.3663 - val_loss: 1.0681 - val_accuracy: 0.2415
Epoch 60/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0322 - accuracy: 0.3701 - val_loss: 1.5366 - val_accuracy: 0.3267
Epoch 61/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0368 - accuracy: 0.3803 - val_loss: 1.6787 - val_accuracy: 0.3378
Epoch 62/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0328 - accuracy: 0.3698 - val_loss: 1.7236 - val_accuracy: 0.3615
Epoch 63/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0228 - accuracy: 0.3802 - val_loss: 1.6446 - val_accuracy: 0.3385
Epoch 64/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0285 - accuracy: 0.3850 - val_loss: 1.4676 - val_accuracy: 0.2911
Epoch 65/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0375 - accuracy: 0.3638 - val_loss: 1.5397 - val_accuracy: 0.2923
Epoch 66/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0352 - accuracy: 0.3656 - val_loss: 1.6152 - val_accuracy: 0.3300
Epoch 67/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0375 - accuracy: 0.3942 - val_loss: 1.3311 - val_accuracy: 0.2757
Epoch 68/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0298 - accuracy: 0.4089 - val_loss: 1.5833 - val_accuracy: 0.3274
Epoch 69/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0289 - accuracy: 0.4199 - val_loss: 1.4721 - val_accuracy: 0.3128
Epoch 70/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0420 - accuracy: 0.3995 - val_loss: 1.4508 - val_accuracy: 0.3061
Epoch 71/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0221 - accuracy: 0.4054 - val_loss: 1.4164 - val_accuracy: 0.2671
Epoch 72/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0287 - accuracy: 0.4167 - val_loss: 1.7040 - val_accuracy: 0.3634
Epoch 73/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0259 - accuracy: 0.4340 - val_loss: 1.7503 - val_accuracy: 0.3374
Epoch 74/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0414 - accuracy: 0.4125 - val_loss: 1.3814 - val_accuracy: 0.2901
Epoch 75/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0342 - accuracy: 0.4085 - val_loss: 1.6033 - val_accuracy: 0.3285
Epoch 76/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0230 - accuracy: 0.4330 - val_loss: 1.7048 - val_accuracy: 0.3462
Epoch 77/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0186 - accuracy: 0.4521 - val_loss: 1.7367 - val_accuracy: 0.3592
Epoch 78/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0224 - accuracy: 0.4583 - val_loss: 1.7530 - val_accuracy: 0.3552
Epoch 79/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0557 - accuracy: 0.4325 - val_loss: 1.8281 - val_accuracy: 0.3919
Epoch 80/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0408 - accuracy: 0.4243 - val_loss: 1.6664 - val_accuracy: 0.3336
Epoch 81/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0275 - accuracy: 0.4370 - val_loss: 2.1527 - val_accuracy: 0.4494
Epoch 82/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0226 - accuracy: 0.4759 - val_loss: 1.9213 - val_accuracy: 0.4157
Epoch 83/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0327 - accuracy: 0.4764 - val_loss: 2.1304 - val_accuracy: 0.4462
Epoch 84/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0268 - accuracy: 0.4887 - val_loss: 2.7528 - val_accuracy: 0.4841
Epoch 85/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0494 - accuracy: 0.4329 - val_loss: 1.8969 - val_accuracy: 0.3725
Epoch 86/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0291 - accuracy: 0.4392 - val_loss: 2.0279 - val_accuracy: 0.4074
Epoch 87/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0332 - accuracy: 0.4791 - val_loss: 1.8207 - val_accuracy: 0.3613
Epoch 88/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0332 - accuracy: 0.4734 - val_loss: 1.9049 - val_accuracy: 0.3892
Epoch 89/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0336 - accuracy: 0.4686 - val_loss: 1.7929 - val_accuracy: 0.3576
Epoch 90/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0229 - accuracy: 0.4857 - val_loss: 2.2074 - val_accuracy: 0.4385
Epoch 91/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0184 - accuracy: 0.5017 - val_loss: 2.2730 - val_accuracy: 0.4083
Epoch 92/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0349 - accuracy: 0.5063 - val_loss: 2.0156 - val_accuracy: 0.4064
Epoch 93/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0402 - accuracy: 0.4824 - val_loss: 1.6876 - val_accuracy: 0.3410
Epoch 94/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0344 - accuracy: 0.4909 - val_loss: 2.3738 - val_accuracy: 0.4472
Epoch 95/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0266 - accuracy: 0.5072 - val_loss: 2.5280 - val_accuracy: 0.4879
Epoch 96/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0298 - accuracy: 0.4970 - val_loss: 2.2938 - val_accuracy: 0.4444
Epoch 97/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0273 - accuracy: 0.5047 - val_loss: 2.5565 - val_accuracy: 0.4487
Epoch 98/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0479 - accuracy: 0.4835 - val_loss: 2.3730 - val_accuracy: 0.4654
Epoch 99/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0212 - accuracy: 0.5162 - val_loss: 2.2304 - val_accuracy: 0.4420
Epoch 100/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0317 - accuracy: 0.5227 - val_loss: 2.2680 - val_accuracy: 0.4339
Epoch 101/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0272 - accuracy: 0.5098 - val_loss: 2.3836 - val_accuracy: 0.4043
Epoch 102/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0236 - accuracy: 0.5204 - val_loss: 2.4234 - val_accuracy: 0.4564
Epoch 103/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0263 - accuracy: 0.5457 - val_loss: 2.2321 - val_accuracy: 0.4474
Epoch 104/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0314 - accuracy: 0.5576 - val_loss: 2.4353 - val_accuracy: 0.4665
Epoch 105/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0297 - accuracy: 0.5511 - val_loss: 2.4804 - val_accuracy: 0.4894
Epoch 106/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0228 - accuracy: 0.5579 - val_loss: 2.2479 - val_accuracy: 0.4193
Epoch 107/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0351 - accuracy: 0.5508 - val_loss: 2.4211 - val_accuracy: 0.4225
Epoch 108/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0560 - accuracy: 0.5136 - val_loss: 2.4298 - val_accuracy: 0.4685
Epoch 109/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0385 - accuracy: 0.5278 - val_loss: 2.3529 - val_accuracy: 0.4858
Epoch 110/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0200 - accuracy: 0.5468 - val_loss: 2.4810 - val_accuracy: 0.4656
Epoch 111/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0258 - accuracy: 0.5540 - val_loss: 2.4442 - val_accuracy: 0.4580
Epoch 112/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0197 - accuracy: 0.5698 - val_loss: 2.6996 - val_accuracy: 0.4955
Epoch 113/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0320 - accuracy: 0.5628 - val_loss: 2.3796 - val_accuracy: 0.4557
Epoch 114/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0452 - accuracy: 0.5517 - val_loss: 3.0143 - val_accuracy: 0.4990
Epoch 115/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0412 - accuracy: 0.5466 - val_loss: 2.6245 - val_accuracy: 0.4700
Epoch 116/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0224 - accuracy: 0.5644 - val_loss: 2.9420 - val_accuracy: 0.4887
Epoch 117/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0375 - accuracy: 0.5555 - val_loss: 2.9750 - val_accuracy: 0.5218
Epoch 118/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0586 - accuracy: 0.4772 - val_loss: 2.5182 - val_accuracy: 0.4788
Epoch 119/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0254 - accuracy: 0.5373 - val_loss: 2.2205 - val_accuracy: 0.4413
Epoch 120/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0167 - accuracy: 0.5476 - val_loss: 2.4046 - val_accuracy: 0.4656
Epoch 121/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0184 - accuracy: 0.5703 - val_loss: 2.4864 - val_accuracy: 0.4747
Epoch 122/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0167 - accuracy: 0.5825 - val_loss: 2.7253 - val_accuracy: 0.4863
Epoch 123/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0501 - accuracy: 0.5619 - val_loss: 3.0250 - val_accuracy: 0.5348
Epoch 124/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0401 - accuracy: 0.6019 - val_loss: 2.7231 - val_accuracy: 0.4723
Epoch 125/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0372 - accuracy: 0.6104 - val_loss: 2.6782 - val_accuracy: 0.4444
Epoch 126/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0422 - accuracy: 0.5784 - val_loss: 2.6985 - val_accuracy: 0.4991
Epoch 127/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0204 - accuracy: 0.6071 - val_loss: 2.8472 - val_accuracy: 0.4994
Epoch 128/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0233 - accuracy: 0.6073 - val_loss: 2.9456 - val_accuracy: 0.5447
Epoch 129/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0249 - accuracy: 0.6030 - val_loss: 2.9692 - val_accuracy: 0.5189
Epoch 130/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0387 - accuracy: 0.6120 - val_loss: 2.5179 - val_accuracy: 0.4467
Epoch 131/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0232 - accuracy: 0.6099 - val_loss: 2.9641 - val_accuracy: 0.4991
Epoch 132/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0320 - accuracy: 0.5984 - val_loss: 3.1244 - val_accuracy: 0.5144
Epoch 133/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0331 - accuracy: 0.5742 - val_loss: 2.7676 - val_accuracy: 0.5001
Epoch 134/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0248 - accuracy: 0.5997 - val_loss: 2.7203 - val_accuracy: 0.4644
Epoch 135/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0275 - accuracy: 0.5692 - val_loss: 2.7448 - val_accuracy: 0.4456
Epoch 136/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0280 - accuracy: 0.5752 - val_loss: 2.9343 - val_accuracy: 0.4382
Epoch 137/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0353 - accuracy: 0.5659 - val_loss: 3.3181 - val_accuracy: 0.5359
Epoch 138/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0234 - accuracy: 0.5821 - val_loss: 2.8362 - val_accuracy: 0.4948
Epoch 139/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0313 - accuracy: 0.5886 - val_loss: 3.3265 - val_accuracy: 0.5481
Epoch 140/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0198 - accuracy: 0.6201 - val_loss: 3.2678 - val_accuracy: 0.5577
Epoch 141/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0218 - accuracy: 0.6335 - val_loss: 3.0668 - val_accuracy: 0.5210
Epoch 142/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0783 - accuracy: 0.5958 - val_loss: 3.2050 - val_accuracy: 0.5294
Epoch 143/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0372 - accuracy: 0.6276 - val_loss: 2.9648 - val_accuracy: 0.5302
Epoch 144/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0214 - accuracy: 0.6671 - val_loss: 3.8403 - val_accuracy: 0.5918
Epoch 145/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0289 - accuracy: 0.6424 - val_loss: 3.3880 - val_accuracy: 0.5248
Epoch 146/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0453 - accuracy: 0.5954 - val_loss: 3.1157 - val_accuracy: 0.5436
Epoch 147/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0195 - accuracy: 0.6418 - val_loss: 3.6744 - val_accuracy: 0.5766
Epoch 148/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0360 - accuracy: 0.6372 - val_loss: 2.9954 - val_accuracy: 0.4966
Epoch 149/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0234 - accuracy: 0.6519 - val_loss: 2.8569 - val_accuracy: 0.4804
Epoch 150/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0353 - accuracy: 0.6424 - val_loss: 2.4378 - val_accuracy: 0.4224
Epoch 151/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0403 - accuracy: 0.6297 - val_loss: 3.2850 - val_accuracy: 0.5227
Epoch 152/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0272 - accuracy: 0.6504 - val_loss: 3.4086 - val_accuracy: 0.5584
Epoch 153/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0230 - accuracy: 0.6546 - val_loss: 3.3946 - val_accuracy: 0.5482
Epoch 154/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0237 - accuracy: 0.6658 - val_loss: 3.8148 - val_accuracy: 0.6305
Epoch 155/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0355 - accuracy: 0.6514 - val_loss: 2.7468 - val_accuracy: 0.4830
Epoch 156/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0334 - accuracy: 0.6507 - val_loss: 2.6237 - val_accuracy: 0.4362
Epoch 157/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0486 - accuracy: 0.6316 - val_loss: 3.1262 - val_accuracy: 0.5644
Epoch 158/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0292 - accuracy: 0.6583 - val_loss: 3.5040 - val_accuracy: 0.5714
Epoch 159/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0215 - accuracy: 0.6692 - val_loss: 4.4156 - val_accuracy: 0.6346
Epoch 160/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0478 - accuracy: 0.6538 - val_loss: 4.0318 - val_accuracy: 0.6196
Epoch 161/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0334 - accuracy: 0.6717 - val_loss: 3.2889 - val_accuracy: 0.5603
Epoch 162/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0246 - accuracy: 0.7152 - val_loss: 3.8566 - val_accuracy: 0.5944
Epoch 163/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0408 - accuracy: 0.6887 - val_loss: 3.6271 - val_accuracy: 0.5115
Epoch 164/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0482 - accuracy: 0.6852 - val_loss: 3.8220 - val_accuracy: 0.6253
Epoch 165/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0534 - accuracy: 0.6879 - val_loss: 3.8893 - val_accuracy: 0.5872
Epoch 166/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0400 - accuracy: 0.6511 - val_loss: 3.5805 - val_accuracy: 0.5744
Epoch 167/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0227 - accuracy: 0.6927 - val_loss: 3.4195 - val_accuracy: 0.5574
Epoch 168/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0272 - accuracy: 0.6829 - val_loss: 3.4754 - val_accuracy: 0.5476
Epoch 169/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0284 - accuracy: 0.6845 - val_loss: 4.2753 - val_accuracy: 0.6397
Epoch 170/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0184 - accuracy: 0.7180 - val_loss: 4.3559 - val_accuracy: 0.6475
Epoch 171/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0272 - accuracy: 0.7188 - val_loss: 3.5238 - val_accuracy: 0.5727
Epoch 172/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0539 - accuracy: 0.6805 - val_loss: 3.8677 - val_accuracy: 0.6151
Epoch 173/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0402 - accuracy: 0.6980 - val_loss: 3.5307 - val_accuracy: 0.5595
Epoch 174/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0352 - accuracy: 0.7046 - val_loss: 3.8077 - val_accuracy: 0.6164
Epoch 175/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0320 - accuracy: 0.7257 - val_loss: 4.0856 - val_accuracy: 0.6293
Epoch 176/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0308 - accuracy: 0.7315 - val_loss: 4.5366 - val_accuracy: 0.6520
Epoch 177/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0366 - accuracy: 0.7131 - val_loss: 3.8589 - val_accuracy: 0.5561
Epoch 178/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0299 - accuracy: 0.7290 - val_loss: 3.6625 - val_accuracy: 0.5797
Epoch 179/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0202 - accuracy: 0.7178 - val_loss: 3.4339 - val_accuracy: 0.5159
Epoch 180/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0349 - accuracy: 0.6974 - val_loss: 3.9301 - val_accuracy: 0.6027
Epoch 181/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0478 - accuracy: 0.7053 - val_loss: 4.2494 - val_accuracy: 0.5927
Epoch 182/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0438 - accuracy: 0.6780 - val_loss: 3.7959 - val_accuracy: 0.5436
Epoch 183/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0410 - accuracy: 0.6785 - val_loss: 3.6808 - val_accuracy: 0.5407
Epoch 184/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0212 - accuracy: 0.7051 - val_loss: 4.3036 - val_accuracy: 0.6387
Epoch 185/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0301 - accuracy: 0.7036 - val_loss: 4.6420 - val_accuracy: 0.6305
Epoch 186/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0274 - accuracy: 0.7258 - val_loss: 4.3197 - val_accuracy: 0.6108
Epoch 187/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0255 - accuracy: 0.7005 - val_loss: 4.5309 - val_accuracy: 0.6199
Epoch 188/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0293 - accuracy: 0.7052 - val_loss: 4.6349 - val_accuracy: 0.6028
Epoch 189/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0237 - accuracy: 0.7123 - val_loss: 4.7337 - val_accuracy: 0.6241
Epoch 190/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0238 - accuracy: 0.7149 - val_loss: 3.9626 - val_accuracy: 0.5853
Epoch 191/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0595 - accuracy: 0.6707 - val_loss: 4.0032 - val_accuracy: 0.5829
Epoch 192/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0488 - accuracy: 0.6663 - val_loss: 4.2530 - val_accuracy: 0.5481
Epoch 193/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0339 - accuracy: 0.7093 - val_loss: 4.6677 - val_accuracy: 0.6368
Epoch 194/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0318 - accuracy: 0.7029 - val_loss: 3.5390 - val_accuracy: 0.5278
Epoch 195/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0245 - accuracy: 0.7230 - val_loss: 4.9649 - val_accuracy: 0.6602
Epoch 196/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0547 - accuracy: 0.6918 - val_loss: 5.0202 - val_accuracy: 0.6696
Epoch 197/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0326 - accuracy: 0.7162 - val_loss: 3.6844 - val_accuracy: 0.5611
Epoch 198/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0434 - accuracy: 0.7231 - val_loss: 4.3609 - val_accuracy: 0.6051
Epoch 199/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0265 - accuracy: 0.7586 - val_loss: 4.9151 - val_accuracy: 0.6507
Epoch 200/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0290 - accuracy: 0.7690 - val_loss: 5.0217 - val_accuracy: 0.6859
Epoch 201/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0297 - accuracy: 0.7479 - val_loss: 4.9659 - val_accuracy: 0.6410
Epoch 202/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0316 - accuracy: 0.7493 - val_loss: 5.0679 - val_accuracy: 0.6662
Epoch 203/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0393 - accuracy: 0.7515 - val_loss: 5.1795 - val_accuracy: 0.6443
Epoch 204/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0469 - accuracy: 0.7054 - val_loss: 4.3778 - val_accuracy: 0.5921
Epoch 205/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0704 - accuracy: 0.6998 - val_loss: 4.8686 - val_accuracy: 0.6408
Epoch 206/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0218 - accuracy: 0.7598 - val_loss: 4.3856 - val_accuracy: 0.5714
Epoch 207/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0466 - accuracy: 0.7439 - val_loss: 5.2094 - val_accuracy: 0.6825
Epoch 208/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0312 - accuracy: 0.7631 - val_loss: 5.6072 - val_accuracy: 0.6865
Epoch 209/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0185 - accuracy: 0.7852 - val_loss: 5.2408 - val_accuracy: 0.6890
Epoch 210/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0233 - accuracy: 0.7738 - val_loss: 4.8732 - val_accuracy: 0.6550
Epoch 211/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0248 - accuracy: 0.7767 - val_loss: 5.5956 - val_accuracy: 0.6943
Epoch 212/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0501 - accuracy: 0.7641 - val_loss: 4.6866 - val_accuracy: 0.6282
Epoch 213/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0382 - accuracy: 0.7649 - val_loss: 4.6439 - val_accuracy: 0.6522
Epoch 214/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0272 - accuracy: 0.7884 - val_loss: 4.5182 - val_accuracy: 0.6105
Epoch 215/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0321 - accuracy: 0.7625 - val_loss: 3.8475 - val_accuracy: 0.5405
Epoch 216/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0335 - accuracy: 0.7648 - val_loss: 5.6832 - val_accuracy: 0.6708
Epoch 217/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0461 - accuracy: 0.7602 - val_loss: 4.6695 - val_accuracy: 0.6100
Epoch 218/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0361 - accuracy: 0.7676 - val_loss: 6.1972 - val_accuracy: 0.7380
Epoch 219/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0600 - accuracy: 0.7583 - val_loss: 6.7353 - val_accuracy: 0.7031
Epoch 220/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0409 - accuracy: 0.8012 - val_loss: 5.5523 - val_accuracy: 0.6542
Epoch 221/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0352 - accuracy: 0.7831 - val_loss: 5.8413 - val_accuracy: 0.6886
Epoch 222/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0435 - accuracy: 0.7765 - val_loss: 5.5479 - val_accuracy: 0.6717
Epoch 223/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0438 - accuracy: 0.7864 - val_loss: 4.9042 - val_accuracy: 0.6115
Epoch 224/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0395 - accuracy: 0.7778 - val_loss: 4.8346 - val_accuracy: 0.6062
Epoch 225/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0306 - accuracy: 0.7741 - val_loss: 5.7630 - val_accuracy: 0.6949
Epoch 226/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0276 - accuracy: 0.8166 - val_loss: 5.0575 - val_accuracy: 0.6542
Epoch 227/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0270 - accuracy: 0.7914 - val_loss: 5.1482 - val_accuracy: 0.6615
Epoch 228/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0264 - accuracy: 0.8043 - val_loss: 7.3859 - val_accuracy: 0.7552
Epoch 229/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0789 - accuracy: 0.7609 - val_loss: 5.4473 - val_accuracy: 0.6343
Epoch 230/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0442 - accuracy: 0.7837 - val_loss: 6.9764 - val_accuracy: 0.6941
Epoch 231/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0271 - accuracy: 0.7985 - val_loss: 5.8816 - val_accuracy: 0.6686
Epoch 232/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0214 - accuracy: 0.7968 - val_loss: 4.8165 - val_accuracy: 0.6007
Epoch 233/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0392 - accuracy: 0.7722 - val_loss: 5.1646 - val_accuracy: 0.6228
Epoch 234/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0295 - accuracy: 0.7882 - val_loss: 6.2459 - val_accuracy: 0.6837
Epoch 235/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0317 - accuracy: 0.7897 - val_loss: 5.8430 - val_accuracy: 0.6872
Epoch 236/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0379 - accuracy: 0.7804 - val_loss: 5.1673 - val_accuracy: 0.6521
Epoch 237/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0401 - accuracy: 0.7654 - val_loss: 4.9300 - val_accuracy: 0.6607
Epoch 238/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0527 - accuracy: 0.7652 - val_loss: 4.0163 - val_accuracy: 0.5747
Epoch 239/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0414 - accuracy: 0.7985 - val_loss: 6.1678 - val_accuracy: 0.7002
Epoch 240/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0248 - accuracy: 0.8077 - val_loss: 6.3531 - val_accuracy: 0.6834
Epoch 241/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0285 - accuracy: 0.8169 - val_loss: 5.8537 - val_accuracy: 0.7008
Epoch 242/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0329 - accuracy: 0.7996 - val_loss: 6.3130 - val_accuracy: 0.7184
Epoch 243/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0369 - accuracy: 0.7845 - val_loss: 5.4371 - val_accuracy: 0.6580
Epoch 244/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0409 - accuracy: 0.7799 - val_loss: 5.1018 - val_accuracy: 0.6309
Epoch 245/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0427 - accuracy: 0.7790 - val_loss: 5.7485 - val_accuracy: 0.7080
Epoch 246/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0269 - accuracy: 0.8022 - val_loss: 6.2171 - val_accuracy: 0.7192
Epoch 247/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0344 - accuracy: 0.7995 - val_loss: 5.2885 - val_accuracy: 0.6756
Epoch 248/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0252 - accuracy: 0.7938 - val_loss: 6.0351 - val_accuracy: 0.6930
Epoch 249/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0215 - accuracy: 0.8096 - val_loss: 5.9067 - val_accuracy: 0.6957
Epoch 250/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0374 - accuracy: 0.7940 - val_loss: 5.5991 - val_accuracy: 0.6692
Epoch 251/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0574 - accuracy: 0.7698 - val_loss: 7.2554 - val_accuracy: 0.6950
Epoch 252/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0405 - accuracy: 0.7901 - val_loss: 5.5288 - val_accuracy: 0.6248
Epoch 253/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0636 - accuracy: 0.7783 - val_loss: 4.8827 - val_accuracy: 0.5941
Epoch 254/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0331 - accuracy: 0.7793 - val_loss: 5.6976 - val_accuracy: 0.6618
Epoch 255/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0293 - accuracy: 0.7960 - val_loss: 6.0817 - val_accuracy: 0.6669
Epoch 256/256
263/263 [==============================] - 2s 7ms/step - loss: 0.0238 - accuracy: 0.8049 - val_loss: 6.1198 - val_accuracy: 0.6796
In [22]:
model.load_weights('articles.h5')
In [23]:
ypred = model.predict(Xtest)
ypred = (ypred>0.5).astype('int')
acc = accuracy_score(ytest, ypred)
f1 = f1_score(ytest,ypred, average='samples')

print("Accuracy = ", acc)
print("F1 score = ", f1)
Accuracy =  0.6078665077473182
F1 score =  0.7652761223678982
In [24]:
cm = multilabel_confusion_matrix(ytest, ypred)

plt.figure(figsize=(40,40))

for k in range(cm.shape[0]):
    cmi = cm[k].astype('float') / cm[k].sum(axis=1)[:, np.newaxis]

    ax = plt.subplot(len(subjects), 1, k+1)

    for i in range(cmi.shape[1]):
        for j in range(cmi.shape[0]):
            plt.text(j, i, format(cmi[i, j], '.2f'), horizontalalignment="center", color="black")

    plt.title(subjects[k])
    plt.imshow(cmi, cmap=plt.cm.Blues)

The low level of true positives for quantitaive biology and quantitative finanace is due to the low number of training samples.

Plotting the metrics

In [25]:
def plot(history, variable, variable2):
    plt.plot(range(len(history[variable])), history[variable])
    plt.plot(range(len(history[variable2])), history[variable2])
    plt.legend([variable, variable2])
    plt.title(variable)
In [26]:
plot(history.history, "accuracy", 'val_accuracy')
In [27]:
plot(history.history, "loss", 'val_loss')

Prediction

In [28]:
df = pd.read_csv('https://cainvas-static.s3.amazonaws.com/media/user_data/cainvas-admin/researchArticles.csv')
In [29]:
x = np.random.randint(0, Xtest.shape[0] - 1)

title = df['TITLE'].values[x]
abstract = df['ABSTRACT'].values[x]

print("Title: ", title)
print("\nAbstract: ", abstract)

cleaned_text = []

title = removeURL(title) 
title = onlyAlphabets(title) 
title = title.lower()

abstract = removeURL(abstract) 
abstract = onlyAlphabets(abstract) 
abstract = abstract.lower()    

for word in title.split():
    if word not in stop:
        stemmed = sno.stem(word)
        cleaned_text.append(stemmed)

for word in abstract.split():
    if word not in stop:
        stemmed = sno.stem(word)
        cleaned_text.append(stemmed)  

cleaned_text = [' '.join(cleaned_text)]

print("Cleaned text: ", cleaned_text[0])

cleaned_text = tokenizer.texts_to_sequences(cleaned_text)
cleaned_text = pad_sequences(cleaned_text, maxlen=mlen, padding=padding_type, truncating=trunc_type)

s = df[subjects].values[x]  
s_num = np.where(s == 1)[0]

print("\nTrue subjects: ")
for si in s_num:
    print(subjects[si])

pred = model.predict(cleaned_text)[0]
predn = (pred>0.5).astype('int')
s_num = np.where(predn == 1)[0]

print("\nPredicted subjects: ")
for si in s_num:
    print(subjects[si], '(', pred[si], ')')
Title:  Dissolution of topological Fermi arcs in a dirty Weyl semimetal

Abstract:    Weyl semimetals (WSMs) have recently attracted a great deal of attention as
they provide condensed matter realization of chiral anomaly, feature
topologically protected Fermi arc surface states and sustain sharp chiral Weyl
quasiparticles up to a critical disorder at which a continuous quantum phase
transition (QPT) drives the system into a metallic phase. We here numerically
demonstrate that with increasing strength of disorder the Fermi arc gradually
looses its sharpness, and close to the WSM-metal QPT it completely dissolves
into the metallic bath of the bulk. Predicted topological nature of the
WSM-metal QPT and the resulting bulk-boundary correspondence across this
transition can directly be observed in
angle-resolved-photo-emmision-spectroscopy (ARPES) and Fourier transformed
scanning-tunneling-microscopy (STM) measurements by following the continuous
deformation of the Fermi arcs with increasing disorder in recently discovered
Weyl materials.

Cleaned text:  dissolut topolog fermi arc dirti weyl semimet weyl semimet wsms recent attract great deal attent provid condens matter realize chiral anomali featur topolog protect fermi arc surfac state sustain sharp chiral weyl quasiparticl critic disord continu quantum phase transit qpt drive system metal phase numer demonstr increas strength disord fermi arc gradual loos sharp close wsm metal qpt complet dissolv metal bath bulk predict topolog natur wsm metal qpt result bulk boundari correspond across transit direct observ angl resolv photo emmis spectroscopi arp fourier transform scan tunnel microscopi stm measur follow continu deform fermi arc increas disord recent discov weyl materi

True subjects: 
Physics

Predicted subjects: 
Physics ( 1.0 )

deepC

In [31]:
#!deepCC articles.h5