
During training of the convolutional neural network, the training/validation accuracy and loss are printed after each epoch, as shown below:

```
Epoch 1/100
691/691 [==============================] - 2174s 3s/step - loss: 0.6473 - acc: 0.6257 - val_loss: 0.5394 - val_acc: 0.8258
Epoch 2/100
691/691 [==============================] - 2145s 3s/step - loss: 0.5364 - acc: 0.7692 - val_loss: 0.4283 - val_acc: 0.8675
Epoch 3/100
691/691 [==============================] - 2124s 3s/step - loss: 0.4341 - acc: 0.8423 - val_loss: 0.3381 - val_acc: 0.9024
Epoch 4/100
691/691 [==============================] - 2126s 3s/step - loss: 0.3467 - acc: 0.8880 - val_loss: 0.2643 - val_acc: 0.9267
Epoch 5/100
691/691 [==============================] - 2123s 3s/step - loss: 0.2769 - acc: 0.9202 - val_loss: 0.2077 - val_acc: 0.9455
Epoch 6/100
691/691 [==============================] - 2118s 3s/step - loss: 0.2207 - acc: 0.9431 - val_loss: 0.1654 - val_acc: 0.9575
Epoch 7/100
691/691 [==============================] - 2125s 3s/step - loss: 0.1789 - acc: 0.9562 - val_loss: 0.1348 - val_acc: 0.9663
Epoch 8/100
691/691 [==============================] - 2120s 3s/step - loss: 0.1472 - acc: 0.9655 - val_loss: 0.1117 - val_acc: 0.9719
Epoch 9/100
691/691 [==============================] - 2119s 3s/step - loss: 0.1220 - acc: 0.9728 - val_loss: 0.0956 - val_acc: 0.9746
Epoch 10/100
691/691 [==============================] - 2119s 3s/step - loss: 0.1037 - acc: 0.9774 - val_loss: 0.0828 - val_acc: 0.9781
Epoch 11/100
691/691 [==============================] - 2110s 3s/step - loss: 0.0899 - acc: 0.9806 - val_loss: 0.0747 - val_acc: 0.9793
Epoch 12/100
691/691 [==============================] - 2123s 3s/step - loss: 0.0785 - acc: 0.9835 - val_loss: 0.0651 - val_acc: 0.9825
Epoch 13/100
691/691 [==============================] - 2130s 3s/step - loss: 0.0689 - acc: 0.9860 - val_loss: 0.0557 - val_acc: 0.9857
Epoch 14/100
691/691 [==============================] - 2124s 3s/step - loss: 0.0618 - acc: 0.9874 - val_loss: 0.0509 - val_acc: 0.9869
Epoch 15/100
691/691 [==============================] - 2122s 3s/step - loss: 0.0555 - acc: 0.9891 - val_loss: 0.0467 - val_acc: 0.9876
Epoch 16/100
152/691 [=====>........................] - ETA: 22:10 - loss: 0.0515 - acc: 0.9892
```

My plan was to get the history variable and plot the accuracy/loss as follows:

```python
history = model.fit_generator( .... )
plt.plot(history.history["acc"])
...
```

But my training stopped partway through due to a hardware issue, so the graphs were never plotted. I do, however, have the log of the 15 completed epochs shown above. Can I plot the accuracy/loss graphs from that log?

  • You should be able to, but if I understand correctly you do not have the accuracy and loss saved to a variable, so you would have to take them from the log you have shown manually. – Commented Feb 1, 2020 at 13:01
  • @Oxbowerce I assumed the same approach as you described! I will share the script that reads the log and plots the curves; it might be helpful for others. – Commented Feb 2, 2020 at 12:45

3 Answers


I think this is covered in the Keras documentation: https://keras.io/callbacks/#create-a-callback

```python
class LossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.losses = []

    def on_batch_end(self, batch, logs={}):
        self.losses.append(logs.get('loss'))


model = Sequential()
model.add(Dense(10, input_dim=784, kernel_initializer='uniform'))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

history = LossHistory()
model.fit(x_train, y_train, batch_size=128, epochs=20,
          verbose=0, callbacks=[history])

print(history.losses)
# outputs
# [0.66047596406559383, 0.3547245744908703, ..., 0.25953155204159617, 0.25901699725311789]
```
  • You are right, but since the training was disconnected I lost all the variables. Hence I think I need to write a Python script that collects the losses and accuracies from the log above manually and plots the graph, as Oxbowerce suggested. – Commented Feb 2, 2020 at 12:43
  • Let us know how that works. Cheers. – Commented Feb 2, 2020 at 22:17

I ended up writing a custom log parser. It is sometimes simpler to run than setting up statistics saving for TensorBoard. I also inserted printing of the losses at higher precision and parsed that as well. It is fast and quite convenient to run in a parallel Jupyter notebook.

```python
import re

import matplotlib.pyplot as plt
import pandas as pd


def getLogAsTable(srcFilePath):
    """Parse per-epoch metrics out of a Keras console log into a DataFrame."""
    table = []
    fieldNames = ['epochNum', 'trainLoss', 'trainAcc', 'valLoss', 'valAcc']
    with open(srcFilePath, 'r') as file:
        epochNum = 0
        for line in file:
            # Parsing "- 9s - loss: 9.9986e-04 - acc: 0.0000e+00 - val_loss: 9.9930e-04 - val_acc: 0.0000e+00"
            match = re.match(r'\s*- .+?s - loss: (\d.*?) - acc: (\d.*?)'
                             r' - val_loss: (\d.*?) - val_acc: (\d.*)', line)
            if match:
                epochNum += 1
                row = [epochNum] + [float(valStr) for valStr in match.groups()]
                if len(row) != len(fieldNames):
                    raise Exception('Value count mismatch (%s)' % line)
                table.append(row)
    return pd.DataFrame(table, columns=fieldNames)


if __name__ == '__main__':
    logTable = getLogAsTable('log.txt')
    xs = logTable['epochNum']
    ys = logTable['trainLoss']
    plt.plot(xs, ys)
    plt.show()
```
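As a quick sanity check, the parsing regex can be exercised on a single line in the format its comment shows (the sample values below are taken from that comment, so this is just an illustration of the pattern, not part of the parser itself):

```python
import re

# One epoch line in the "- 9s - loss: ... - val_acc: ..." format the parser expects.
line = ("- 9s - loss: 9.9986e-04 - acc: 0.0000e+00 "
        "- val_loss: 9.9930e-04 - val_acc: 0.0000e+00")

# Same pattern as in getLogAsTable: capture train loss/acc and val loss/acc.
match = re.match(r'\s*- .+?s - loss: (\d.*?) - acc: (\d.*?)'
                 r' - val_loss: (\d.*?) - val_acc: (\d.*)', line)

values = [float(v) for v in match.groups()]
print(values)  # [0.00099986, 0.0, 0.0009993, 0.0]
```

Note that `float()` handles the scientific-notation values (`9.9986e-04`) directly, so no special-casing is needed for the higher-precision log lines.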

Replace the lists below with the actual accuracy and loss values from your training log.

```python
import matplotlib.pyplot as plt

train_accuracy = [0.6257, 0.7692, 0.8423, 0.8880, 0.9202, 0.9431, 0.9562,
                  0.9655, 0.9728, 0.9774, 0.9806, 0.9835, 0.9860, 0.9874, 0.9891]
val_accuracy = [0.8258, 0.8675, 0.9024, 0.9267, 0.9455, 0.9575, 0.9663,
                0.9719, 0.9746, 0.9781, 0.9793, 0.9825, 0.9857, 0.9869, 0.9876]
train_loss = [0.6473, 0.5364, 0.4341, 0.3467, 0.2769, 0.2207, 0.1789,
              0.1472, 0.1220, 0.1037, 0.0899, 0.0785, 0.0689, 0.0618, 0.0555]
val_loss = [0.5394, 0.4283, 0.3381, 0.2643, 0.2077, 0.1654, 0.1348,
            0.1117, 0.0956, 0.0828, 0.0747, 0.0651, 0.0557, 0.0509, 0.0467]

epochs = range(1, 16)  # Assuming 15 epochs

plt.figure(figsize=(12, 4))

plt.subplot(1, 2, 1)
plt.plot(epochs, train_accuracy, label='Training Accuracy')
plt.plot(epochs, val_accuracy, label='Validation Accuracy')
plt.title('Training and Validation Accuracy')
plt.xlabel('Epochs')
plt.ylabel('Accuracy')
plt.legend()

plt.subplot(1, 2, 2)
plt.plot(epochs, train_loss, label='Training Loss')
plt.plot(epochs, val_loss, label='Validation Loss')
plt.title('Training and Validation Loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend()

plt.tight_layout()
plt.show()
```

Using this, you should be able to plot the accuracy/loss graphs from just the epochs you have.
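If retyping all the values by hand feels error-prone, the same lists can be pulled straight out of the pasted log with `re.findall`. A minimal sketch, assuming the raw log text is available as a string (a two-epoch excerpt of the question's log is used here for illustration):

```python
import re

# A two-epoch excerpt of the log pasted in the question.
log_text = (
    "Epoch 1/100 691/691 [==============================] - 2174s 3s/step - "
    "loss: 0.6473 - acc: 0.6257 - val_loss: 0.5394 - val_acc: 0.8258 "
    "Epoch 2/100 691/691 [==============================] - 2145s 3s/step - "
    "loss: 0.5364 - acc: 0.7692 - val_loss: 0.4283 - val_acc: 0.8675"
)

# findall returns one (loss, acc, val_loss, val_acc) tuple per epoch.
matches = re.findall(
    r'loss: ([\d.]+) - acc: ([\d.]+) - val_loss: ([\d.]+) - val_acc: ([\d.]+)',
    log_text)

train_loss = [float(m[0]) for m in matches]
val_acc = [float(m[3]) for m in matches]
print(train_loss)  # [0.6473, 0.5364]
print(val_acc)     # [0.8258, 0.8675]
```

The resulting lists can then be passed to the plotting code above in place of the hand-typed ones.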
