Kernel dies when plotting CNN training history in Keras

pinkon5k · posted 2022-11-13 in Other

I was building a CNN with Keras. Everything worked fine until I tried to plot the model's training history.

# Train the CNN
history = model.fit_generator(train_generator, steps_per_epoch = 143, epochs = 20, validation_data = validation_generator, validation_steps = 18)

# Collect the loss and accuracy recorded during training
# (the metrics live in the History object's .history dict)
acc = history.history['acc']
val_acc = history.history['val_acc']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(1, len(acc) + 1)

With the data prepared for plotting, I wrote the following code:

import matplotlib.pyplot as plt

plt.plot(epochs, acc, 'bo', label = 'Training acc')
plt.plot(epochs, val_acc, 'b', label = 'Validation acc')
plt.title('Training and validation accuracy')
plt.legend()
plt.figure()
plt.plot(epochs, loss, 'bo', label='Training loss')
plt.plot(epochs, val_loss, 'b', label='Validation loss')
plt.title('Training and validation loss')
plt.legend()
plt.show()

Then the Jupyter notebook reported: "The kernel appears to have died. It will restart automatically."


ffvjumwh · #1

I have solved the problem: updating Python to 3.9 and TensorFlow to 2.10 made the kernel death go away.
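For reference, recent Keras releases record the metric under the key 'accuracy' rather than 'acc', so after upgrading it can help to look the key up dynamically. Below is a minimal, version-tolerant plotting sketch; the tiny random dataset and toy model are placeholders standing in for the question's generators and are not part of the original post.

import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf


def plot_history(history):
    """Plot accuracy and loss curves from a Keras History object."""
    metrics = history.history
    # Newer Keras uses 'accuracy'; older versions used 'acc'.
    acc_key = 'accuracy' if 'accuracy' in metrics else 'acc'
    acc = metrics[acc_key]
    val_acc = metrics['val_' + acc_key]
    loss = metrics['loss']
    val_loss = metrics['val_loss']
    epochs = range(1, len(acc) + 1)

    plt.plot(epochs, acc, 'bo', label='Training acc')
    plt.plot(epochs, val_acc, 'b', label='Validation acc')
    plt.title('Training and validation accuracy')
    plt.legend()

    plt.figure()
    plt.plot(epochs, loss, 'bo', label='Training loss')
    plt.plot(epochs, val_loss, 'b', label='Validation loss')
    plt.title('Training and validation loss')
    plt.legend()
    plt.show()


# Placeholder data and model, only to produce a History object for the demo.
x = np.random.rand(200, 32, 32, 3).astype('float32')
y = np.random.randint(0, 2, size=(200,))

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation='relu', input_shape=(32, 32, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

history = model.fit(x, y, epochs=3, validation_split=0.2, verbose=0)
plot_history(history)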
