Keras CNN model with an incorrect ROC curve and low accuracy

Time: 2019-07-06 13:50:34

Tags: python keras neural-network conv-neural-network

I am learning to write a CNN with Keras on Kaggle, using one of the datasets found there.

The link to my notebook is:

https://www.kaggle.com/vj6978/brain-tumor-vimal?scriptVersionId=16814133

The code, the dataset, and the ROC curve are all available at the link. The ROC curve itself looks as if the model is simply guessing rather than making learned predictions.

The test accuracy also seems to peak at only around 60% to 70%, which is quite poor. Any help would be appreciated.

Thanks, Vimal James

2 answers:

Answer 0: (score 0)

I believe your last activation should be sigmoid instead of softmax.

Update:

I just forked your kernel on Kaggle and modified it as follows, which gives better results:

# Imports assumed to come from standalone Keras; the tensorflow.keras equivalents work the same way.
from keras.models import Sequential
from keras.layers import Conv2D, Activation, AveragePooling2D, Flatten, Dense

model = Sequential()
model.add(Conv2D(128, (3,3), input_shape = data_set.shape[1:]))
model.add(Activation("relu"))
model.add(AveragePooling2D(pool_size = (2,2)))

model.add(Conv2D(128, (3,3)))
model.add(Activation("relu"))
model.add(AveragePooling2D(pool_size = (2,2)))

model.add(Flatten())
model.add(Dense(64))

model.add(Dense(1))
model.add(Activation("sigmoid")) # Last activation should be sigmoid for binary classification

model.compile(optimizer = "adam", loss = "binary_crossentropy", metrics = ['accuracy'])

This yields the following results:

Train on 204 samples, validate on 23 samples
Epoch 1/15
204/204 [==============================] - 2s 11ms/step - loss: 2.8873 - acc: 0.6373 - val_loss: 0.8000 - val_acc: 0.8261
Epoch 2/15
204/204 [==============================] - 1s 3ms/step - loss: 0.7292 - acc: 0.7206 - val_loss: 0.6363 - val_acc: 0.7391
Epoch 3/15
204/204 [==============================] - 1s 3ms/step - loss: 0.4731 - acc: 0.8088 - val_loss: 0.5417 - val_acc: 0.8261
Epoch 4/15
204/204 [==============================] - 1s 3ms/step - loss: 0.3605 - acc: 0.8775 - val_loss: 0.6820 - val_acc: 0.8696
Epoch 5/15
204/204 [==============================] - 1s 3ms/step - loss: 0.2986 - acc: 0.8529 - val_loss: 0.8356 - val_acc: 0.8696
Epoch 6/15
204/204 [==============================] - 1s 3ms/step - loss: 0.2151 - acc: 0.9020 - val_loss: 0.7592 - val_acc: 0.8696
Epoch 7/15
204/204 [==============================] - 1s 3ms/step - loss: 0.1305 - acc: 0.9657 - val_loss: 1.2486 - val_acc: 0.8696
Epoch 8/15
204/204 [==============================] - 1s 3ms/step - loss: 0.0565 - acc: 0.9853 - val_loss: 1.2668 - val_acc: 0.8696
Epoch 9/15
204/204 [==============================] - 1s 3ms/step - loss: 0.0426 - acc: 0.9853 - val_loss: 1.4674 - val_acc: 0.8696
Epoch 10/15
204/204 [==============================] - 1s 3ms/step - loss: 0.0141 - acc: 1.0000 - val_loss: 1.7379 - val_acc: 0.8696
Epoch 11/15
204/204 [==============================] - 1s 3ms/step - loss: 0.0063 - acc: 1.0000 - val_loss: 1.7232 - val_acc: 0.8696
Epoch 12/15
204/204 [==============================] - 1s 3ms/step - loss: 0.0023 - acc: 1.0000 - val_loss: 1.8291 - val_acc: 0.8696
Epoch 13/15
204/204 [==============================] - 1s 3ms/step - loss: 0.0014 - acc: 1.0000 - val_loss: 1.9164 - val_acc: 0.8696
Epoch 14/15
204/204 [==============================] - 1s 3ms/step - loss: 8.6263e-04 - acc: 1.0000 - val_loss: 1.8946 - val_acc: 0.8696
Epoch 15/15
204/204 [==============================] - 1s 3ms/step - loss: 6.8785e-04 - acc: 1.0000 - val_loss: 1.9596 - val_acc: 0.8696
Test loss: 3.079359292984009
Test accuracy: 0.807692289352417
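
For reference, once the final layer uses a sigmoid output, the ROC curve can be recomputed from the predicted probabilities. The sketch below is not part of the forked kernel; it assumes X_test and y_test are the notebook's held-out image array and binary labels, and uses scikit-learn's roc_curve:

from sklearn.metrics import roc_curve, auc
import matplotlib.pyplot as plt

# Sigmoid outputs are per-sample probabilities of the positive class.
y_scores = model.predict(X_test).ravel()
fpr, tpr, _ = roc_curve(y_test, y_scores)   # false/true positive rates per threshold

plt.plot(fpr, tpr, label="ROC (AUC = %.3f)" % auc(fpr, tpr))
plt.plot([0, 1], [0, 1], linestyle="--", label="Chance")
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.show()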

Answer 1: (score 0)

You are using a softmax activation on a single neuron, which makes no sense: because of the normalization in softmax, it will always produce a constant output of 1.0. For binary classification you must use a sigmoid activation on a single output neuron.
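
To see concretely why softmax over a single output is degenerate, here is a small NumPy sketch (illustrative only, not from the original answers): softmax normalizes across the output axis, so with one logit per sample the result is always exactly 1.0, while sigmoid yields a usable probability.

import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    e = np.exp(logits - np.max(logits, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def sigmoid(logits):
    return 1.0 / (1.0 + np.exp(-logits))

logits = np.array([[-2.0], [0.0], [3.5]])   # three samples, one output neuron each

print(softmax(logits).ravel())   # [1. 1. 1.]  -> every score identical, so the ROC looks like guessing
print(sigmoid(logits).ravel())   # [0.119 0.5  0.971] -> meaningful class probabilities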