Strange loss curves when using BatchNormalization in Keras

Asked: 2018-11-17 14:36:00

Tags: keras keras-layer batch-normalization

Part of the code:

# Assumed imports (omitted in the original snippet); shown here for tf.keras
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Flatten, Dense, BatchNormalization, Activation
from tensorflow.keras.models import Model

# IMG_SIZE and CHANNELS are defined elsewhere (this is only part of the code)
mobilenetv2 = MobileNetV2(input_shape=(IMG_SIZE, IMG_SIZE, CHANNELS),
                          alpha=1.0,
                          depth_multiplier=1,
                          include_top=False,
                          weights='imagenet',
                          input_tensor=None,
                          pooling=None,
                          classes=12)  # note: classes is ignored when include_top=False

for layer in mobilenetv2.layers:
    layer.trainable = False

last = mobilenetv2.layers[-1].output
x = Flatten()(last)

x = Dense(120, use_bias=False)(x)
x = BatchNormalization()(x)
x = Activation('relu')(x)

x = Dense(84, use_bias=False)(x)
x = BatchNormalization()(x)
x = Activation('relu')(x)

preds = Dense(12, activation='softmax')(x)


model = Model(inputs=mobilenetv2.input, outputs=preds)
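For the comparison I was asked to do, the dropout variant of the same classification head would look something like this (the rate of 0.5 is my assumption, and `weights=None` here is only to keep the sketch self-contained):

```python
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Flatten, Dense, Dropout
from tensorflow.keras.models import Model

IMG_SIZE, CHANNELS = 224, 3  # example values, not from the original code

# weights=None avoids downloading ImageNet weights for this sketch
base = MobileNetV2(input_shape=(IMG_SIZE, IMG_SIZE, CHANNELS),
                   include_top=False, weights=None)
for layer in base.layers:
    layer.trainable = False

x = Flatten()(base.output)
x = Dense(120, activation='relu')(x)
x = Dropout(0.5)(x)  # assumed rate; dropout replaces BatchNormalization here
x = Dense(84, activation='relu')(x)
x = Dropout(0.5)(x)
preds = Dense(12, activation='softmax')(x)

model_dropout = Model(inputs=base.input, outputs=preds)
```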

But the loss curves look like this:

[training and validation loss curve images]

Are the curves above normal? I didn't add dropout layers because I was asked to compare dropout with BatchNormalization, but the curves look strange. Is my code correct, or am I missing something? Thanks.
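For context, my understanding of why training and validation loss can look so different with BatchNormalization: during training the layer normalizes with the current batch's statistics, while at inference it uses moving averages, so the two losses are computed under different normalizations. A rough numpy sketch of the two modes (momentum 0.99 and epsilon 1e-3 match the Keras defaults):

```python
import numpy as np

def batchnorm(x, gamma, beta, moving_mean, moving_var,
              training, momentum=0.99, eps=1e-3):
    """Simplified 1-D batch normalization sketch (Keras-like defaults)."""
    if training:
        # normalize with the statistics of the current batch
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        # update the moving statistics that inference will use
        moving_mean = momentum * moving_mean + (1 - momentum) * mean
        moving_var = momentum * moving_var + (1 - momentum) * var
    else:
        # normalize with the accumulated moving statistics
        mean, var = moving_mean, moving_var
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta, moving_mean, moving_var
```

Early in training the moving averages are still near their initial values (zero mean, unit variance), so inference-mode outputs, and hence the validation loss, can differ a lot from training-mode outputs.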

0 Answers:

No answers yet.