Keras with Lambda layers for custom activations

Date: 2020-03-05 23:46:42

Tags: keras lambda

I am using Lambda layers to apply 3 different activation functions to the 3 columns of the last Dense layer. I first split the output and then reassemble it. So far no actual activation has been applied, because I found that the following already gives me an error at the model-fitting step:

# imports assumed for this snippet (Keras with the TensorFlow backend)
from keras.layers import Input, Dense, Lambda
from keras.models import Model
from keras import backend as K

def createModel():
    in_put = Input(shape=(3300,))
    layer1 = Dense(512, activation='relu')(in_put)
    layer2 = Dense(1024, activation='relu')(layer1)
    layer3 = Dense(32, activation='relu')(layer2)
    out_put = Dense(3)(layer3)

    # splitting output into 3 columns for further activation
    theta = Lambda(lambda x: x[:, 0], output_shape=(1,))(out_put)
    phi = Lambda(lambda x: x[:, 1], output_shape=(1,))(out_put)
    r = Lambda(lambda x: x[:, 2], output_shape=(1,))(out_put)

    # combining the activated layers (activation has not been implemented yet)
    out_put_a = Lambda(lambda x: K.stack([x[0], x[1], x[2]]),
                       output_shape=(3,), name="output")([theta, phi, r])

    model = Model(inputs=in_put, outputs=out_put_a)
    return model

my_network = createModel()

batch_size = 1000
epochs = 1

my_network.compile(optimizer='rmsprop', loss='mean_squared_error')
history = my_network.fit_generator(generator=training_generator, epochs=epochs,
                                   validation_data=testing_generator)

It crashes with the following error:

InvalidArgumentError: 2 root error(s) found.
  (0) Invalid argument: Incompatible shapes: [3] vs. [1000]
     [[{{node training/RMSprop/gradients/loss/output_loss/mul_grad/BroadcastGradientArgs}}]]
     [[loss/mul/_55]]
  (1) Invalid argument: Incompatible shapes: [3] vs. [1000]
     [[{{node training/RMSprop/gradients/loss/output_loss/mul_grad/BroadcastGradientArgs}}]]
0 successful operations. 0 derived errors ignored.
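Editor's note: a likely explanation, assuming the TensorFlow backend and targets of shape (batch_size, 3) from the generator. Slicing with x[:, 0] drops the feature axis, so theta, phi and r each have shape (batch_size,), and K.stack stacks along axis 0 by default, giving a model output of shape (3, batch_size). With batch_size = 1000 the mean-squared-error loss then tries to broadcast [3] against [1000], which matches the incompatible-shapes error above. A minimal sketch of the shape arithmetic:

# Standalone sketch of the shape mismatch (assumes the TF 1.x-style Keras backend).
import numpy as np
from keras import backend as K

out_put = K.constant(np.zeros((1000, 3)))  # stand-in for the Dense(3) output

theta = out_put[:, 0]                      # shape (1000,) -- the feature axis is gone
phi = out_put[:, 1]
r = out_put[:, 2]

stacked = K.stack([theta, phi, r])         # default axis=0 -> shape (3, 1000)
print(K.int_shape(stacked))                # (3, 1000), while the targets are (1000, 3)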

Meanwhile, the code below (without the Lambda layers) works fine:

def createModel():
    in_put = Input(shape=(3300,))
    layer1 = Dense(512, activation='relu')(in_put)
    layer2 = Dense(1024, activation='relu')(layer1)
    layer3 = Dense(32, activation='relu')(layer2)
    out_put = Dense(3)(layer3)

    model = Model(inputs=in_put, outputs=out_put)
    return model

my_network = createModel()

batch_size = 1000
epochs = 1

my_network.compile(optimizer='rmsprop', loss='mean_squared_error')
history = my_network.fit_generator(generator=training_generator, epochs=epochs,
                                   validation_data=testing_generator)

Can anyone point out what is going wrong with the Lambda layers?
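Editor's note: one possible fix, sketched rather than verified, is to keep the feature axis when slicing (x[:, 0:1] instead of x[:, 0]) so each piece has shape (batch_size, 1), apply the per-column activations to those slices, and rejoin them with a Concatenate layer along the feature axis; alternatively, K.stack([...], axis=1) in the combining Lambda would also restore the (batch_size, 3) layout. The Concatenate variant could look like this (the activation placement is an assumption, since the original code has not applied any activations yet):

from keras.layers import Input, Dense, Lambda, Concatenate
from keras.models import Model

def createModel():
    in_put = Input(shape=(3300,))
    layer1 = Dense(512, activation='relu')(in_put)
    layer2 = Dense(1024, activation='relu')(layer1)
    layer3 = Dense(32, activation='relu')(layer2)
    out_put = Dense(3)(layer3)

    # keep the feature axis: each slice has shape (batch_size, 1)
    theta = Lambda(lambda x: x[:, 0:1], output_shape=(1,))(out_put)
    phi = Lambda(lambda x: x[:, 1:2], output_shape=(1,))(out_put)
    r = Lambda(lambda x: x[:, 2:3], output_shape=(1,))(out_put)

    # per-column activations would be applied to theta, phi and r here

    # rejoin along the feature axis -> output shape (batch_size, 3)
    out_put_a = Concatenate(axis=-1, name="output")([theta, phi, r])

    model = Model(inputs=in_put, outputs=out_put_a)
    return model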

0 Answers:

There are no answers yet.