The input to a Flatten layer must be a tensor

Time: 2019-05-28 07:31:05

Tags: python tensorflow keras deep-learning keras-layer

I have the following Keras model, which runs fine:

model = Sequential()
model.add(Flatten(input_shape=(1,1,68)))
model.add(Dense(35,activation='linear'))
model.add(LeakyReLU(alpha=.001))
model.add(Dense(nb_actions))
model.add(Activation('linear'))

Then I tried to build something a bit more elaborate, as follows:

model = Sequential()
input1 = keras.layers.Flatten(input_shape=(1,1,68))
x1 = keras.layers.Dense(68, activation='linear')(input1)
x2 = keras.layers.Dense(68, activation='relu')(input1)
x3 = keras.layers.Dense(68, activation='sigmoid')(input1)
add1 = keras.layers.Add()([x1, x2, x3])
activ1 = keras.layers.advanced_activations.LeakyReLU(add1)

x4 = keras.layers.Dense(34, activation='linear')(activ1)
x5 = keras.layers.Dense(34, activation='relu')(activ1)
x6 = keras.layers.Dense(34, activation='sigmoid')(activ1)
add2 = keras.layers.Add()([x4, x5, x6])
activ2 = keras.layers.advanced_activations.LeakyReLU(add2)

x7 = keras.layers.Dense(17, activation='linear')(activ2)
x8 = keras.layers.Dense(17, activation='relu')(activ2)
x9 = keras.layers.Dense(17, activation='sigmoid')(activ2)
add3 = keras.layers.Add()([x7, x8, x9])
activ3 = keras.layers.advanced_activations.LeakyReLU(add3)

final_layer=keras.layers.Dense(nb_actions, activation='linear')(activ3)
model = keras.models.Model(inputs=input1, outputs=final_layer)

As you can see in the code above, I kept the same input to the Flatten layer; I only added parallel layers with the same number of neurons but different activations and summed their outputs. My problem is that whenever I try to run this code, I get the following error:

Using TensorFlow backend.
Traceback (most recent call last):
  File "/home/anselmo/virtualenvironment/virtualenvironment_anselmo2/lib/python3.5/site-packages/keras/engine/base_layer.py", line 279, in assert_input_compatibility
    K.is_keras_tensor(x)
  File "/home/anselmo/virtualenvironment/virtualenvironment_anselmo2/lib/python3.5/site-packages/keras/backend/tensorflow_backend.py", line 474, in is_keras_tensor
    str(type(x)) + '`. '
ValueError: Unexpectedly found an instance of type `<class 'keras.layers.core.Flatten'>`. Expected a symbolic tensor instance.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main.py", line 64, in <module>
    x1 = keras.layers.Dense(68, activation='linear')(input1)
  File "/home/anselmo/virtualenvironment/virtualenvironment_anselmo2/lib/python3.5/site-packages/keras/engine/base_layer.py", line 414, in __call__
    self.assert_input_compatibility(inputs)
  File "/home/anselmo/virtualenvironment/virtualenvironment_anselmo2/lib/python3.5/site-packages/keras/engine/base_layer.py", line 285, in assert_input_compatibility
    str(inputs) + '. All inputs to the layer '
ValueError: Layer dense_1 was called with an input that isn't a symbolic tensor. Received type: <class 'keras.layers.core.Flatten'>. Full input: [<keras.layers.core.Flatten object at 0x7f0a145d6438>]. All inputs to the layer should be tensors.

The first version runs without any error. So why does changing the network design produce this error? How can I fix it, and where is my mistake?

1 Answer:

Answer 0 (score: 3)

What you are attempting in the second piece of code is a Keras functional model, not a Sequential one. Change the first line from model = Sequential() to input1 = Input(shape=(1, 1, 68)), and then call the Flatten layer on that tensor (for example flat1 = Flatten()(input1)) instead of passing the Flatten layer object itself to the Dense layers: in the functional API, every layer is called on a tensor and returns a tensor.
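For illustration, here is a minimal sketch of how the whole model could be written with the functional API. It assumes standalone Keras 2.x and uses a placeholder value for nb_actions, which is not given in the question:

# Minimal sketch of the functional version (nb_actions is a placeholder).
from keras.layers import Input, Flatten, Dense, Add, LeakyReLU
from keras.models import Model

nb_actions = 10  # placeholder: replace with the real number of actions

inputs = Input(shape=(1, 1, 68))   # symbolic input tensor
flat = Flatten()(inputs)           # call the Flatten layer on the tensor

x1 = Dense(68, activation='linear')(flat)
x2 = Dense(68, activation='relu')(flat)
x3 = Dense(68, activation='sigmoid')(flat)
add1 = Add()([x1, x2, x3])
activ1 = LeakyReLU(alpha=0.001)(add1)   # instantiate the layer, then call it

x4 = Dense(34, activation='linear')(activ1)
x5 = Dense(34, activation='relu')(activ1)
x6 = Dense(34, activation='sigmoid')(activ1)
add2 = Add()([x4, x5, x6])
activ2 = LeakyReLU(alpha=0.001)(add2)

x7 = Dense(17, activation='linear')(activ2)
x8 = Dense(17, activation='relu')(activ2)
x9 = Dense(17, activation='sigmoid')(activ2)
add3 = Add()([x7, x8, x9])
activ3 = LeakyReLU(alpha=0.001)(add3)

outputs = Dense(nb_actions, activation='linear')(activ3)
model = Model(inputs=inputs, outputs=outputs)
model.summary()

Note that LeakyReLU is itself a layer, so it is constructed first and then called on a tensor (LeakyReLU(alpha=0.001)(add1)); building it as LeakyReLU(add1) would pass the tensor as the alpha argument instead.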

More details in the official documentation.