Keras error: Expected size[1] in [0, 0], but got 1

Date: 2018-08-14 22:35:39

Tags: python keras

I am trying to build the decoder of a larger seq2seq model in Keras, but I keep getting the following error whenever I run the fit function. The model builds fine otherwise.

InvalidArgumentError: Expected size[1] in [0, 0], but got 1
[[Node: lambda_2/Slice = Slice[Index=DT_INT32, T=DT_FLOAT, 
_device="/job:localhost/replica:0/task:0/device:CPU:0"](lambda_1/Slice, 
metrics/acc/Const, lambda_2/Slice/size)]]

The lambda_x/Slice nodes appear to refer to the Lambda layers inside my loop.

My model has four inputs of shapes (N, 11), (N, 3), (N, 11), and (N, 3), and it outputs a softmax distribution of shape (N, 11, 1163).

Below is the code for my decoder, which is where the slicing Lambda layers are used:

def _decoder_serial_input(self, encoder_states, state_h, state_c):
    """
    Compute one-by-one input to decoder, taking output from previous time-step as input
    :param encoder_states: All the encoder states
    :param state_h: starting hidden state
    :param state_c: starting cell state
    :return: Concatenated output which is shape = (N, Timestep, Input dims)
    """

    all_outputs = []
    states = [state_h, state_c] 
    inputs = self.decoder_inputs  # Shape = N x num_timestep

    repeat = RepeatVector(1, name="decoder_style")
    conc_1 = Concatenate(axis=-1, name="concatenate_decoder")
    conc_att = Concatenate(axis=-1, name="concatenate_attention")

    for t in range(self.max_timestep):

        # This slices the input. -1 is to accept everything in that dimension
        inputs = Lambda(lambda x: K.slice(x, start=[0, t], size=[-1, 1]))(inputs)

        embedding_output = self.embedding_decoder(inputs)
        style_labels = repeat(self.decoder_style_label) 

        concat = conc_1([embedding_output, style_labels])  # Join to style label

        decoder_output_forward, state_h, state_c = self.decoder(concat, initial_state=states)

        if self.attention:
            context, _ = self._one_step_attention(encoder_states, state_h)  # Size of latent dims
            decoder_output_forward = conc_att([context, decoder_output_forward])

        outputs = self.decoder_softmax_output(decoder_output_forward)  # Shape = (N, 1, input dims)

        all_outputs.append(outputs)
        states = [state_h, state_c]

    return Concatenate(axis=1, name="conc_dec_forward")(all_outputs)

Does anyone know why I am getting this error? Thanks.

1 answer:

Answer 0 (score: 1)

I solved the problem. The issue was that I assigned the output of the Lambda layer back to the inputs variable, which was wrong. This changed the shape of the tensor fed into the Lambda layer on each pass: on the first iteration it was (N, 11) as intended, but on subsequent iterations of the loop it became (N, 1), and slicing that smaller tensor at index t caused the error.
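The shape collapse described above can be sketched with plain NumPy (not Keras) to show why the reassignment breaks the slice. The function names and the (4, 11) input shape below are illustrative, not from the original code:

```python
import numpy as np

def slice_buggy(inputs, max_timestep):
    """Mimics the bug: the slice result is assigned back to `inputs`."""
    shapes = []
    for t in range(max_timestep):
        inputs = inputs[:, t:t + 1]  # after step 0, `inputs` is (N, 1)
        shapes.append(inputs.shape)  # later slices at index t are empty
    return shapes

def slice_fixed(inputs, max_timestep):
    """Mimics the fix: slice into a fresh variable; `inputs` keeps its shape."""
    shapes = []
    for t in range(max_timestep):
        step_input = inputs[:, t:t + 1]  # always slices the full (N, 11) tensor
        shapes.append(step_input.shape)
    return shapes

x = np.zeros((4, 11))
print(slice_buggy(x, 3))  # [(4, 1), (4, 0), (4, 0)] -- shape collapses
print(slice_fixed(x, 3))  # [(4, 1), (4, 1), (4, 1)] -- shape stays correct
```

In the Keras decoder, the same fix is to bind the Lambda output to a new name (e.g. `step_input = Lambda(...)(inputs)`) and feed `step_input` to the embedding layer, so `inputs` keeps its (N, num_timestep) shape across iterations.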