TensorFlow batch normalization dimensions

Posted: 2017-11-27 04:36:20

Tags: tensorflow

I am trying to use batch normalization with conv2d_transpose, as follows:

h1 = tf.layers.conv2d_transpose(inputs, 64, 4, 2, padding='SAME',
    kernel_initializer=tf.variance_scaling_initializer,
    bias_initializer=tf.ones_initializer,
    activity_regularizer=tf.layers.batch_normalization,
)
h2 = tf.layers.conv2d_transpose(h1, 3, 4, 2, padding='SAME',
    kernel_initializer=tf.variance_scaling_initializer,
    bias_initializer=tf.ones_initializer,
    activity_regularizer=tf.layers.batch_normalization,
)

I get the following error:

ValueError: Dimension 1 in both shapes must be equal, but are 32 and 64
From merging shape 2 with other shapes. for 'tower0/AddN' (op: 'AddN') with input shapes: [?,32,32,64], [?,64,64,3].

I have seen other people hit this error in Keras because of the difference in dimension ordering between TensorFlow and Theano. However, I am using pure TensorFlow, all of my variables are in TensorFlow's dimension format (batch_size, height, width, channels), and the data_format of the conv2d_transpose layers should be the default 'channels_last'. What am I missing here?

1 Answer:

Answer 0 (score: 1)

tf.layers.batch_normalization should be added as a layer, not as a regularizer. An activity_regularizer is a function that takes the activity (the layer's output) and produces an extra loss term that is added to the overall loss of the whole network. For example, you might want to penalize networks that produce large activations. You can see how the activity_regularizer is called on the outputs and its result added to the loss here
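A minimal sketch of the corrected pattern, keeping the TF1-style tf.layers API from the question (written against tf.compat.v1 so it also runs on TF 2.x installs; the input shapes are made up for illustration). Batch normalization is applied as its own layer on each conv output instead of being passed as activity_regularizer:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

is_training = tf.placeholder(tf.bool, name='is_training')
inputs = tf.placeholder(tf.float32, [None, 16, 16, 8])  # hypothetical input

# conv -> batch norm (as a layer) -> nonlinearity
h1 = tf.layers.conv2d_transpose(
    inputs, 64, 4, 2, padding='same',
    kernel_initializer=tf.variance_scaling_initializer())
h1 = tf.nn.relu(tf.layers.batch_normalization(h1, training=is_training))

h2 = tf.layers.conv2d_transpose(
    h1, 3, 4, 2, padding='same',
    kernel_initializer=tf.variance_scaling_initializer())
h2 = tf.nn.relu(tf.layers.batch_normalization(h2, training=is_training))

print(h1.shape.as_list())  # [None, 32, 32, 64]
print(h2.shape.as_list())  # [None, 64, 64, 3]
```

Note that in TF1-style graphs, batch_normalization registers its moving-average updates in tf.GraphKeys.UPDATE_OPS, so the train op should be created inside tf.control_dependencies(tf.get_collection(tf.GraphKeys.UPDATE_OPS)).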