ValueError: Dimensions must be equal, but are 64 and 32 in a custom Keras layer

Posted 2024-09-29 23:31:15


I'm using tensorflow 2.2.0 in a Google Colab notebook. I have some custom activation layers, and the problem appears when I try to use one particular activation layer:

import tensorflow as tf
from tensorflow.keras import backend as K

class MPELU(tf.keras.layers.Layer):
    """
    Multiple Parametric Exponential Linear Units.
    """
    def __init__(self, channel_wise=True, **kwargs):
        super(MPELU, self).__init__(**kwargs)
        self.channel_wise = channel_wise

    def build(self, input_shape):
        shape = [1]

        if self.channel_wise:
            shape = [int(input_shape[-1])]  # Number of channels

        self.alpha = self.add_weight(name='alpha', shape=shape, dtype=K.floatx(),
                                     initializer=tf.keras.initializers.RandomUniform(minval=-1, maxval=1),
                                     trainable=True)
        self.beta = self.add_weight(name='beta', shape=shape, dtype=K.floatx(),
                                    initializer=tf.keras.initializers.RandomUniform(minval=0.0, maxval=1),
                                    trainable=True)

        # Finish building
        super(MPELU, self).build(input_shape)

    @tf.function
    def call(self, inputs, **kwargs):
        # cons_greater_zero is a helper function defined elsewhere in my code
        positive = tf.keras.activations.relu(inputs)
        negative = self.alpha * (K.exp(-tf.keras.activations.relu(-inputs) * cons_greater_zero(self.beta)) - 1)

        return positive + negative

    def compute_output_shape(self, input_shape):
        return input_shape

The error is shown below:

    ValueError: Dimensions must be equal, but are 64 and 32 for '{{node mpelu/mul_10}} = Mul[T=DT_FLOAT](mpelu/Neg_11, mpelu/add_10)' with input shapes: [?,4,4,64], [32].

I printed the input shape inside every call, and the output was:

(None, 16, 16, 32)
(None, 7, 7, 32)
(None, 7, 7, 32)
(None, 7, 7, 32)
(None, 7, 7, 32)
(None, 4, 4, 64)
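The printed shapes suggest the same layer instance first sees 32-channel tensors and then a 64-channel one, while `build` only runs once. A minimal sketch (using a hypothetical `ChannelScale` layer, not the original MPELU) of how that pattern produces exactly this kind of broadcast error:

```python
import tensorflow as tf

# Hypothetical reproduction: a channel-wise layer creates its weights once,
# sized to the FIRST input's channel count, then fails when the same
# instance is reused on a tensor with a different number of channels.
class ChannelScale(tf.keras.layers.Layer):
    def build(self, input_shape):
        # add_weight runs only on the first call; the shape is frozen here
        self.alpha = self.add_weight(
            name='alpha', shape=[int(input_shape[-1])],
            initializer='ones', trainable=True)
        super().build(input_shape)

    def call(self, inputs):
        return inputs * self.alpha  # broadcasts over the channel axis

layer = ChannelScale()
layer(tf.zeros([1, 4, 4, 32]))       # builds alpha with shape [32]
try:
    layer(tf.zeros([1, 4, 4, 64]))   # same instance, now 64 channels
except Exception as err:
    print(type(err).__name__)        # shape/broadcast mismatch error
```

If this is the cause, each place in the model with a different channel count needs its own MPELU instance rather than one shared layer object.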

