Training multiple models: ValueError: Variable hidden1/kernel already exists, disallowed

Posted 2024-10-03 09:15:39


I am running a hyperparameter search:

from itertools import product

for layer1_filters, layer1_kernels \
    in product(layer1_filters_list, layer1_kernels_list):
  cm = CifarModel()
  cm.build_train_validate_model(layer1_filters, layer1_kernels)

where build_train_validate_model defines the layers:

self.hidden1 = tf.layers.conv2d(self.input_layer, layer1_filters,
        layer1_kernels, activation=activation, name='hidden1')

On the second iteration of the hyperparameter-search loop, while the second candidate model is being defined, I get the following error:

ValueError: Variable hidden1/kernel already exists, disallowed.
Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?

I find this confusing, because each iteration of the hyperparameter loop creates its own model object on the cm = CifarModel() line, so there should be no name clash between the calls to self.hidden1 = tf.layers.conv2d made in different iterations of the search loop. So the question is: why does TF claim there is a name collision between members of different objects of the same class?

Moreover, the problem persists even after I added

tf.AUTO_REUSE = True

before the hyperparameter search loop.


Here is the graph-definition code:

# Build the model
self.flat_features, self.flat_label = self.iterator.get_next()
self.input_layer = self.flat_features
self.hidden1 = tf.layers.conv2d(self.input_layer, layer1_filters,
        layer1_kernels, activation=activation)
self.normal1 = tf.layers.batch_normalization(self.hidden1)
self.hidden2 = tf.layers.conv2d(self.normal1, layer2_filters,
        layer2_kernels, activation=activation)
self.normal2 = tf.layers.batch_normalization(self.hidden2)
self.maxpool1 = tf.layers.max_pooling2d(self.normal2,
        (maxpool1_stride,maxpool1_stride), (maxpool1_stride,maxpool1_stride))
self.hidden3 = tf.layers.conv2d(self.maxpool1, layer3_filters,
        layer3_kernels, activation=activation)
self.maxpool2 = tf.layers.max_pooling2d(self.hidden3,
        (self.drop3.shape[1],self.drop3.shape[1]), (self.drop3.shape[1],self.drop3.shape[1]) )
self.flat = tf.layers.flatten(self.maxpool2)
self.logits = tf.layers.dense(self.flat, len(self.label_names))
