Multiple outputs and losses in multi-class image segmentation

Posted 2024-10-16 17:19:48


I'm using Keras and TensorFlow for a multi-class image segmentation problem. For each training pair {(x_i, y_i)}^N, my model has several auxiliary losses (out_aux) that are added to a main loss (out_main):

                      Input(256,256,5)
                              |
                              |
                    something happening here

         /          |           |           |          \
        |           |           |           |           |
      out_main  out_aux(0)  out_aux(1)  out_aux(2)  out_aux(3)

Currently I use the same loss (categorical_crossentropy) for all outputs, so essentially a single (x, y) pair is evaluated against every loss. I'm running into a problem when defining the model. The method that produces the auxiliary outputs looks like this:

def aux_branch(self, x, nb_labels=5):
    # x is a list of feature maps.
    x_aux = []
    for nf in range(len(x)):
        x_in = x[nf]
        out_aux = _conv(filters=nb_labels, kernel_size=(1, 1), activation='linear')(x_in)
        out_aux = Reshape((im_rows * im_cols, nb_labels))(out_aux)
        out_aux = Activation('softmax')(out_aux)
        out_aux = Reshape((im_rows, im_cols, nb_labels), name='out_aux_{}'.format(nf))(out_aux)

        x_aux.append(out_aux)

    return x_aux
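For clarity, the Reshape → softmax → Reshape sequence in each head is just a softmax over the channel axis at every pixel. A small NumPy sketch of the equivalence (the sizes here are arbitrary, just for illustration):

```python
import numpy as np

im_rows = im_cols = 4
nb_labels = 5
logits = np.random.randn(1, im_rows, im_cols, nb_labels)

# Flatten the spatial dims, softmax over the last (label) axis,
# then restore the spatial shape -- same as the Keras layers above.
flat = logits.reshape(1, im_rows * im_cols, nb_labels)
exp = np.exp(flat - flat.max(axis=-1, keepdims=True))  # stabilized softmax
soft = exp / exp.sum(axis=-1, keepdims=True)
out = soft.reshape(1, im_rows, im_cols, nb_labels)

# Every pixel now holds a probability distribution over the labels.
assert np.allclose(out.sum(axis=-1), 1.0)
```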

The method that creates and compiles the model is defined as:

def create_model(self, x, lossfunc, nb_labels=5):
    ...
    ... 

    out_aux = self.aux_branch(x)
    ...
    out_main = _conv(filters=nb_labels, kernel_size=(1, 1), activation='linear')(out_main)
    out_main = Reshape((im_rows * im_cols, nb_labels))(out_main)
    out_main = Activation('softmax')(out_main)
    out_main = Reshape((im_rows, im_cols, nb_labels), name='out_main')(out_main)

    model = Model(inputs=inputs, outputs=[out_main, out_aux[0], out_aux[1], out_aux[2], out_aux[3], out_aux[4]])


    model.compile(loss={'out_aux_0': lossfunc,
                        'out_aux_1': lossfunc,
                        'out_aux_2': lossfunc,
                        'out_aux_3': lossfunc,
                        'out_aux_4': lossfunc,
                        'out_main': lossfunc},
                  # loss_weights={'out_aux_0': 1.0,
                  #               'out_aux_1': 1.0,
                  #               'out_aux_2': 1.0,
                  #               'out_aux_3': 1.0,
                  #               'out_aux_4': 1.0,
                  #               'out_main': 1.0},
                    optimizer=optimizer,
                    metrics={'out_main': 'accuracy',
                           'out_aux_0': 'accuracy',
                           'out_aux_1': 'accuracy',
                           'out_aux_2': 'accuracy',
                           'out_aux_3': 'accuracy',
                           'out_aux_4': 'accuracy'})

    ...
    results = model.fit_generator(
        generator=train_generator,
        steps_per_epoch=steps_per_epoch,
        epochs=args.epochs,
        callbacks=callbacks,
        validation_data=val_generator,
        #nb_val_samples = nb_val_samples,
        validation_steps=validation_steps,
        #class_weight=class_weight
        initial_epoch=epoch_num+1
       )

As the loss function I use Keras' standard categorical crossentropy. The error message I get is:

File "/home/user/anaconda3/lib/python3.6/site-packages/keras/engine/training_utils.py", line 101, in standardize_input_data
str(len(data)) + ' arrays: ' + str(data)[:200] + '...')
ValueError: Error when checking model target: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 6 array(s), but instead got the following list of 1 arrays: [array([[[[1., 0.],
     [1., 0.],
     [1., 0.],
     ...,
     [0., 1.],
     [0., 1.],
     [0., 1.]],

    [[1., 0.],
     [1., 0.],
     [1., 0.],
     ......

Any idea what I'm doing wrong?
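From the error my understanding is that, with six named outputs, the generator has to yield one target array per output rather than a single one. A sketch of a wrapper that would do this by duplicating the label batch for every head (`base_gen` below is a hypothetical stand-in for my actual `train_generator`):

```python
import numpy as np

im_rows, im_cols, nb_labels = 256, 256, 5  # dimensions assumed from the model above
batch_size = 2

def multi_output_generator(base_generator):
    """Wrap a single-target generator so each batch repeats the same
    target once per named model output (out_main + five aux heads)."""
    for x_batch, y_batch in base_generator:
        yield x_batch, {'out_main': y_batch,
                        'out_aux_0': y_batch,
                        'out_aux_1': y_batch,
                        'out_aux_2': y_batch,
                        'out_aux_3': y_batch,
                        'out_aux_4': y_batch}

def base_gen():
    # Toy single-target generator, just for demonstration.
    while True:
        x = np.zeros((batch_size, im_rows, im_cols, 5))
        y = np.zeros((batch_size, im_rows, im_cols, nb_labels))
        yield x, y

x, y = next(multi_output_generator(base_gen()))
print(sorted(y))  # one target per named output
```

I'm not sure whether duplicating the targets like this is the intended way, or whether the generator itself should be rewritten.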

