How do I pass some of the encoder's layer outputs to the decoder?

Posted 2024-09-26 17:42:15


I am building an autoencoder model with one encoder and two decoders. I want outputs from the encoder to be connected to the decoders, as in a U-Net. But when I try to return a list containing four layer outputs so I can concatenate them in the decoders, I get a "Graph disconnected:" error. Does Keras have anything like PyTorch's ModuleList?

I can't simply connect the encoder straight to the decoders, because I need to do some processing on the encoder outputs, and the model has two decoders. Is there any Keras API for doing this?

Keras: 2.2.4, TensorFlow: 1.12, Python: 3.6.8

Encoder part


                def enc_flow(e_dims, ae_dims, lowest_dense_res):
                    def func(inp):

                        x0 = downscale(e_dims, 3, 1,False)(inp)

                        x1 = downscale(e_dims * 2, 3, 1,True)(x0)
                        x2 = downscale(e_dims * 4, 3, 1,True)(x1)
                        x3 = downscale(e_dims * 8, 3, 1,True)(x2)
                        x3 = Dense(lowest_dense_res * lowest_dense_res * ae_dims)(x3)
                        x3 = Reshape((lowest_dense_res, lowest_dense_res, ae_dims))(x3)
                        x4 = upscale(ae_dims,True)(x3)
                        # tensors collected for the intended skip connections;
                        # note that only x4 is actually returned to the caller
                        par_list = [x0, x2, x3, x4]

                        return x4
                    return func
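
If the goal is to expose these intermediate tensors for U-Net-style skips, one option is to have the encoder closure return them together with its final output instead of keeping them in a local par_list. The following is only a rough sketch of that idea: downscale_sketch and upscale_sketch are hypothetical stand-ins for the custom downscale()/upscale() blocks, the Dense/Reshape bottleneck is omitted, and which tensors you expose depends on the spatial sizes you need at each Concatenate.

                from keras.layers import Conv2D, LeakyReLU, UpSampling2D

                # hypothetical stand-ins for the custom downscale()/upscale() blocks
                def downscale_sketch(dims):
                    def block(x):
                        return LeakyReLU(0.1)(Conv2D(dims, 3, strides=2, padding='same')(x))
                    return block

                def upscale_sketch(dims):
                    def block(x):
                        return LeakyReLU(0.1)(Conv2D(dims, 3, padding='same')(UpSampling2D()(x)))
                    return block

                def enc_flow(e_dims, ae_dims):
                    def func(inp):
                        x0 = downscale_sketch(e_dims)(inp)       # H/2
                        x1 = downscale_sketch(e_dims * 2)(x0)    # H/4
                        x2 = downscale_sketch(e_dims * 4)(x1)    # H/8
                        x3 = downscale_sketch(e_dims * 8)(x2)    # H/16
                        x4 = upscale_sketch(ae_dims)(x3)         # back to H/8
                        # return the skip tensors alongside the bottleneck output
                        return x4, [x0, x1, x2, x3]
                    return func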

Decoder part

                def dec_flow(output_nc, d_ch_dims, add_residual_blocks=True):
                    dims = output_nc * d_ch_dims

                    def ResidualBlock(dim):
                        def func(inp):
                            x = Conv2D(dim, kernel_size=3, padding='same')(inp)
                            x = LeakyReLU(0.2)(x)
                            x = Conv2D(dim, kernel_size=3, padding='same')(x)
                            x = Add()([x, inp])
                            x = LeakyReLU(0.2)(x)
                            return x

                        return func


                    def func(inp): # input
                        print(type(inp))

                        x = upscale(dims * 8,True)(inp)
                        x = ResidualBlock(dims * 8)(x)
                        # skip connections to the encoder's par_list; enabling these
                        # Concatenate calls is what triggers the "Graph disconnected" error
                        # x = Concatenate()([x,par_list[1]])
                        x = upscale(dims * 4,True)(x)
                        x = ResidualBlock(dims * 4)(x)
                        # x = Concatenate()([x,par_list[0]])
                        x = upscale(dims * 2,True)(x)
                        x = ResidualBlock(dims * 2)(x)

                        return Conv2D(output_nc, kernel_size=5, padding='same', activation='sigmoid')(x)

                    return func
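
On the decoder side, the skip tensors can then be passed in as an explicit argument of the closure and concatenated after each upscale, provided the spatial sizes match. A sketch under the same assumptions as above (it reuses the hypothetical upscale_sketch and leaves out the residual blocks):

                from keras.layers import Concatenate, Conv2D

                def dec_flow(output_nc, d_dims):
                    def func(inp, skips):
                        # skips = [x0, x1, x2, x3] as returned by the encoder sketch above
                        x = upscale_sketch(d_dims * 8)(inp)   # H/8 -> H/4
                        x = Concatenate()([x, skips[1]])      # x1 is also H/4
                        x = upscale_sketch(d_dims * 4)(x)     # H/4 -> H/2
                        x = Concatenate()([x, skips[0]])      # x0 is also H/2
                        x = upscale_sketch(d_dims * 2)(x)     # H/2 -> H
                        return Conv2D(output_nc, 5, padding='same', activation='sigmoid')(x)
                    return func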

When I try it, I get the following error message:

Graph disconnected: cannot obtain value for tensor
Tensor("input_1:0", shape=(?, 128, 128, 3), dtype=float32) at layer "input_1".
The following previous layers were accessed without issue:
['input_2', 'conv2d_10', 'space_attention_5', 'channel_attention_5',
 'concatenate_5', 'conv2d_11', 'leaky_re_lu_6', 'pixel_shuffler_2',
 'conv2d_12', 'leaky_re_lu_7', 'conv2d_13', 'add_1', 'leaky_re_lu_8',
 'conv2d_14', 'space_attention_6', 'channel_attention_6', 'concatenate_6',
 'conv2d_15', 'leaky_re_lu_9', 'pixel_shuffler_3', 'conv2d_16']
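
This error usually means the decoder was wrapped in its own Model with a fresh Input (input_2 in the trace), while the skip tensors still trace back to the encoder's original Input (input_1), so Keras cannot resolve one complete graph. Keras has no ModuleList; with the functional API, plain Python lists of tensors are enough. One way around the error is to build the encoder and both decoders on a single Input and only then create the Model. A sketch that continues the hypothetical enc_flow/dec_flow above, with arbitrary example dimensions:

                from keras.layers import Input
                from keras.models import Model

                inp = Input((128, 128, 3))

                # one encoder; its skip tensors stay in the same graph as inp
                code, skips = enc_flow(e_dims=64, ae_dims=256)(inp)

                # two decoders sharing the same encoder features
                out_a = dec_flow(output_nc=3, d_dims=16)(code, skips)
                out_b = dec_flow(output_nc=3, d_dims=16)(code, skips)

                autoencoder = Model(inputs=inp, outputs=[out_a, out_b])
                autoencoder.summary()

If you prefer to keep separate encoder and decoder Models, the key point is the same: every tensor a Model uses must be reachable from that Model's declared inputs, so the skip tensors either have to be extra outputs of the encoder Model or extra Inputs of the decoder Models.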


