Implementing the new TFGAN module

Posted 2024-10-01 19:19:00


I'm working on implementing the new TFGAN module in TensorFlow (tf.contrib.gan).

Has anyone actually gotten it to work? I'm running into a problem just feeding random noise into a simple generator:

tfgan = tf.contrib.gan
noise = tf.random_normal([BATCH_SIZE, 28,28])

def my_generator(z, out_dim=28*28, n_units=128, reuse=False, alpha=0.01):    
    with tf.variable_scope('generator', reuse=reuse):
        # Hidden layer
        h1 = tf.layers.dense(z, n_units, activation=None)

        # Leaky ReLU
        h1 = tf.maximum(h1, alpha*h1)

        # Logits and tanh output
        logits = tf.layers.dense(h1, out_dim, activation=None)
        out = tf.nn.tanh(logits)

    return out, logits

Then the TFGAN call:

(code block not preserved in the original post)

Error: "tuple' object has no attribute 'dtype'",指向我的生成器输入行。在

(As an aside, I've done nearly all of my NN work at the Keras level of abstraction, so I know this is probably a simple problem.)


Edit, following kvorobiev's comment (many thanks):

Code, excluding the data generator (essentially the same as the post on GitHub):

tfgan = tf.contrib.gan
noise = tf.random_normal([28,28])


def unconditional_generator(z, out_dim=28*28, n_units=128, reuse=False,  alpha=0.01):    
    with tf.variable_scope('generator', reuse=reuse):
        # Hidden layer
        h1 = tf.layers.dense(z, n_units, activation=None)
        # Leaky ReLU
        h1 = tf.maximum(h1, alpha*h1)

        # Logits and tanh output
        logits = tf.layers.dense(h1, out_dim, activation=None)
        out = tf.nn.tanh(logits)

        return out, logits

def unconditional_discriminator(x, n_units=128, reuse=False, alpha=0.01):
     with tf.variable_scope('discriminator', reuse=reuse):
        # Hidden layer
        h1 = tf.layers.dense(x, n_units, activation=None)

        # Leaky ReLU
        h1 = tf.maximum(h1, alpha*h1)

        logits = tf.layers.dense(h1, 1, activation=None)
        out = tf.nn.sigmoid(logits)

        return out, logits

# Build the generator and discriminator.
gan_model = tfgan.gan_model(
    generator_fn= unconditional_generator,  # you define
    discriminator_fn = unconditional_discriminator,  # you define
    real_data=img_generator,
    generator_inputs=noise)

# Build the GAN loss.
gan_loss = tfgan.gan_loss(
    gan_model,
    generator_loss_fn=tfgan_losses.wasserstein_generator_loss,
    discriminator_loss_fn=tfgan_losses.wasserstein_discriminator_loss)

# Create the train ops, which calculate gradients and apply updates to weights.
train_ops = tfgan.gan_train_ops(
    gan_model,
    gan_loss,
    generator_optimizer=tf.train.AdamOptimizer(gen_lr, 0.5),
    discriminator_optimizer=tf.train.AdamOptimizer(dis_lr, 0.5))

# Run the train ops in the alternating training scheme.
tfgan.gan_train(
    train_ops,
    hooks=[tf.train.StopAtStepHook(num_steps=100)],
    logdir=FLAGS.train_log_dir)

Traceback:

-------------------------------------------------------------------------- AttributeError                            Traceback (most recent call last) <ipython-input-3-2c570c5257d0> in <module>()
     37     discriminator_fn = unconditional_discriminator,  # you define
     38     real_data=img_generator,
---> 39     generator_inputs=noise)
     40 
     41 # Build the GAN loss.

~/tf_1.4/lib/python3.5/site-packages/tensorflow/contrib/gan/python/train.py in gan_model(generator_fn, discriminator_fn, real_data, generator_inputs, generator_scope, discriminator_scope, check_shapes)
    105   with variable_scope.variable_scope(discriminator_scope) as dis_scope:
    106     discriminator_gen_outputs = discriminator_fn(generated_data,
--> 107                                                  generator_inputs)
    108   with variable_scope.variable_scope(dis_scope, reuse=True):
    109     real_data = ops.convert_to_tensor(real_data)

<ipython-input-3-2c570c5257d0> in unconditional_discriminator(x, n_units, reuse, alpha)
     19      with tf.variable_scope('discriminator', reuse=reuse):
     20         # Hidden layer
---> 21         h1 = tf.layers.dense(x, n_units, activation=None)
     22 
     23         # Leaky ReLU

~/tf_1.4/lib/python3.5/site-packages/tensorflow/python/layers/core.py in dense(inputs, units, activation, use_bias, kernel_initializer, bias_initializer, kernel_regularizer, bias_regularizer, activity_regularizer, kernel_constraint, bias_constraint, trainable, name, reuse)
    245                 trainable=trainable,
    246                 name=name,
--> 247                 dtype=inputs.dtype.base_dtype,
    248                 _scope=name,
    249                 _reuse=reuse)

AttributeError: 'tuple' object has no attribute 'dtype'

Tags: layers, tf, train, out, generator, h1, variable-scope
1 Answer
User · #1 · Posted 2024-10-01 19:19:00

Two points:

1) I believe your error comes from the second argument to your discriminator. When you use the library call, TFGAN expects that second argument to be whatever conditioning you want (the input noise in the unconditional case, a class label in the conditional case, structured noise in InfoGAN, and so on). Your definition lets that conditioning bind to n_units, which most likely causes the type mismatch. To fix it, accept the conditioning as the second discriminator argument and simply don't use it.
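As a rough sketch of point 1 (untested, assuming TensorFlow 1.x and tf.contrib.gan; the name unused_conditioning and the functools.partial binding are just illustrative), the discriminator can reserve its second positional argument for the conditioning and ignore it, so that generator_inputs no longer lands in n_units:

import functools
import tensorflow as tf

def unconditional_discriminator(x, unused_conditioning, n_units=128, alpha=0.01):
    # gan_model calls discriminator_fn(data, generator_inputs), as the traceback
    # shows, so the second positional slot must hold the conditioning even if
    # it is never used.
    with tf.variable_scope('discriminator'):
        h1 = tf.layers.dense(x, n_units, activation=None)
        h1 = tf.maximum(h1, alpha * h1)  # Leaky ReLU
        logits = tf.layers.dense(h1, 1, activation=None)
    # The standard TFGAN losses consume the discriminator output directly,
    # so returning a single logits tensor keeps things simple.
    return logits

# Hyperparameters can be bound before handing the function to tfgan.gan_model,
# which only ever supplies the first two arguments:
discriminator_fn = functools.partial(unconditional_discriminator, n_units=256)

Note that gan_model wraps the generator and discriminator calls in its own variable scopes and reuses them for the real-data pass (visible in the traceback), so the explicit reuse argument from the original signatures isn't needed here.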

2) I'm in the process of reviewing some useful/illustrative examples (unconditional/conditional/InfoGAN on MNIST, distributed training on CIFAR, an adversarial loss for image compression, image-to-image translation, etc.). They'll show up here soon: https://github.com/tensorflow/models/tree/master/research.
