How do I use complex variables in TensorFlow eager mode?


In non-eager (graph) mode, I can run the following without any problem:

import tensorflow as tf

s = tf.complex(tf.Variable(1.0), tf.Variable(1.0))
train_op = tf.train.AdamOptimizer(0.01).minimize(tf.abs(s))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(5):
        _, s_ = sess.run([train_op, s])
        print(s_)

(1+1j)
(0.99+0.99j)
(0.98+0.98j)
(0.9700001+0.9700001j)
(0.9600001+0.9600001j)

But I can't seem to find the equivalent in eager mode. I tried the following, but TF complains:

tfe = tf.contrib.eager
s = tf.complex(tfe.Variable(1.0), tfe.Variable(1.0))
optimizer = tf.train.AdamOptimizer(0.01)
def obj(s):
    return tf.abs(s)
with tf.GradientTape() as tape:
    loss = obj(s)
    grads = tape.gradient(loss, [s])
    optimizer.apply_gradients(zip(grads, [s]))

The dtype of the source tensor must be floating (e.g. tf.float32) when calling GradientTape.gradient, got tf.complex64

and also:

No gradients provided for any variable: ['tf.Tensor((1+1j), shape=(), dtype=complex64)']

How can I train complex variables in eager mode?


1 Answer

With eager mode in TensorFlow 2, you can treat the real and imaginary parts as ordinary real variables:

import tensorflow as tf

r, i = tf.Variable(1.0), tf.Variable(1.0)
optimizer = tf.keras.optimizers.Adam(0.01)  # any TF2 optimizer works here

def obj(s):
    return tf.abs(s)

with tf.GradientTape() as tape:
    s = tf.complex(r, i)  # build the complex value inside the tape
    loss = obj(s)
grads = tape.gradient(loss, [r, i])  # differentiate w.r.t. the real variables
optimizer.apply_gradients(zip(grads, [r, i]))
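
This works because GradientTape can only differentiate with respect to floating-point sources (hence the tf.complex64 error above); the gradient still flows through tf.complex back to the real variables. For completeness, here is a minimal sketch of a full training loop, assuming TensorFlow 2.x with tf.keras.optimizers.Adam standing in for the removed tf.train.AdamOptimizer:

import tensorflow as tf

r, i = tf.Variable(1.0), tf.Variable(1.0)
optimizer = tf.keras.optimizers.Adam(0.01)

for step in range(5):
    print(complex(r.numpy(), i.numpy()))
    with tf.GradientTape() as tape:
        s = tf.complex(r, i)  # rebuilt each step so the tape records it
        loss = tf.abs(s)
    grads = tape.gradient(loss, [r, i])
    optimizer.apply_gradients(zip(grads, [r, i]))

This should print a trajectory like the graph-mode run above, starting from (1+1j) and stepping down by roughly 0.01 per iteration.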
