I'm getting a FailedPreconditionError at tf.global_variables_initializer(). I've narrowed it down to the following part of my code:
def __init__(...):
    ...
    self.global_step = tf.get_variable(initializer=tf.zeros_initializer(), trainable=False, shape=(), name='global_step')
    ...
    step_rampup_value = self.step_rampup(self.global_step, self.rampup_length)

def step_rampup(self, global_step, rampup_length):
    result = tf.cond(global_step < rampup_length,
                     lambda: tf.constant(0.0),
                     lambda: tf.constant(1.0))
    return tf.identity(result, name="step_rampup")

session.run(tf.global_variables_initializer())
self.global_step is incremented by 1 by the optimizer on every iteration, so its value has to change; that is the behavior I want.
Error message:
FailedPreconditionError ...
506 with tf.Session(graph=highgraph) as session:
--> 507 session.run(tf.global_variables_initializer())
...
FailedPreconditionError: Attempting to use uninitialized value global_step
[[node global_step/read (defined at NML_U/sNeural.py:103) = Identity[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"](global_step)]]
Why do I think this part of the code is the culprit? Because the following version works:
def __init__(...):
    ...
    self.global_step = tf.get_variable(initializer=tf.zeros_initializer(), trainable=False, shape=(), name='global_step')
    ...
    step_rampup_value = self.step_rampup(self.global_step, self.rampup_length)

def step_rampup(self, global_step, rampup_length):
    result = tf.cond(global_step.initialized_value() < rampup_length,
                     lambda: tf.constant(0.0),
                     lambda: tf.constant(1.0))
    return tf.identity(result, name="step_rampup")

session.run(tf.global_variables_initializer())
But then the condition is always evaluated against the initial value of self.global_step (= 0), which is not the intended behavior.
Also, this version works:
def __init__(...):
    ...
    self.global_step = tf.get_variable(initializer=tf.zeros_initializer(), trainable=False, shape=(), name='global_step')
    self.global_step = tf.assign(self.global_step, 0.)
    ...
    step_rampup_value = self.step_rampup(self.global_step, self.rampup_length)

def step_rampup(self, global_step, rampup_length):
    result = tf.cond(global_step < rampup_length,
                     lambda: tf.constant(0.0),
                     lambda: tf.constant(1.0))
    return tf.identity(result, name="step_rampup")

session.run(tf.global_variables_initializer())
But (perhaps) this no longer creates a dependency on global_step itself but on the assign op, which keeps assigning 0 to self.global_step.
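That suspicion matches how graph mode works: fetching the output of tf.assign re-runs the assign op, so every read through it resets the variable to 0. A minimal sketch of this, written against tf.compat.v1 so it also runs under TF 2.x (the names `step`, `step_read`, and `increment` are illustrative):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    step = tf.get_variable(initializer=tf.zeros_initializer(),
                           trainable=False, shape=(), name='step')
    # Rebinding the Python name to tf.assign means every later "read"
    # actually executes the assign op and writes 0. again.
    step_read = tf.assign(step, 0.)
    increment = tf.assign_add(step, 1.)
    init_op = tf.global_variables_initializer()

with tf.Session(graph=graph) as session:
    session.run(init_op)
    session.run(increment)          # the variable is now 1.0 ...
    value = session.run(step_read)  # ... but this fetch re-runs the assign,
                                    # resetting the variable and returning 0.0
```

So any tf.cond built on top of `step_read` would always compare against 0, exactly as feared above.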
How can I achieve the behavior I want?
You haven't posted your full code, so I can only guess: you are probably calling tf.global_variables_initializer() before __init__() runs. The initializer op only covers variables that already exist when it is created; it does not initialize variables created after the call.
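To illustrate the ordering requirement, here is a minimal sketch (again via tf.compat.v1 graph mode): create every variable first, and only then construct the initializer op, so the op covers all of them.

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    # 1. Create ALL variables first (e.g. everything built in __init__) ...
    global_step = tf.get_variable(initializer=tf.zeros_initializer(),
                                  trainable=False, shape=(), name='global_step')
    # 2. ... and only then create the initializer op, so it includes them.
    #    An init_op created before global_step would leave it uninitialized,
    #    and reading it would raise FailedPreconditionError.
    init_op = tf.global_variables_initializer()

with tf.Session(graph=graph) as session:
    session.run(init_op)              # no FailedPreconditionError
    value = session.run(global_step)  # reads the initialized value, 0.0
```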