After some stumbling I think I’ve noticed a single use case that fits all the examples I’ve seen. If there are other use cases, please elaborate with an example.
Use case:
Suppose you’d like to run an operator every time a particular Variable is evaluated. For example, say you’d like to add one to x every time the variable y is evaluated. It might seem like this will work:
import tensorflow as tf

x = tf.Variable(0.0)
x_plus_1 = tf.assign_add(x, 1)

with tf.control_dependencies([x_plus_1]):
    y = x  # just rebinds the Python name; creates no new graph node

init = tf.global_variables_initializer()

with tf.Session() as session:
    init.run()
    for i in range(5):
        print(y.eval())
It doesn’t: it’ll print 0.0, 0.0, 0.0, 0.0, 0.0. The problem is that y = x merely rebinds a Python name to the tensor x, which was created outside the control_dependencies block, so the dependency has nothing to attach to. Instead, we need to add a new node to the graph within the control_dependencies block. So we use this trick:
x = tf.Variable(0.0)
x_plus_1 = tf.assign_add(x, 1)

with tf.control_dependencies([x_plus_1]):
    # tf.identity creates a new node inside the block, so the control
    # dependency on x_plus_1 actually attaches to y.
    y = tf.identity(x)

init = tf.global_variables_initializer()

with tf.Session() as session:
    init.run()
    for i in range(5):
        print(y.eval())
This works: it prints 1.0, 2.0, 3.0, 4.0, 5.0.
If in the CIFAR-10 tutorial we dropped tf.identity, then loss_averages_op would never run.