What’s the difference between tensorflow dynamic_rnn and rnn?

From RNNs in Tensorflow, a Practical Guide and Undocumented Features by Denny Britz, published on August 21, 2016:

tf.nn.rnn creates an unrolled graph for a fixed RNN length. That
means, if you call tf.nn.rnn with inputs having 200 time steps, you are
creating a static graph with 200 RNN steps. First, graph creation is
slow. Second, you're unable to pass in sequences longer than the
length (200) you originally specified.
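A minimal sketch of what that looks like, assuming the TensorFlow 1.x API (where tf.nn.rnn was renamed tf.nn.static_rnn); the dimensions are made up for illustration:

    import tensorflow as tf

    # Statically unrolled RNN: the number of time steps (200) is baked
    # into the graph as a Python list with one [batch_size, input_dim]
    # placeholder per step.
    batch_size, num_steps, input_dim, hidden_dim = 32, 200, 10, 64

    cell = tf.nn.rnn_cell.BasicRNNCell(hidden_dim)
    inputs = [tf.placeholder(tf.float32, [batch_size, input_dim])
              for _ in range(num_steps)]  # 200 separate placeholders
    outputs, state = tf.nn.static_rnn(cell, inputs, dtype=tf.float32)
    # The graph now contains 200 copies of the cell; feeding a
    # 201-step sequence would require rebuilding the graph.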

tf.nn.dynamic_rnn solves this. It uses a tf.while_loop to dynamically
construct the graph when it is executed. That means graph creation is
faster and you can feed in sequences of variable length.
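For comparison, a sketch of the dynamic version under the same assumed TensorFlow 1.x API: the input is a single 3-D tensor whose time dimension can change from batch to batch, and the per-example lengths are passed via sequence_length.

    import tensorflow as tf

    # Dynamically unrolled RNN: the time dimension is None, so the
    # same graph handles 200-step or 500-step batches. Unrolling
    # happens inside a tf.while_loop at run time.
    input_dim, hidden_dim = 10, 64

    cell = tf.nn.rnn_cell.BasicRNNCell(hidden_dim)
    inputs = tf.placeholder(tf.float32, [None, None, input_dim])  # [batch, time, features]
    seq_len = tf.placeholder(tf.int32, [None])  # true length of each example
    outputs, state = tf.nn.dynamic_rnn(cell, inputs,
                                       sequence_length=seq_len,
                                       dtype=tf.float32)
    # Padded steps beyond each example's sequence_length are skipped,
    # and the final state is the one at the last real step.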
