
Commit d101fdd

Fix Layer class name in docstring (#2118)

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>

1 parent: e921444

File tree

1 file changed: +2 −2 lines

tensorflow_addons/seq2seq/attention_wrapper.py

Lines changed: 2 additions & 2 deletions

@@ -1664,8 +1664,8 @@ def __init__(
       `get_initial_state` which does not match the batch size of
       `initial_cell_state`, proper behavior is not guaranteed.
     name: Name to use when creating ops.
-    attention_layer: A list of `tf.tf.keras.layers.Layer` instances or a
-      single `tf.tf.keras.layers.Layer` instance taking the context
+    attention_layer: A list of `tf.keras.layers.Layer` instances or a
+      single `tf.keras.layers.Layer` instance taking the context
       and cell output as inputs to generate attention at each time step.
       If None (default), use the context as attention at each time step.
       If attention_mechanism is a list, attention_layer must be a list of
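The corrected docstring describes a contract: `attention_layer` is a `tf.keras.layers.Layer` (or a list of them) that receives the cell output and attention context and produces the attention vector for each time step, with the context used directly when the layer is `None`. The following is a schematic sketch of that contract in plain Python, with no TensorFlow dependency; the function names and the toy dense layer are illustrative stand-ins, not the actual TF-Addons implementation.

```python
def make_dense_attention_layer(weights):
    """Toy stand-in for a Dense layer: y[i] = sum_j weights[i][j] * x[j]."""
    def layer(inputs):
        return [sum(w * x for w, x in zip(row, inputs)) for row in weights]
    return layer

def apply_attention(cell_output, context, attention_layer=None):
    """Mimic the documented behavior of the `attention_layer` argument.

    If `attention_layer` is None (the documented default), the context is
    used as the attention directly; otherwise the concatenation of the cell
    output and context is fed through the layer.
    """
    if attention_layer is None:
        return context
    return attention_layer(cell_output + context)  # list concatenation

cell_output = [1.0, 2.0]
context = [0.5, 0.5]

# Default: the context itself is the attention at this time step.
assert apply_attention(cell_output, context) == context

# With a layer: project the 4-dim [cell_output; context] concatenation.
layer = make_dense_attention_layer([[1, 0, 0, 0], [0, 1, 0, 0]])
print(apply_attention(cell_output, context, layer))  # [1.0, 2.0]
```

In the real API, the layer would be something like `tf.keras.layers.Dense(attention_layer_size)`, and passing a list of layers pairs one layer with each mechanism when `attention_mechanism` is a list.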
