# attention-is-all-you-need

Here are 197 public repositories matching this topic...

LARNN: a recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop over the cell state, just like any other RNN; a sketch of the idea follows this listing.

  • Updated Aug 20, 2018
  • Jupyter Notebook
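
The description above outlines the architecture but the listing carries no code, so here is a minimal sketch of the idea in PyTorch. It is an illustration under stated assumptions, not the LARNN repository's actual implementation: the class name `AttentiveLSTMCell`, the `window_size` parameter, the residual combination of the attended context with the cell state, and the use of `torch.nn.MultiheadAttention` are all illustrative choices.

```python
from collections import deque

import torch
import torch.nn as nn


class AttentiveLSTMCell(nn.Module):
    """Illustrative LARNN-style cell: an LSTM cell that refines its cell
    state with multi-head attention over a window of its past cell states."""

    def __init__(self, input_size, hidden_size, num_heads=4, window_size=8):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.past_cells = deque(maxlen=window_size)  # windowed memory of c_t

    def reset(self):
        # Clear the window between independent sequences.
        self.past_cells.clear()

    def forward(self, x, state):
        h, c = self.cell(x, state)
        # Detached copies keep the sketch from retaining graphs across steps.
        self.past_cells.append(c.detach())
        memory = torch.stack(list(self.past_cells), dim=1)  # (batch, window, hidden)
        # The fresh cell state queries its own recent history.
        attended, _ = self.attn(c.unsqueeze(1), memory, memory)
        c = c + attended.squeeze(1)  # residual combination, Transformer-style
        return h, c


# Used inside a loop over time steps, just like any other RNN cell.
cell = AttentiveLSTMCell(input_size=16, hidden_size=32)
x = torch.randn(4, 10, 16)  # (batch, time, features)
h, c = torch.zeros(4, 32), torch.zeros(4, 32)
cell.reset()
for t in range(x.size(1)):
    h, c = cell(x[:, t], (h, c))
```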
