sooftware/attentions · 489 stars · Python · Updated Mar 4, 2022
  PyTorch implementation of some attention mechanisms for deep learning researchers.
  Topics: pytorch, attention, multi-head-attention, location-sensitive-attension, dot-product-attention, location-aware-attention, additive-attention, relative-positional-encoding, relative-multi-head-attention
shawnhan108/Attention-LSTMs · 10 stars · Jupyter Notebook · Updated Aug 31, 2020
  A set of notebooks that explores the power of recurrent neural networks (RNNs), with a focus on LSTM, BiLSTM, seq2seq, and attention.
  Topics: machine-translation, keras, lstm, rnn, seq2seq, music-generation, attention-mechanism, lstm-neural-networks, keras-tensorflow, bidirectional-lstm, attention-model, encoder-decoder-model, recurrent-neural-network, additive-attention
mtanghu/LEAP · 4 stars · Jupyter Notebook · Updated Jun 18, 2023
  LEAP: Linear Explainable Attention in Parallel for causal language modeling, with O(1) path length and O(1) inference.
  Topics: deep-learning, parallel, transformers, pytorch, transformer, rnn, attention-mechanism, softmax, local-attention, dot-product-attention, additive-attention, linear-attention
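Several of the repositories above implement additive attention, the Bahdanau-style mechanism that scores a query against each key with a small feed-forward network, score(q, k) = vᵀ tanh(W_q q + W_k k), rather than a dot product. The sketch below is a minimal, generic PyTorch version of that idea; it is not taken from any of the listed repositories, and the class and parameter names (`AdditiveAttention`, `hidden_dim`, etc.) are illustrative choices, not their APIs.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdditiveAttention(nn.Module):
    """Additive (Bahdanau-style) attention: score(q, k) = v^T tanh(W_q q + W_k k).

    A generic sketch, not the implementation from any repository listed above.
    """

    def __init__(self, query_dim: int, key_dim: int, hidden_dim: int):
        super().__init__()
        self.w_q = nn.Linear(query_dim, hidden_dim, bias=False)  # projects the query
        self.w_k = nn.Linear(key_dim, hidden_dim, bias=False)    # projects each key
        self.v = nn.Linear(hidden_dim, 1, bias=False)            # collapses to a scalar score

    def forward(self, query: torch.Tensor, keys: torch.Tensor, values: torch.Tensor):
        # query:  (batch, query_dim)
        # keys:   (batch, seq_len, key_dim)
        # values: (batch, seq_len, value_dim)
        # Broadcast the projected query over the sequence dimension and score each position.
        scores = self.v(torch.tanh(self.w_q(query).unsqueeze(1) + self.w_k(keys)))
        scores = scores.squeeze(-1)                  # (batch, seq_len)
        weights = F.softmax(scores, dim=-1)          # attention distribution over positions
        # Weighted sum of values: (batch, 1, seq_len) x (batch, seq_len, value_dim).
        context = torch.bmm(weights.unsqueeze(1), values).squeeze(1)
        return context, weights


# Usage: attend over a sequence of 5 key/value vectors with one query per batch item.
attn = AdditiveAttention(query_dim=8, key_dim=8, hidden_dim=16)
q = torch.randn(2, 8)
k = torch.randn(2, 5, 8)
context, weights = attn(q, k, k)
print(context.shape, weights.shape)  # torch.Size([2, 8]) torch.Size([2, 5])
```

Because the score is produced by a learned nonlinearity instead of a dot product, query and key dimensions need not match, which is one reason this form appears in encoder-decoder seq2seq models like those in the notebook collections above.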