
Attention Mechanisms from Scratch

About

This repository contains implementations and resources related to attention mechanisms in Natural Language Processing (NLP). It serves as a guide for those looking to understand and implement attention models from the ground up. The content is based on the Coursera course "Attention Models in NLP" and pairs practical Jupyter Notebook examples with explanations of how attention works, a pivotal concept in modern deep learning architectures for NLP tasks. Whether you're a student, researcher, or NLP enthusiast, the material offers a hands-on introduction to this powerful mechanism.
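As a taste of what "from scratch" means here, below is a minimal scaled dot-product attention sketch in NumPy. The function and variable names are illustrative, not taken from the repository's notebooks:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # each row is a distribution over keys
    return weights @ V, weights         # output: weighted sum of the values

# Toy example: 2 queries attending over 3 key/value pairs.
Q = np.array([[1.0, 0.0], [0.0, 1.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.array([[1.0], [2.0], [3.0]])
out, w = scaled_dot_product_attention(Q, K, V)
```

The `1/sqrt(d_k)` scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into regions with vanishing gradients.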

Topics

  • NLP
  • Natural Language Processing
  • Transformers
  • Attention Mechanism
  • Transformer Architecture
