Swapped naive dot product attention for flash attention #24

Open
wants to merge 7 commits into base: main

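As context for the change the title describes, here is a minimal sketch of the swap, assuming the attention module is written in PyTorch 2.x; the function names, mask convention, and shapes below are illustrative and not taken from the repository.

```python
import math
import torch
import torch.nn.functional as F

def naive_attention(q, k, v, mask=None):
    # Naive dot-product attention: materializes the full
    # (seq_len x seq_len) score matrix before the softmax.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        # Illustrative convention: positions where mask == 0 are blocked.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

def flash_attention(q, k, v):
    # PyTorch's fused kernel; it dispatches to FlashAttention when the
    # inputs are fp16/bf16 CUDA tensors and no dense mask is passed.
    return F.scaled_dot_product_attention(q, k, v)
```

The fused path avoids materializing the score matrix, which is where the memory savings come from.
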
Commits on Mar 31, 2023

  1. incorporated fast attention into attention

    Matthew Smith committed Mar 31, 2023
    37b64d4

Commits on Apr 24, 2023

  1. a5a9419
  2. 412a1a3
  3. d4a62cc
  4. 62cedb9

Commits on May 9, 2023

  1. Masks are now optional, and not created. Fixes required to support FlashAttention (e.g. no mask, fp/bf16)
    mranzinger committed May 9, 2023
    29c6ead
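
    The commit message above points at the two constraints the flash path needs: no dense attention mask and fp16/bf16 inputs. A hedged usage sketch, assuming PyTorch 2.x on a CUDA device; the shapes and the backend-restriction context manager illustrate the constraints and are not code from this repository.

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes: (batch, heads, seq_len, head_dim).
q = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.bfloat16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Restrict dispatch to the flash backend; the call errors out if the
# inputs violate its requirements (e.g. an explicit mask or fp32 tensors),
# which is why the mask had to become optional and not built by default.
with torch.backends.cuda.sdp_kernel(enable_flash=True,
                                    enable_math=False,
                                    enable_mem_efficient=False):
    out = F.scaled_dot_product_attention(q, k, v)  # no attn_mask
```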

Commits on May 17, 2023

  1. Merge pull request #1 from mranzinger/efficient

    Oh I must have overlooked that
    usryokousha committed May 17, 2023
    dd69dcb