
add rope alibi to encoder #1687

Open
wants to merge 3 commits into master
Conversation

vince62s (Member)

No description provided.

LynxPDA commented May 22, 2024

@vince62s I updated CTranslate2 and the conversion ran without errors.
The only thing I changed was adding a "gated-gelu" entry to _SUPPORTED_ACTIVATIONS:

_SUPPORTED_ACTIVATIONS = {
    "gelu": common_spec.Activation.GELU,
    "fast_gelu": common_spec.Activation.GELUTanh,
    "relu": common_spec.Activation.RELU,
    "silu": common_spec.Activation.SWISH,
    "gated-gelu": common_spec.Activation.GELU,
}

Without this, it still gave the error:

- Option --pos_ffn_activation_fn gated-gelu is not supported (supported activations are: gelu, fast_gelu, relu, silu)
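For context, here is a minimal sketch of the kind of lookup that raises this error, assuming the _SUPPORTED_ACTIVATIONS dict shown above. The function name is hypothetical and the actual check in CTranslate2's OpenNMT-py converter may be structured differently:

# Hypothetical helper illustrating the activation lookup; the real converter
# code may differ in structure but performs an equivalent check.
def _get_activation(name):
    if name not in _SUPPORTED_ACTIVATIONS:
        raise ValueError(
            "Option --pos_ffn_activation_fn %s is not supported "
            "(supported activations are: %s)"
            % (name, ", ".join(_SUPPORTED_ACTIVATIONS))
        )
    return _SUPPORTED_ACTIVATIONS[name]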

vince62s (Member, Author) commented May 22, 2024

Yes, you're correct. Were you able to run inference in CT2 without any issue?

I am not merging yet because ALiBi requires additional changes in C
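For reference, a rough NumPy sketch of how an ALiBi bias can be built for a bidirectional encoder: head-dependent slopes times negative absolute distance, added to the attention logits. This is only an illustration of the technique, not the code used by this PR or by CTranslate2.

import numpy as np

def alibi_bias(num_heads, seq_len):
    # Head-specific slopes as in the ALiBi paper (assuming num_heads is a
    # power of two): a geometric sequence starting at 2^(-8/num_heads).
    start = 2.0 ** (-8.0 / num_heads)
    slopes = np.array([start ** (h + 1) for h in range(num_heads)])
    # Bidirectional (encoder) variant: penalize by absolute distance.
    positions = np.arange(seq_len)
    distances = np.abs(positions[None, :] - positions[:, None])
    # Shape (num_heads, seq_len, seq_len); added to the attention logits.
    return -slopes[:, None, None] * distances[None, :, :]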

LynxPDA commented May 23, 2024

So far I have tried inference only with the gated-gelu activation on CTranslate2 v3.20.0, with no issues. I plan to try inference with RoPE next month, after training the corresponding model.

lecoqnicolas left a comment

Hello,
I would like to insert a line between lines 8 and 9 to support the gated-gelu activation:
"gated-gelu": common_spec.Activation.GELU,
Also, gated-gelu does not appear in the "transformers.py" script. You might want to add it after line 30 and modify line 1308 to include gated-gelu.

lecoqnicolas commented

Forget the transformers.py script: the Gemma implementation in transformers uses GeGLU, but with a GELUTanh approximation (I just read an article about it), so there is no need to update it.
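For the distinction being discussed here, a small sketch of exact GELU versus its tanh approximation (what the spec above distinguishes as GELU and GELUTanh / fast_gelu), using the standard constants:

import math

def gelu_exact(x):
    # Exact GELU via the Gaussian CDF (erf form).
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh approximation (the "fast" GELU used by some implementations).
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))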
