How to use this model to generate text? #10
Hello. I love the idea of SGConv. Would it be possible to modify the code so it can generate sequences of tokens, as in a seq2seq model? Thank you.

We will try to release the implementation code in a few weeks. Essentially, you can refer to Transformer-XL and replace the attention layer with SGConv.

Thank you so much!
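The maintainer's suggestion (take a Transformer-XL-style decoder and swap the attention layer for SGConv) can be sketched roughly as below. This is a minimal, hypothetical PyTorch sketch, not the authors' released code: `CausalGlobalConv` is a simplified stand-in for SGConv (it learns one long depthwise kernel per channel rather than the paper's multi-scale kernel parameterization), and `ConvBlock` omits Transformer-XL's segment recurrence and relative positional scheme.

```python
import torch
import torch.nn as nn


class CausalGlobalConv(nn.Module):
    """Simplified stand-in for SGConv: a depthwise, sequence-length-long
    convolution applied causally, computed in O(n log n) via FFT."""

    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        # One learned kernel per channel, as long as the longest sequence.
        self.kernel = nn.Parameter(torch.randn(d_model, max_len) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        b, n, d = x.shape
        k = self.kernel[:, :n]  # truncate kernel to the current length
        # Zero-pad to 2n so the circular FFT convolution equals a
        # linear (and hence causal) convolution on the first n samples.
        x_f = torch.fft.rfft(x.transpose(1, 2), n=2 * n)
        k_f = torch.fft.rfft(k, n=2 * n)
        y = torch.fft.irfft(x_f * k_f, n=2 * n)[..., :n]
        return y.transpose(1, 2)


class ConvBlock(nn.Module):
    """Transformer-style block with self-attention replaced by the
    causal global convolution above; the MLP sublayer is unchanged."""

    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        self.conv = CausalGlobalConv(d_model, max_len)
        self.norm2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.conv(self.norm1(x))  # attention slot -> convolution
        x = x + self.mlp(self.norm2(x))
        return x
```

Because every sublayer here is causal, a decoder built from such blocks can generate text the usual autoregressive way: feed the tokens so far, take the output at the last position, sample the next token, and repeat.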