This repository has been archived by the owner on Nov 12, 2021. It is now read-only.

Why not use the first embedding of nn.Embedding? #48

Open
zhenyuhe00 opened this issue Nov 10, 2020 · 1 comment

Comments

@zhenyuhe00

Thanks for making your code public.
I have been wondering why it is b = self.embedding.weight[1:] in model.py rather than b = self.embedding.weight[:]?

@zhenyuhe00
Author


In pytorch_code, model.py, line 87.
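For context, a common reason for dropping row 0 in code like this is that the embedding table reserves index 0 as a padding slot (item IDs start at 1), so scoring a session against weight[1:] compares it only against real items. A minimal sketch of that pattern, with hypothetical sizes (n_items, hidden) and assuming padding index 0, which the repository itself would need to confirm:

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only.
n_items = 5   # vocabulary size, including the reserved padding slot at index 0
hidden = 4    # embedding dimension

# padding_idx=0 keeps row 0 as an all-zero padding vector.
embedding = nn.Embedding(n_items, hidden, padding_idx=0)

session_repr = torch.randn(2, hidden)       # two hypothetical session vectors
b = embedding.weight[1:]                    # drop the padding row, keep real items
scores = torch.matmul(session_repr, b.t())  # one score per real item

# Each session gets n_items - 1 scores: the padding slot is never a candidate.
assert scores.shape == (2, n_items - 1)
```

Using weight[:] instead would also produce a score for the padding slot, which is never a valid prediction target; slicing it off keeps the score vector aligned with real item IDs 1..n_items-1.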
