
TextCNN_Torch has a wrong comment #44

Open

jnakor opened this issue Jan 23, 2020 · 3 comments

Comments


jnakor commented Jan 23, 2020

def forward(self, X):
    embedded_chars = self.W[X]  # [batch_size, sequence_length, sequence_length]

I think the shape is [batch_size, sequence_length, embedding_size].
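
For reference, a minimal runnable sketch of that lookup with the corrected shape comment (vocab_size, embedding_size, and the tensor sizes below are illustrative, not the repo's actual values):

    import torch
    import torch.nn as nn

    vocab_size, embedding_size = 16, 4         # illustrative sizes
    W = nn.Parameter(torch.randn(vocab_size, embedding_size))

    X = torch.randint(0, vocab_size, (2, 3))   # [batch_size, sequence_length]
    embedded_chars = W[X]                      # [batch_size, sequence_length, embedding_size]
    print(embedded_chars.shape)                # torch.Size([2, 3, 4])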

@endeavor11

Yes, I think so.

@Yuhuishishishi
Contributor

Filed PR #49

@AgaigetS

Can somebody tell me why three conv layers are needed to convolve the word embedding matrix? I don't understand.
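
Not an authoritative answer, but in the standard TextCNN design each conv layer is just a separate filter that slides over windows of consecutive word vectors; each filter's output is max-pooled over the sequence, and the pooled features are concatenated before the classifier. A minimal sketch, with illustrative filter sizes and dimensions (the tutorial's actual hyperparameters may differ):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    batch_size, sequence_length, embedding_size = 2, 5, 4   # illustrative sizes
    num_filters, filter_sizes = 3, [2, 3, 4]                # hypothetical hyperparameters

    embedded_chars = torch.randn(batch_size, sequence_length, embedding_size)
    x = embedded_chars.unsqueeze(1)          # [batch, 1, seq_len, emb]: a 1-channel "image"

    pooled = []
    for size in filter_sizes:
        # each conv filter spans `size` consecutive word vectors (an n-gram window)
        conv = nn.Conv2d(1, num_filters, (size, embedding_size))
        h = F.relu(conv(x))                  # [batch, num_filters, seq_len - size + 1, 1]
        # max-pool over the sequence keeps the strongest n-gram response per filter
        p = F.max_pool2d(h, (sequence_length - size + 1, 1))  # [batch, num_filters, 1, 1]
        pooled.append(p)

    features = torch.cat(pooled, dim=1).view(batch_size, -1)  # [batch, num_filters * len(filter_sizes)]
    print(features.shape)                    # torch.Size([2, 9])

Using several filters (possibly with several window sizes) lets the model pick up different n-gram patterns, and max-pooling gives a fixed-size feature vector regardless of sentence length.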
