
Pre-training of Seq2edits for grammar correction task #1896

Open
cingtiye opened this issue Aug 25, 2021 · 0 comments


@cingtiye

Description

In the paper Seq2Edits: Sequence Transduction Using Span-level Edit Operations, all tasks are trained for 1M iterations on 170M sentences extracted from English Wikipedia revisions and 176M sentences from English Wikipedia round-trip translated via German. My first question: during pre-training, is the tag and position information extracted by aligning the source and target sentences? My second question concerns the start and end positions: the code uses target-side positions rather than source-side positions. Which positions should be used?
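
To make the question concrete, here is a minimal sketch of the kind of extraction I have in mind, using Python's difflib as a stand-in for whatever alignment procedure is used in the actual pipeline. The tag names and the tuple layout are my own assumptions, and the spans below are source-side indices, which is exactly the point I am unsure about:

```python
# Sketch only: difflib is a stand-in for the real alignment step, and the
# (tag, src_start, src_end, replacement) layout is my assumption, not the
# paper's format. Spans are source-side indices.
import difflib

def extract_edits(source_tokens, target_tokens):
    """Return a list of (tag, src_start, src_end, replacement) tuples."""
    matcher = difflib.SequenceMatcher(a=source_tokens, b=target_tokens)
    edits = []
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        # 'equal' spans keep the source tokens unchanged; the other tags
        # ('replace', 'insert', 'delete') substitute the corresponding
        # target tokens (empty for deletions).
        replacement = target_tokens[j1:j2] if tag != "equal" else []
        edits.append((tag, i1, i2, replacement))
    return edits

src = "she go to school yesterday".split()
tgt = "she went to school yesterday".split()
print(extract_edits(src, tgt))
# [('equal', 0, 1, []), ('replace', 1, 2, ['went']), ('equal', 2, 5, [])]
```

If the intended positions are target-side instead, I am not sure how this extraction should change.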

Thanks!
