Title:
|
USING NEURAL MACHINE TRANSLATION
FOR DETECTING AND CORRECTING GRAMMATICAL
ERRORS |
Author(s):
|
Dongqiang Yang, Xiaodong Sun and Pikun Wang |
ISBN:
|
978-989-8704-34-4 |
Editors:
|
Pedro Isaías and Hans Weghorn |
Year:
|
2021 |
Edition:
|
Single |
Type:
|
Full |
First Page:
|
11 |
Last Page:
|
18 |
Language:
|
English |
Paper Abstract:
|
Computer-assisted language learning can help ESL/EFL learners improve their writing in multiple ways, such as spell
checking, grammar checking, and style checking. Owing to the complexity of the various linguistic errors intertwined in a
sentence, automatically detecting and correcting grammatical errors remains a challenging task. Unlike previous studies
that applied pattern matching or statistical language models to this task, we design a Transformer-based neural sequence
transduction model to detect and correct grammatical errors. Neural language models are often data-hungry, and their
performance is data-dependent. Given the limited size of standard learner corpora and their enormous annotation cost, we
employ another Transformer-based encoder-decoder to back-translate an error-free sentence into an erroneous one,
automating data augmentation for training neural models. We first design artificial rules to produce a noisy learner
dataset for training the back-translation model. The model can then generate more synthesized learner data for training
the Transformer-based correction model. In addition, we propose an iterative training scheme that unifies the processes
of error generation and correction. Our state-of-the-art model reaches an F0.5, the weighted harmonic mean of precision
and recall, of 64.3% on the CoNLL-2014 shared task, surpassing all participating systems. |
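The abstract does not spell out the artificial noising rules used to seed the back-translation model, so the sketch below is purely illustrative: the rule set, error types, and probabilities are assumptions, not the authors' actual rules. It shows the general shape of rule-based corruption of error-free sentences into learner-like errors.

import random

# Illustrative corruption rules only; the paper's actual rules are not given
# in the abstract. Each rule rewrites a clean token sequence into a noisy one.
ARTICLES = {"a", "an", "the"}
PREPOSITIONS = {"in", "on", "at", "of", "for", "to", "with"}

def corrupt(tokens, p_drop=0.05, p_repl=0.05, p_swap=0.03, seed=None):
    """Corrupt an error-free sentence to seed back-translation training."""
    rng = random.Random(seed)
    noisy, i = [], 0
    while i < len(tokens):
        tok = tokens[i]
        low = tok.lower()
        # Omission errors: drop an article or preposition.
        if low in ARTICLES | PREPOSITIONS and rng.random() < p_drop:
            i += 1
            continue
        # Selection errors: replace a preposition with another one.
        if low in PREPOSITIONS and rng.random() < p_repl:
            tok = rng.choice(sorted(PREPOSITIONS - {low}))
        # Word-order errors: swap two adjacent tokens.
        if i + 1 < len(tokens) and rng.random() < p_swap:
            noisy.extend([tokens[i + 1], tok])
            i += 2
            continue
        noisy.append(tok)
        i += 1
    return noisy

if __name__ == "__main__":
    clean = "She is one of the best students in the class".split()
    print(" ".join(corrupt(clean, seed=7)))

Pairs of (corrupted, clean) sentences produced this way would train the error-generation model, which can then synthesize learner data at a far larger scale than manual annotation allows.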
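The iterative training scheme is likewise described only at a high level. The following is a minimal conceptual sketch of one plausible reading, in which the error generator and the corrector alternately produce each other's training pairs; train_seq2seq is a hypothetical stand-in stub, not a real Transformer trainer or any API from the paper.

# Conceptual sketch of the iterative generation/correction loop; the real
# system trains Transformer seq2seq models where this stub merely memorizes
# pairs.

def train_seq2seq(pairs):
    """Stand-in trainer: a real implementation would fit a Transformer."""
    memory = dict(pairs)
    return lambda src: memory.get(src, src)

def iterative_training(clean_corpus, seed_pairs, rounds=3):
    # Round 0: the generator learns clean -> noisy from rule-corrupted seeds.
    generator = train_seq2seq([(c, n) for n, c in seed_pairs])
    corrector = None
    for _ in range(rounds):
        # The generator back-translates clean text into synthetic errors.
        synthetic = [(generator(c), c) for c in clean_corpus]
        # The corrector learns noisy -> clean on the synthetic pairs.
        corrector = train_seq2seq(synthetic)
        # The generator is refreshed on the same pairs, reversed.
        generator = train_seq2seq([(c, n) for n, c in synthetic])
    return corrector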
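For reference, F0.5 is the standard weighted F-measure with beta = 0.5, weighting precision twice as heavily as recall; with precision P and recall R:

$$F_{0.5} = \frac{(1 + 0.5^2)\,P R}{0.5^2\,P + R}$$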