RISS (Research Information Sharing Service)

      Self-feeding Semi-supervised Training Method for Grammatical Error Correction

      https://www.riss.kr/link?id=T16757297

      • Author
      • Publication

        Pohang : POSTECH (포항공과대학교), Department of Computer Science and Engineering, 2022

      • Thesis Information

        Thesis (Ph.D.) -- POSTECH, Department of Computer Science and Engineering, February 2023

      • Year of Publication

        2022

      • Language

        English

      • Place of Publication (Country/City)

        Gyeongsangbuk-do

      • Physical Description

        26 cm

      • General Notes

        Advisor: 이근배 (Geunbae Lee)

      • UCI Identifier

        I804:47020-200000660074

      • Holding Institution
        • 포항공과대학교 박태준학술정보관 (POSTECH Tae-Joon Park Library)

      Additional Information

      Multilingual Abstract

      Grammatical error correction (GEC) has been successful with deep and complex neural machine translation models, but published annotated datasets for training such large models are scarce. In this dissertation, I propose a novel self-feeding training method that generates incorrect sentences from correct sentences. The proposed training method can generate appropriate wrong sentences from unlabeled sentences, using a data generation model trained as an autoencoder. It can also add artificial noise to correct sentences to automatically generate noisy sentences. I show that GEC models trained with the self-feeding training method are successful without extra annotated data or deeper neural network-based models, achieving an F0.5 score of 0.5982 on the CoNLL-2014 Shared Task test data with a transformer model. The results also show that fully unlabeled training is possible for data-scarce domains and languages.
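      To make the abstract's two technical points concrete, the sketch below shows (i) a toy rule-based noise injector that turns a correct sentence into a corrupted one, standing in for the dissertation's autoencoder-based data generation model, and (ii) the F0.5 metric reported on the CoNLL-2014 test data. This is a minimal, hypothetical Python sketch under those assumptions; the function names (inject_noise, f_beta) and the noise rules are illustrative, not the author's implementation, and the candidate selection and semi-supervised training steps are not reproduced.

      # Hypothetical sketch only -- not the dissertation's code.
      import random

      def inject_noise(sentence, p=0.1, seed=0):
          """Randomly drop, duplicate, or swap adjacent tokens, each with probability p."""
          rng = random.Random(seed)
          tokens = sentence.split()
          out, i = [], 0
          while i < len(tokens):
              r = rng.random()
              if r < p:                                # drop this token
                  i += 1
              elif r < 2 * p:                          # duplicate this token
                  out += [tokens[i], tokens[i]]
                  i += 1
              elif r < 3 * p and i + 1 < len(tokens):  # swap with the next token
                  out += [tokens[i + 1], tokens[i]]
                  i += 2
              else:                                    # keep the token unchanged
                  out.append(tokens[i])
                  i += 1
          return " ".join(out)

      def f_beta(precision, recall, beta=0.5):
          """F_beta score; beta = 0.5 weights precision twice as heavily as recall."""
          if precision == 0 and recall == 0:
              return 0.0
          b2 = beta * beta
          return (1 + b2) * precision * recall / (b2 * precision + recall)

      if __name__ == "__main__":
          correct = "The model corrects grammatical errors in learner sentences ."
          print(inject_noise(correct))       # a corrupted copy usable as a training source
          print(round(f_beta(0.7, 0.4), 4))  # 0.6087 -- illustrative numbers, not results

      Such synthetic (noisy, correct) pairs are what allow a GEC model to be trained like a translation model even when no human-annotated corrections are available, which is the point of the fully unlabeled setting mentioned above.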

      Table of Contents

      • I. Introduction
      • II. Related Studies
      • III. Self-Feeding Training
      • 3.1 Model-based data generation
      • 3.2 Candidate selection process
      • 3.3 Unlabeled denoising training
      • 3.4 Self-feeding training
      • IV. Semi-Supervised Training of Self-Feeding Model
      • 4.1 Mixed training
      • 4.2 Interleaved training
      • 4.3 Fine-tuned
      • V. Experiments
      • 5.1 Dataset
      • 5.1.1 NUCLE
      • 5.1.2 CLC FCE
      • 5.1.3 W&I
      • 5.1.4 Lang-8
      • 5.1.5 Data preprocessing
      • 5.1.6 Unlabeled data
      • 5.1.7 Evaluation datasets
      • 5.2 Grammatical error correction models
      • 5.3 Data Generation Methods
      • 5.4 Model-based data generation
      • 5.5 Self-feeding training
      • 5.6 Evaluation
      • 5.7 Results on the self-feeding method
      • 5.8 More experiments
      • 5.9 Fully unsupervised training
      • VI. Conclusions
      • VII. Appendices
      • 7.1 Data Generation Examples
      • 7.2 CoNLL result comparison
      • Summary (in Korean)
      • References
