RISS Academic Research Information Service

KCI-listed, SCOPUS-indexed

      Improving Abstractive Summarization by Training Masked Out-of-Vocabulary Words

      https://www.riss.kr/link?id=A108177657

Additional Information

Multilingual Abstract

Text summarization is the task of producing a shorter version of a long document while accurately preserving the main contents of the original text. Abstractive summarization generates novel words and phrases using a language generation method through text transformation and prior-embedded word information. However, newly coined words or out-of-vocabulary words decrease the performance of automatic summarization because they are not pre-trained in the machine learning process. In this study, we demonstrated an improvement in summarization quality through the contextualized embedding of BERT with out-of-vocabulary masking. In addition, by explicitly providing precise pointing and an optional copy instruction along with the BERT embedding, we achieved higher accuracy than the baseline model. The recall-based word-generation metric ROUGE-1 score was 55.11, and the word-order-based ROUGE-L score was 39.65.
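The abstract describes masking out-of-vocabulary (OOV) words so that BERT's contextualized embeddings stand in for them, with pointing and an optional copy instruction recovering the original surface form during decoding. The sketch below illustrates only the masking-and-encoding step and is an assumption-laden illustration rather than the authors' implementation: it relies on the Hugging Face transformers and torch packages, the public bert-base-uncased checkpoint, and a toy generation vocabulary and example sentence invented for this example.

    import torch
    from transformers import BertModel, BertTokenizer

    # Hypothetical setup: the paper's exact checkpoint and vocabulary are not given here.
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")
    model.eval()

    # Toy generation vocabulary; any source word outside it is treated as OOV.
    generation_vocab = {"the", "new", "method", "improves", "summarization", "quality"}

    source_words = "the zxqvian method improves summarization quality".split()

    # Replace each OOV word with [MASK] so BERT yields a contextualized vector
    # at that position instead of an untrained unknown-word embedding.
    masked_words = [w if w in generation_vocab else tokenizer.mask_token
                    for w in source_words]

    inputs = tokenizer(" ".join(masked_words), return_tensors="pt")
    with torch.no_grad():
        hidden_states = model(**inputs).last_hidden_state  # shape: (1, seq_len, hidden_size)

    # The vector at each [MASK] position can then feed a pointer/copy decision that
    # copies the original OOV word ("zxqvian") into the summary during decoding.
    print(hidden_states.shape)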

References

      1 J. Howard, "Universal language model fine-tuning for text classification" 328-339, 2018

      2 M. Allahyari, "Text summarization techniques: a brief survey"

      3 G. Huang, "Snapshot ensembles: train 1, get m for free"

      4 I. Sutskever, "Sequence to sequence learning with neural networks" 27 : 3104-3112, 2014

      5 S. Xu, "Self-attention guided copy mechanism for abstractive summarization" 1355-1362, 2020

6 M. Yasunaga, "ScisummNet: a large annotated corpus and content-impact models for scientific paper summarization with citation networks" 7386-7393, 2019

      7 I. Loshchilov, "SGDR: stochastic gradient descent with warm restarts"

      8 K. Ganesan, "Rouge 2.0: updated and improved measures for evaluation of summarization tasks"

      9 S. Narayan, "Ranking sentences for extractive summarization with reinforcement learning"

      10 C. Gulcehre, "Pointing the unknown words" 140-149, 2016

      11 J. Cheng, "Neural summarization by extracting sentences and words" 484-494, 2016

      12 D. Bahdanau, "Neural machine translation by jointly learning to align and translate"

13 T. Shi, "Neural abstractive text summarization with sequence-to-sequence models" Association for Computing Machinery (ACM) 2 (2): 1-37, 2021

      14 J. Gu, "Incorporating copying mechanism in sequence-to-sequence learning" 1631-1640, 2016

      15 L. N. Smith, "Cyclical learning rates for training neural networks" 464-472, 2017

      16 W. Wang, "Concept pointer network for abstractive summarization" 3076-3085, 2019

      17 G. Rossiello, "Centroid-based text summarization through compositionality of word embeddings" 12-21, 2017

      18 J. Devlin, "BERT: pre-training of deep bidirectional transformers for language understanding"

19 T. Lee ; S. Kang, "Document summarization using BERT embedding and a selective OOV copy method" Korean Institute of Information Scientists and Engineers (KIISE) 47 (47): 36-44, 2020

      20 M. Hu, "Attention-guided answer distillation for machine reading comprehension"

      21 A. Vaswani, "Attention is all you need" 30 : 5998-6008, 2017

22 R. Nallapati, "Abstractive text summarization using sequence-to-sequence RNNs and beyond"

      23 Y. Dong, "A survey on neural network-based summarization methods"

      24 N. Nazari, "A survey on automatic text summarization" 7 (7): 121-135, 2019

      25 S. Erera, "A summarization system for scientific documents" 211-216, 2019

26 K. Goyal, "A continuous relaxation of beam search for end-to-end training of neural sequence models" 3045-3052, 2018

Journal History

Date | Event | Details | Listing status
2023 | Evaluation scheduled | Eligible for overseas DB journal evaluation (overseas-indexed journal evaluation) | -
2020-01-01 | Evaluation | KCI-listed journal status maintained (overseas-indexed journal evaluation) | KCI-listed
2012-01-01 | Evaluation | Selected as KCI-listed journal (2nd-round candidate evaluation) | KCI-listed
2011-01-01 | Evaluation | Passed 1st-round candidate review (1st-round candidate evaluation) | KCI listing candidate
2009-01-01 | Evaluation | Selected as listing-candidate journal (new evaluation) | KCI listing candidate

Journal Citation Information

Base year: 2016
WOS-KCI combined IF (2-year): 0.09
KCI IF (2-year): 0.09
KCI IF (3-year): 0.09
KCI IF (4-year): 0.07
KCI IF (5-year): 0.06
Centrality index (3-year): 0.254
Immediacy index: 0.59
