RISS Academic Research Information Service

KCI-registered journal article

      Investigating syntactic transfer in second language learning of neural language models

      https://www.riss.kr/link?id=A108911903

Additional Information

Multilingual Abstract

Second language acquisition (SLA) research has extensively delved into cross-linguistic transfer, examining the impact of the linguistic structure of a native language on the acquisition of a second language. Such transfer effects can be either positive or negative, with negative transfer impeding acquisition. In this paper, we employ transfer learning as a methodology for analyzing how neural language models encode grammatical structure. This approach involves pre-training a Transformer-based language model, BabyBERTa, on Korean as the first language (L1), and then fine-tuning the model on English as a second language (L2). We evaluate the resulting model with the BLiMP test suite (Warstadt, 2020), a widely used benchmark for measuring the syntactic ability of neural language models. This allows us to provide insight into how neural language models represent abstract syntactic structures of English while incorporating the structural inductive biases acquired from Korean.
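
The evaluation step described above can be made concrete with a small sketch. BLiMP consists of minimal pairs, and a masked language model is commonly scored by pseudo-log-likelihood: an item counts as correct when the grammatical sentence receives a higher score than the ungrammatical one. The sketch below is illustrative, not the paper's exact procedure; the model path is a placeholder standing in for a RoBERTa-style checkpoint pre-trained on Korean (L1) and fine-tuned on English (L2), and the example pair is a generic agreement contrast rather than an actual BLiMP item.

    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    # Placeholder path: assumed to hold the L1-Korean, L2-English checkpoint.
    MODEL_PATH = "./l1-korean-l2-english"

    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
    model = AutoModelForMaskedLM.from_pretrained(MODEL_PATH)
    model.eval()

    def pseudo_log_likelihood(sentence: str) -> float:
        """Sum log P(token | rest of sentence), masking one token at a time."""
        input_ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
        total = 0.0
        for i in range(1, input_ids.size(0) - 1):  # skip <s> and </s>
            masked = input_ids.clone().unsqueeze(0)
            masked[0, i] = tokenizer.mask_token_id
            with torch.no_grad():
                logits = model(input_ids=masked).logits
            total += torch.log_softmax(logits[0, i], dim=-1)[input_ids[i]].item()
        return total

    # Minimal pair: the model is counted as correct when the grammatical
    # sentence gets the higher pseudo-log-likelihood.
    good = "The authors were writing a book."
    bad = "The authors was writing a book."
    print(pseudo_log_likelihood(good) > pseudo_log_likelihood(bad))

Accuracy over a BLiMP paradigm is then the fraction of pairs for which this comparison holds; comparing that accuracy against an English-only baseline is what surfaces the transfer effects of the Korean pre-training.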

References

1 Conneau, A., "Word translation without parallel data"

2 Ettinger, A., "What BERT is not: Lessons from a new suite of psycholinguistic diagnostics for language models" 8: 34-48, 2020

3 Derwing, T. M., "Teaching native speakers to listen to foreign-accented speech" 23(23): 245-259, 2002

4 Yadavalli, A., "SLABERT talk pretty one day: Modeling second language acquisition with BERT"

5 Liu, Y., "RoBERTa: A robustly optimized BERT pretraining approach"

6 Berzak, Y., "Reconstructing native language typology from foreign language usage"

7 Schwab, J., "On the acquisition of polarity items: 11- to 12-year-olds' comprehension of German NPIs and PPIs" 50(50): 1487-1509, 2021

8 Van der Wal, S., "Negative polarity items and negation: Tandem acquisition" Rijksuniversiteit Groningen, 1996

9 Kassner, N., "Negated and misprimed probes for pretrained language models: Birds can talk, but cannot fly"

10 Tieu, L., "Logic and grammar in child language: How children acquire the semantics of polarity sensitivity" University of Connecticut, 2013

11 Papadimitriou, I., "Learning music helps you read: Using transfer to study linguistic structure in language models"

12 Sprouse, J., "Experimental syntax and island effects" Cambridge University Press, 1-20, 2013

13 Dulay, H. C., "Errors and strategies in child second language acquisition" 8(8): 129-136, 1974

14 Wu, S., "Emerging cross-lingual structure in pretrained language models"

15 Jarvis, S., "Crosslinguistic influence in language and cognition" Routledge, 2008

16 Ringbom, H., "Cross-linguistic similarity in foreign language learning (Vol. 21)" Multilingual Matters, 2007

17 Wu, S., "Beto, bentz, becas: The surprising cross-lingual effectiveness of BERT"

18 Huebner, P. A., "BabyBERTa: Learning more grammar with small-scale child-directed language" Association for Computational Linguistics, 624-646, 2021

19 Warstadt, A., "BLiMP: The benchmark of linguistic minimal pairs for English" 8: 377-392, 2020

20 Ruder, S., "A survey of cross-lingual word embedding models" 65: 569-631, 2019

21 Artetxe, M., "A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings"
