RISS Academic Research Information Service

      SCOPUS / KCI-indexed

      Cross-Domain Text Sentiment Classification Method Based on the CNN-BiLSTM-TE Model


      https://www.riss.kr/link?id=A107844629


      Additional Information

      Multilingual Abstract

      To address the problems of low precision, insufficient feature extraction, and poor contextual modeling in existing text sentiment analysis methods, a hybrid CNN-BiLSTM-TE (convolutional neural network, bidirectional long short-term memory, and topic extraction) model was proposed. First, Chinese text data were converted into vectors by transfer learning with Word2Vec. Second, local features were extracted by the CNN. Then, contextual information was extracted by the BiLSTM network and the sentiment polarity was obtained with a softmax layer. Finally, topics were extracted using term frequency-inverse document frequency (TF-IDF) and K-means. Compared with the CNN, BiLSTM, and gated recurrent unit (GRU) models, the CNN-BiLSTM-TE model's F1-score was higher by 0.0147, 0.006, and 0.0052, respectively; compared with the CNN-LSTM, LSTM-CNN, and BiLSTM-CNN models, it was higher by 0.0071, 0.0038, and 0.0049, respectively. Experimental results showed that the CNN-BiLSTM-TE model effectively improves these indicators in application. Lastly, scalability was verified on a takeaway-review dataset, which is of great value in practical applications.
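      The pipeline described above can be illustrated with a short sketch. The Python code below is a minimal, hypothetical rendering of the stated stages (pre-trained Word2Vec-style embeddings, a CNN for local features, a BiLSTM with softmax for polarity, and TF-IDF with K-means for topic extraction); all layer sizes, vocabulary limits, and library choices (Keras, scikit-learn) are illustrative assumptions, not the authors' configuration.

      # Minimal sketch of a CNN-BiLSTM sentiment classifier plus a TF-IDF/K-means
      # topic-extraction step. Hyperparameters are placeholders, not the paper's values.
      import numpy as np
      from tensorflow.keras import layers, models
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.cluster import KMeans

      VOCAB_SIZE, EMBED_DIM, NUM_CLASSES = 20000, 300, 2

      def build_cnn_bilstm():
          """CNN for local n-gram features, BiLSTM for context, softmax for polarity."""
          model = models.Sequential([
              # Embedding layer; in the described setup this would be initialised
              # from pre-trained Word2Vec vectors (transfer learning).
              layers.Embedding(VOCAB_SIZE, EMBED_DIM),
              layers.Conv1D(128, kernel_size=3, activation="relu"),   # local features
              layers.MaxPooling1D(pool_size=2),
              layers.Bidirectional(layers.LSTM(64)),                  # contextual features
              layers.Dropout(0.5),
              layers.Dense(NUM_CLASSES, activation="softmax"),        # sentiment polarity
          ])
          model.compile(optimizer="adam",
                        loss="sparse_categorical_crossentropy",
                        metrics=["accuracy"])
          return model

      def extract_topics(documents, n_topics=5, top_terms=10):
          """Topic extraction (TE): TF-IDF weighting followed by K-means clustering."""
          tfidf = TfidfVectorizer(max_features=5000)
          X = tfidf.fit_transform(documents)
          km = KMeans(n_clusters=n_topics, n_init=10, random_state=0).fit(X)
          terms = np.array(tfidf.get_feature_names_out())
          # Report each cluster's highest-weighted terms as its "topic".
          order = km.cluster_centers_.argsort(axis=1)[:, ::-1]
          return [terms[order[i, :top_terms]].tolist() for i in range(n_topics)]

      Placing the Conv1D/pooling stage before the BiLSTM mirrors the order stated in the abstract: local n-gram features are extracted first and then passed on for contextual modeling.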


      References

      1 G. A. Miller, "WordNet: a lexical database for English" 38 (38): 39-41, 1995

      2 J. Lu, "Transfer learning using computational intelligence: a survey" 80: 14-23, 2015

      3 B. Pang, "Thumbs up? Sentiment classification using machine learning techniques" 79-86, 2002

      4 H. Zhao, "Text sentiment analysis based on serial hybrid model of bi-directional long short-term memory and convolutional neural network" 40 (40): 16-20, 2020

      5 Y. Zhang, "Study of sentiment classification for Chinese microblog based on recurrent neural network" 25 (25): 601-607, 2016

      6 J. Li, "Research on product feature extraction and sentiment classification of short online review based on deep learning" 41 (41): 143-148, 2018

      7 J. Liang, "Polarity shifting and LSTM based recursive networks for sentiment analysis" 29 (29): 152-159, 2015

      8 S. F. Zeng, "New method of text representation model based on neural network" 38 (38): 86-98, 2017

      9 L. W. Ku, "Mining opinions from the web: beyond relevance retrieval" 58 (58): 1838-1850, 2007

      10 S. Hochreiter, "Long short-term memory" 9 (9): 1735-1780, 1997

      11 J. Qiu, "Leveraging sentiment analysis at the aspects level to predict ratings of reviews" 451 : 295-309, 2018

      12 G. E. Hinton, "Learning distributed representations of concepts" 1986

      13 G. E. Hinton, "Improving neural networks by preventing co-adaptation of feature detectors"

      14 A. Graves, "Hybrid speech recognition with deep bidirectional LSTM" 273-278, 2013

      15 J. F. Xu, "Hybrid algorithm framework for sentiment classification of Chinese based on semantic comprehension and machine learning" 42 (42): 61-66, 2015

      16 P. D. Turney, "From frequency to meaning: vector space models of semantics" 37: 141-188, 2010

      17 T. Mikolov, "Efficient estimation of word representations in vector space"

      18 Y. Kim, "Convolutional neural networks for sentence classification" 1746-1751, 2014

      19 X. D. Guo, "Consumer reviews sentiment analysis based on CNN-BiLSTM" 40 : 653-663, 2020

      20 A. Joshi, "C-Feel-It: a sentiment analyzer for microblogs" 127-132, 2011

      21 S. P. Zhai, "Bilingual text sentiment analysis based on attention mechanism Bi-LSTM" 36 (36): 251-255, 2019

      22 S. Tan, "An empirical study of sentiment analysis for Chinese documents" 34 (34): 2622-2629, 2008

      23 H. Tang, "A survey on sentiment detection of reviews" 36 (36): 10760-10773, 2009

      24 Y. Bengio, "A neural probabilistic language model" 3 : 1137-1155, 2003

      25 E. Boiy, "A machine learning approach to sentiment analysis in multilingual web texts" 12 (12): 526-558, 2009

      26 Y. X. He, "A deep learning model enhanced with emotion semantics for microblog sentiment analysis" 40 (40): 773-790, 2017

      27 N. Kalchbrenner, "A convolutional neural network for modelling sentences" 655-665, 2014

      Journal History

      Date         Event                  Details                                                           Indexing status
      2023         Evaluation scheduled   Subject to overseas-DB journal evaluation (overseas-indexed journal review)
      2020-01-01   Evaluation             Indexed journal maintained (overseas-indexed journal review)      KCI-indexed
      2012-01-01   Evaluation             Selected as indexed journal (indexing candidate, round 2)         KCI-indexed
      2011-01-01   Evaluation             Passed indexing candidate review, round 1                         KCI candidate
      2009-01-01   Evaluation             Selected as candidate journal (new evaluation)                    KCI candidate

      Journal Citation Metrics

      Base year: 2016
      WOS-KCI combined IF (2-year): 0.09
      KCIF (2-year): 0.09
      KCIF (3-year): 0.09
      KCIF (4-year): 0.07
      KCIF (5-year): 0.06
      Centrality index (3-year): 0.254
      Immediacy index: 0.59
