https://www.riss.kr/link?id=A107735909
Year: 2017
Indexing: SCOPUS, SCIE
Type: Academic journal
Pages: 92-103 (12 pages)

Multilingual Abstract
To address class imbalance in data, we propose a new weight adjustment factor that is applied to a weighted support vector machine (SVM) as a weak learner of the AdaBoost algorithm. Different factor scores are computed by categorizing instances based on the SVM margin and are assigned to related instances. The SVM margin is used to define borderline and noisy instances, and the factor scores are assigned to only borderline instances and positive noise. The adjustment factor is then employed as a multiplier to the instance weight in the AdaBoost algorithm when learning a weighted SVM. Using 10 real class-imbalanced datasets, we compare the proposed method to a standard SVM and other SVMs combined with various sampling and boosting methods. Numerical experiments show that the proposed method outperforms existing approaches in terms of F-measure and area under the receiver operating characteristic curve, which means that the proposed method is useful for relaxing the class-imbalance problem by addressing well-known degradation issues such as overlap, small disjunct, and data shift problems. (C) 2016 Elsevier Inc. All rights reserved.
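
The abstract describes a margin-based weight adjustment inside an AdaBoost loop that uses a weighted SVM as the weak learner. The snippet below is a minimal sketch of that idea, not the authors' implementation: it assumes a linear-kernel scikit-learn SVC, illustrative margin thresholds at ±1 for "borderline" and "positive noise" instances, and a hypothetical `boost_factor` multiplier, none of which are specified in the abstract.

```python
# Hedged sketch: AdaBoost with a weighted SVM weak learner whose instance
# weights are scaled by a margin-based adjustment factor before training.
# Parameter names and thresholds are illustrative assumptions, not the paper's.
import numpy as np
from sklearn.svm import SVC

def margin_adjusted_adaboost(X, y, n_rounds=10, boost_factor=2.0):
    """y in {-1, +1}; +1 is taken to be the minority (positive) class."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # AdaBoost instance weights
    learners, alphas = [], []

    for _ in range(n_rounds):
        # Preliminary weighted SVM to obtain the functional margin y * f(x).
        clf = SVC(kernel="linear")
        clf.fit(X, y, sample_weight=w)
        margin = y * clf.decision_function(X)

        # Adjustment factor: up-weight borderline instances (inside the margin)
        # and positive instances on the wrong side of the margin ("positive noise").
        factor = np.ones(n)
        factor[np.abs(margin) < 1.0] = boost_factor
        factor[(margin < -1.0) & (y == 1)] = boost_factor

        # Weak learner trained with factor-adjusted instance weights.
        clf = SVC(kernel="linear")
        clf.fit(X, y, sample_weight=w * factor)
        pred = clf.predict(X)

        # Standard AdaBoost error, learner coefficient, and weight update.
        err = np.clip(np.sum(w * (pred != y)) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()

        learners.append(clf)
        alphas.append(alpha)

    def predict(X_new):
        score = sum(a * c.decision_function(X_new) for a, c in zip(alphas, learners))
        return np.sign(score)

    return predict
```

The key design choice this sketch tries to reflect is that the adjustment factor only multiplies the AdaBoost instance weights passed to the weak learner; the boosting weight update itself stays standard, so the factor steers each SVM toward the minority-class boundary without altering the ensemble's error-based reweighting.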