Multiclass Least Squares Twin Support Vector Machine for Pattern Classification
Divya Tomar, Sonali Agarwal Security Engineering Research Support Center (SERSC) 2015 International Journal of Database Theory and Application Vol.8 No.6
This paper proposes a Multiclass Least Squares Twin Support Vector Machine (MLSTSVM) classifier for multi-class classification problems. The formulation of MLSTSVM is obtained by extending the formulation of the recently proposed binary Least Squares Twin Support Vector Machine (LSTSVM) classifier. For an M-class classification problem, the proposed classifier seeks M nonparallel hyperplanes, one for each class, by solving M systems of linear equations. A regularization term is added to improve the generalization ability. MLSTSVM works well for both linear and non-linear datasets, and is a relatively simple and fast algorithm compared to other existing approaches. Its performance has been evaluated on twelve benchmark datasets. The experimental results demonstrate the validity of the proposed MLSTSVM classifier against typical multi-classifiers based on the Support Vector Machine and the Twin Support Vector Machine. Statistical comparison with the existing classifiers is also performed using the Friedman test and Nemenyi post-hoc techniques.
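The per-class linear systems the abstract describes can be sketched in a few lines of NumPy. This is a hedged one-vs-rest reading of the linear LSTSVM formulation, with illustrative variable names and toy data, not the paper's exact algorithm: for each class k, stack its own points (with a bias column) into E and the remaining points into F, then minimize (1/2)||Ez||² + (c/2)||Fz + e||², whose stationarity condition is a single linear system.

```python
import numpy as np

def fit_mlstsvm(X, y, c=1.0, reg=1e-6):
    """Fit one nonparallel hyperplane per class (linear MLSTSVM sketch)."""
    planes = {}
    for k in np.unique(y):
        A = X[y == k]                              # class-k samples
        B = X[y != k]                              # all other samples
        E = np.hstack([A, np.ones((len(A), 1))])   # own class, bias column
        F = np.hstack([B, np.ones((len(B), 1))])
        e = np.ones(len(B))
        # Gradient of (1/2)||Ez||^2 + (c/2)||Fz + e||^2 set to zero,
        # with a small ridge term (the "regularization term" in the abstract).
        H = E.T @ E / c + F.T @ F + reg * np.eye(E.shape[1])
        z = -np.linalg.solve(H, F.T @ e)
        planes[k] = (z[:-1], z[-1])                # (w_k, b_k)
    return planes

def predict_mlstsvm(planes, X):
    # Assign each point to the class whose hyperplane is nearest.
    dists = {k: np.abs(X @ w + b) / np.linalg.norm(w) for k, (w, b) in planes.items()}
    keys = list(dists)
    idx = np.argmin(np.column_stack([dists[k] for k in keys]), axis=1)
    return np.array([keys[i] for i in idx])

# Toy 3-class problem: three well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, size=(30, 2)) for m in ([0, 0], [4, 0], [0, 4])])
y = np.repeat([0, 1, 2], 30)
planes = fit_mlstsvm(X, y)
acc = (predict_mlstsvm(planes, X) == y).mean()
print(acc)
```

Each class costs one small dense solve, which is where the speed advantage over QP-based multi-class SVMs comes from.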
Kang-Mo Jung Korean Data Analysis Society 2023 Journal of the Korean Data Analysis Society Vol.25 No.5
The support vector machine solves a quadratic programming problem with linear inequality and equality constraints, which is not trivial to solve. The least squares support vector machine (LS-SVM) instead solves a linear system by replacing the inequality constraints with equality constraints, and because such linear systems can be solved exactly and efficiently, LS-SVM has become popular in many regression and classification problems. There are two issues with the LS-SVM solution, however: the lack of robustness to outliers and the absence of sparseness, the latter being an advantage of the classical SVM. In this paper, we propose a sparse and robust support vector machine for regression problems using the least absolute deviation support vector machine (LAD-SVM) and the recursive reduced LS-SVM (RR-LS-SVM). The split-Bregman iteration gives the exact solution for the LAD-SVM optimization problem, while RR-LS-SVM gives a sparse solution with a much smaller number of support vectors. Numerical experiments with simulation and benchmark data demonstrate that the proposed algorithm achieves performance comparable to existing methods in terms of robustness and sparseness.
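The linear-system character of LS-SVM contrasted with the QP of the classical SVM can be shown directly. Below is a minimal RBF-kernel classifier sketch of the standard LS-SVM (not the robust LAD or reduced variants this paper proposes); all names and parameters are illustrative:

```python
import numpy as np

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM KKT linear system (equality constraints only):

       [ 0    y^T            ] [b]   [0]
       [ y    Omega + I/gamma] [a] = [1],   Omega_ij = y_i y_j K(x_i, x_j)
    """
    n = len(y)
    K = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1) / (2 * sigma**2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = np.outer(y, y) * K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], np.ones(n)]))
    return sol[0], sol[1:]                      # bias b, dual coefficients alpha

def lssvm_predict(X_train, y, b, alpha, X_new, sigma=1.0):
    K = np.exp(-((X_new[:, None] - X_train[None, :]) ** 2).sum(-1) / (2 * sigma**2))
    return np.sign(K @ (alpha * y) + b)

# Toy binary problem: two Gaussian clusters with labels -1 / +1.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.4, (25, 2)), rng.normal(1, 0.4, (25, 2))])
y = np.repeat([-1.0, 1.0], 25)
b, alpha = lssvm_train(X, y)
acc = (lssvm_predict(X, y, b, alpha, X) == y).mean()
print(acc)
```

Note that every alpha is generally nonzero here, which is exactly the lack of sparseness the paper addresses; an outlier likewise perturbs the whole solve, which motivates the robust least-absolute-deviation loss.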
A transductive least squares support vector machine with the difference convex algorithm
Jooyong Shim, Kyungha Seok The Korean Data and Information Science Society 2014 Journal of the Korean Data and Information Science Society Vol.25 No.2
Unlabeled examples are easier and less expensive to obtain than labeled examples. Semisupervised approaches are used to utilize such examples in an effort to boost the predictive performance. This paper proposes a novel semisupervised classification method named transductive least squares support vector machine (TLS-SVM), which is based on the least squares support vector machine. The proposed method utilizes the difference convex algorithm to derive nonconvex minimization solutions for the TLS-SVM. A generalized cross validation method is also developed to choose the hyperparameters that affect the performance of the TLS-SVM. The experimental results confirm the successful performance of the proposed TLS-SVM.
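The difference convex algorithm (DCA) used here can be illustrated on a toy one-dimensional problem rather than the TLS-SVM objective itself: write the nonconvex objective as a difference of convex functions g - h, then repeatedly linearize h at the current iterate and solve the resulting convex subproblem.

```python
import numpy as np

# DCA on f(x) = x^4 - 8x^2 = g(x) - h(x), with g(x) = x^4 and h(x) = 8x^2 convex.
# Each iteration linearizes h at x_k and minimizes the convex surrogate:
#   x_{k+1} = argmin_x  x^4 - h'(x_k) * x,  i.e. solve  4x^3 = 16 x_k.
x = 1.0
for _ in range(60):
    x = np.cbrt(4.0 * x)   # closed-form minimizer of the surrogate
print(x)
```

Starting from x = 1, the iterates converge to the local minimizer x = 2; in the TLS-SVM the same scheme is applied to the nonconvex term contributed by the unlabeled examples, with each subproblem being a convex (LS-SVM-like) solve.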
Performance Comparison of Various General Regression Models for Predicting Onion Bulb Weight
Yunjeong Kang, Myung Hwan Na, Wanhyun Cho, Hyeon Seok Ko Korean Data Analysis Society 2021 Journal of the Korean Data Analysis Society Vol.23 No.1
Onions are one of Korea's five major vegetables, with annual consumption reaching 30 kg per person. Since the supply and demand of onions depend almost entirely on domestic production, production forecasts are needed to supply consumers at a stable price. Onions are generally cultivated in the open field, where rapid climate change or natural disasters cause serious problems for vegetable growth. Onion farms and related organizations must identify the optimal environmental conditions for growth in order to predict and increase the yield. In this study, we use principal component analysis (PCA) and partial least squares (PLS) to find the environmental factors that significantly influence onion bulb weight during the cultivation period, and from these we present an environmental management strategy for cultivation. The partial least squares components matched what we expected in practice: bulb weight was negatively correlated with precipitation and humidity, and positively correlated with the difference between air and ground temperature and with insolation. Managing humidity, sunlight, and temperature differences is therefore necessary for stable onion growth. In addition, using three general regression models, the principal component regression model (PCAR), the partial least squares regression model (PLSR), and the support vector machine regression model (SVMR), we predicted bulb weight at harvest and compared their performance. The support vector machine regression model performed best, followed by the partial least squares regression model and the principal component regression model, though the differences between the latter two were not large.
Least-Squares Support Vector Machine for Regression Model with Crisp Inputs-Gaussian Fuzzy Output
Chang Ha Hwang Korean Data and Information Science Society 2004 Journal of the Korean Data and Information Science Society Vol.15 No.2
Least-squares support vector machine (LS-SVM) has been very successful in pattern recognition and function estimation problems for crisp data. In this paper, we propose an LS-SVM approach to evaluating a fuzzy regression model with multiple crisp inputs and a Gaussian fuzzy output. The proposed algorithm is model-free in the sense that we do not need to assume an underlying model function. Experimental results are presented which indicate the performance of this algorithm.
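One hedged way to read "model-free" here: the LS-SVM regression system is fit separately to the centers and spreads of the Gaussian fuzzy outputs, with no parametric form assumed for either curve. This is a sketch under that assumption, not the paper's exact formulation:

```python
import numpy as np

def lssvm_reg(X, y, gamma=100.0, sigma=0.5):
    """LS-SVM function estimation: one linear system, no QP.

       [ 0    1^T        ] [b]       [0]
       [ 1    K + I/gamma] [alpha] = [y]
    """
    n = len(y)
    K = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1) / (2 * sigma**2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]
    return lambda Xn: np.exp(
        -((Xn[:, None] - X[None, :]) ** 2).sum(-1) / (2 * sigma**2)) @ alpha + b

# Crisp inputs; Gaussian fuzzy output = (center m, spread s), fit each part.
x = np.linspace(0, 2 * np.pi, 40).reshape(-1, 1)
m = np.sin(x[:, 0])                  # fuzzy centers
s = 0.2 + 0.1 * np.cos(x[:, 0])      # fuzzy spreads
f_m, f_s = lssvm_reg(x, m), lssvm_reg(x, s)
err = float(np.max(np.abs(f_m(x) - m)))
print(round(err, 4))
```

Both the center and spread curves are recovered without specifying their functional form, which is the practical content of the model-free claim.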
A Study on Support Vectors of Least Squares Support Vector Machine
Kyungha Seok, Daehyun Cho Korean Statistical Society 2003 Communications for Statistical Applications and Methods Vol.10 No.3
LS-SVM (Least-Squares Support Vector Machine) has been used as a promising method for regression as well as classification. Suykens et al. (2000) used only the magnitude of the residuals to obtain SVs (Support Vectors). Suykens' method behaves well for a homoscedastic model, but shows poor behavior in a heteroscedastic model. The present paper proposes a new method to obtain SVs that uses the variance of the noise as well as the magnitude of the residuals. A simulation study demonstrates the superiority of the proposed method.
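The heteroscedastic issue can be illustrated without the full LS-SVM machinery: rank points by raw residual magnitude (the Suykens-style rule) versus residuals standardized by a local noise-scale estimate (the spirit of the proposed rule). The moving-average smoother and the cutoff of 20 points are illustrative stand-ins, not the paper's procedure:

```python
import numpy as np

# Heteroscedastic toy data: noise standard deviation grows with x.
rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + (0.05 + 0.4 * x) * rng.normal(size=200)

# Rough mean curve (a moving average stands in for the LS-SVM fit).
w = 15
fit = np.convolve(np.pad(y, w // 2, mode="edge"), np.ones(w) / w, mode="valid")
resid = y - fit

# Local noise-scale estimate from the squared residuals.
local_sd = np.sqrt(np.convolve(np.pad(resid**2, w // 2, mode="edge"),
                               np.ones(w) / w, mode="valid"))

plain = np.argsort(-np.abs(resid))[:20]             # largest raw residuals
scaled = np.argsort(-np.abs(resid / local_sd))[:20]  # largest standardized residuals
print(round(float(x[plain].mean()), 2), round(float(x[scaled].mean()), 2))
```

The raw-residual rule concentrates its selections in the high-noise region (large x), while standardizing by the local variance spreads them over the whole input range, which is the failure mode and fix the abstract describes.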