Regularization based multi-output regression model tree
JunYong Jeong, Chi-Hyuck Jun, Korea Society for Simulation, 2017 Proceedings of the Korea Society for Simulation Conference, Vol.2017 No.-
Multi-output regression has become an emerging problem in data mining and machine learning. We propose a multi-output regression model tree to achieve both accuracy and interpretability. Each leaf of the proposed model tree contains sparse linear models that exploit the relationships among the response variables. We present an efficient splitting rule based on residual analysis. Experiments on several datasets demonstrate the performance of the proposed model over benchmark methods.
Regularization based Multi-Output Regression Model Tree
JunYong Jeong, Chi-Hyuck Jun, Korean Institute of Industrial Engineers, 2017 Proceedings of the Korean Institute of Industrial Engineers Spring Conference, Vol.2017 No.4
Multi-output regression has become an emerging problem in data mining and machine learning. We propose a multi-output regression model tree to achieve both accuracy and interpretability. Each leaf of the proposed model tree contains sparse linear models that exploit the relationships among the response variables. We present an efficient splitting rule based on residual analysis. Experiments on several datasets demonstrate the performance of the proposed model over benchmark methods.
Regularization based Multi-Output Regression Model Tree
JunYong Jeong, Chi-Hyuck Jun, Korean Operations Research and Management Science Society, 2017 Proceedings of the Korean Operations Research and Management Science Society Conference, Vol.2017 No.4
Multi-output regression has become an emerging problem in data mining and machine learning. We propose a multi-output regression model tree to achieve both accuracy and interpretability. Each leaf of the proposed model tree contains sparse linear models that exploit the relationships among the response variables. We present an efficient splitting rule based on residual analysis. Experiments on several datasets demonstrate the performance of the proposed model over benchmark methods.
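The model described in these abstracts can be sketched with off-the-shelf components: a shallow tree partitions the input space, and a sparse multi-task linear model is fitted in each leaf. This is a minimal sketch under stated assumptions, not the authors' method — the split below is scikit-learn's standard variance-reduction CART split rather than the residual-analysis rule the abstract describes, and `MultiTaskLasso` (which shares one sparsity pattern across outputs) stands in for the regularized leaf models; all parameter values are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Two correlated responses driven by the same sparse set of features.
Y = np.column_stack([X[:, 0] + 0.5 * X[:, 1],
                     2.0 * X[:, 0] + 0.5 * X[:, 1]]) + 0.1 * rng.normal(size=(200, 2))

# Step 1: a shallow tree partitions the input space into leaves.
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=20).fit(X, Y)
leaf_ids = tree.apply(X)

# Step 2: fit a sparse multi-task linear model inside each leaf.
leaf_models = {}
for leaf in np.unique(leaf_ids):
    mask = leaf_ids == leaf
    leaf_models[leaf] = MultiTaskLasso(alpha=0.1).fit(X[mask], Y[mask])

def predict(X_new):
    """Route each point to its leaf, then apply that leaf's linear model."""
    ids = tree.apply(X_new)
    out = np.empty((len(X_new), Y.shape[1]))
    for leaf, model in leaf_models.items():
        m = ids == leaf
        if m.any():
            out[m] = model.predict(X_new[m])
    return out
```

Reading the nonzero coefficients of each leaf model then gives the per-region interpretation the abstract refers to.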
Effect of dimension reduction on predictability of multivariate chaotic time series
Jun-Yong Jeong, Jun-Seong Kim, Chi-Hyuck Jun, Korean Institute of Industrial Engineers, 2015 Proceedings of the Korean Institute of Industrial Engineers Spring Conference, Vol.2015 No.4
Dimension reduction is an important component of machine learning. It transforms an input space into a reduced space of smaller dimensionality. The goal of this paper is to analyze the effect of various dimension reduction techniques on predicting multivariate chaotic time series. The input space of a multivariate chaotic time series, i.e., the reconstructed state space, usually carries more information about the original strange attractor than that of a univariate chaotic time series. When multivariate chaotic time series are used, however, the time delay coordinates vector has a relatively high dimension, which induces the curse of dimensionality as well as statistical dependency and redundancy among the input features, degrading the performance of machine learning techniques. To solve this problem, we apply dimension reduction techniques. Least squares support vector regression (LSSVR) is then used to predict future values of the chaotic time series. Our experiments use delayed Lorenz series.
Effect of dimension reduction on predictability of multivariate chaotic time series
Jun-Yong Jeong, Jun-Seong Kim, Chi-Hyuck Jun, Korean Operations Research and Management Science Society, 2015 Proceedings of the Korean Operations Research and Management Science Society Conference, Vol.2015 No.4
Dimension reduction is an important component of machine learning. It transforms an input space into a reduced space of smaller dimensionality. The goal of this paper is to analyze the effect of various dimension reduction techniques on predicting multivariate chaotic time series. The input space of a multivariate chaotic time series, i.e., the reconstructed state space, usually carries more information about the original strange attractor than that of a univariate chaotic time series. When multivariate chaotic time series are used, however, the time delay coordinates vector has a relatively high dimension, which induces the curse of dimensionality as well as statistical dependency and redundancy among the input features, degrading the performance of machine learning techniques. To solve this problem, we apply dimension reduction techniques. Least squares support vector regression (LSSVR) is then used to predict future values of the chaotic time series. Our experiments use delayed Lorenz series.
Effect of Dimension Reduction on Prediction Performance of Multivariate Nonlinear Time Series
Jun-Yong Jeong, Jun-Seong Kim, Chi-Hyuck Jun, Korean Institute of Industrial Engineers, 2015 Industrial Engineering & Management Systems Vol.14 No.3
The dynamic system approach to time series has been used in many real problems. Based on Takens' embedding theorem, we can build a predictive function whose input is the time delay coordinates vector, consisting of the lagged values of the observed series, and whose output is the future value of the observed series. Although the time delay coordinates vector from a multivariate time series carries more information than the one from a univariate time series, it can exhibit statistical redundancy, which disturbs the performance of the prediction function. We apply dimension reduction techniques to solve this problem and analyze the effect of this approach on prediction. Our experiment uses delayed Lorenz series; least squares support vector regression approximates the predictive function. The results show that linearly preserving projection improves the prediction performance.
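As a rough illustration of the pipeline these papers describe — delay embedding, dimension reduction, then kernel regression — the sketch below embeds the x-component of an Euler-integrated Lorenz system, reduces the delay coordinates with PCA, and fits kernel ridge regression for one-step-ahead prediction. PCA and `KernelRidge` are stand-ins chosen for availability: the papers use methods such as LSSVR, and kernel ridge regression solves a closely related regularized least-squares problem; the step size, embedding dimension, and kernel parameters here are all illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge

def lorenz_x(n, dt=0.01):
    """x-component of the Lorenz system via simple Euler integration."""
    s = np.array([1.0, 1.0, 1.0])
    out = np.empty(n)
    for t in range(n):
        dx = 10.0 * (s[1] - s[0])
        dy = s[0] * (28.0 - s[2]) - s[1]
        dz = s[0] * s[1] - (8.0 / 3.0) * s[2]
        s = s + dt * np.array([dx, dy, dz])
        out[t] = s[0]
    return out

x = lorenz_x(2200)[200:]          # drop the initial transient
x = (x - x.mean()) / x.std()      # standardize

# Time delay coordinates: predict x[t] from the d previous values.
d = 8
X = np.array([x[t - d:t] for t in range(d, len(x))])
y = x[d:]

X_tr, X_te, y_tr, y_te = X[:1500], X[1500:], y[:1500], y[1500:]

# Dimension reduction of the delay coordinates vector.
pca = PCA(n_components=4).fit(X_tr)
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

# Kernel regression approximates the predictive function.
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1).fit(Z_tr, y_tr)
mse = np.mean((model.predict(Z_te) - y_te) ** 2)
```

Comparing `mse` against the same model fitted on the raw delay coordinates reproduces, in miniature, the comparison the papers run across dimension reduction techniques.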
Ruptured Corpus Luteal Cyst: CT Findings
Hyuck Jae Choi, Seung Hyup Kim, Sun Ho Kim, Hyo-Cheol Kim, 박창민, Hak Jong Lee, 문민환, JunYong Jeong, Korean Society of Radiology, 2003 Korean Journal of Radiology Vol.4 No.1
Objective: To evaluate the CT findings of ruptured corpus luteal cysts. Materials and Methods: Six patients with a surgically proven ruptured corpus luteal cyst were included in this series. The prospective CT findings were retrospectively analyzed in terms of the size and shape of the cyst, the thickness and enhancement pattern of its wall, the attenuation of its contents, and peritoneal fluid. Results: The mean diameter of the cysts was 2.8 (range, 1.5-4.8) cm; three were round and three were oval. The mean thickness of the cyst wall was 4.7 (range, 1-10) mm; in all six cases it showed strong enhancement, and in three was discontinuous. In five of six cases, the cystic contents showed high attenuation. Peritoneal fluid was present in all cases, and its attenuation was higher, especially around the uterus and adnexa, than that of urine present in the bladder. Conclusion: In a woman in whom CT reveals the presence of an ovarian cyst with an enhancing rim and highly attenuated contents, as well as highly attenuated peritoneal fluid, a ruptured corpus luteal cyst should be suspected. Other possible evidence of this is focal interruption of the cyst wall and the presence of peritoneal fluid around the adnexa.