RISS (Academic Research Information Service)

      • KCI-indexed

        A Robust Hybrid Estimator for Linear Regression

        정강모 한국자료분석학회 2010 Journal of the Korean Data Analysis Society Vol.12 No.6

        The least squares estimator in linear regression is most efficient when the errors come from thin-tailed distributions, while the least absolute deviation estimator is most efficient when the errors come from thick-tailed distributions. A combination of the two provides better results whether the error distribution is thin- or thick-tailed. Although the least absolute deviation estimator is not sensitive to regression outliers, it is not robust to leverage points. We propose a robust hybrid estimator based on a linear combination of the least squares estimator and a weighted least absolute deviation estimator. Simulation results and a numerical example show that the proposed estimator is robust and effective in many situations.
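        The abstract gives the construction but not the formulas. Below is a minimal sketch of the idea, assuming a convex combining weight alpha and a simple hat-matrix-based leverage weighting; both choices are hypothetical illustrations, not taken from the paper.

        ```python
        import numpy as np
        from scipy.optimize import linprog

        def lad_fit(X, y, w=None):
            """(Weighted) least absolute deviation fit via linear programming:
            minimize sum_i w_i |y_i - x_i'b| with residual split r = u - v."""
            n, p = X.shape
            w = np.ones(n) if w is None else w
            c = np.concatenate([np.zeros(p), w, w])       # [beta, u, v]
            A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
            bounds = [(None, None)] * p + [(0, None)] * (2 * n)
            res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
            return res.x[:p]

        def hybrid_fit(X, y, alpha=0.5):
            """Convex combination of the LS fit and a leverage-weighted LAD fit."""
            n, p = X.shape
            beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]
            # hat-matrix leverages; downweight high-leverage rows in the LAD fit
            h = np.einsum("ij,ij->i", X @ np.linalg.pinv(X.T @ X), X)
            w = np.minimum(1.0, np.sqrt(2 * p / n / np.maximum(h, 1e-12)))
            return alpha * beta_ls + (1 - alpha) * lad_fit(X, y, w)
        ```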

      • KCI-indexed

        A Robust Estimator in Ridge Regression

        정강모 한국자료분석학회 2007 Journal of the Korean Data Analysis Society Vol.9 No.2

        We propose a robust estimator in ridge regression using the least trimmed squares, which we call the ridge least trimmed squares estimator (RLTSE). We show that RLTSE has a high breakdown point and develop an algorithm for computing the proposed estimates. The algorithm is very fast and its convergence can be proved. Simulations are performed to compare the efficiency of RLTSE with that of the ridge least squares estimator. The results show that the former is more efficient than the latter when the data contain outliers or leverage points. A numerical example is given to illustrate the effectiveness of the proposed estimate.
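        A hedged sketch of a concentration-step (C-step) scheme consistent with the abstract, with the least squares fit of each step replaced by a ridge fit; the subset size h, the number of random starts, and the objective form are assumptions, not the paper's algorithm.

        ```python
        import numpy as np

        def ridge_fit(X, y, lam):
            """Ridge estimate (X'X + lam*I)^{-1} X'y."""
            p = X.shape[1]
            return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

        def rltse(X, y, lam=1.0, h=None, n_starts=20, max_iter=50, rng=None):
            """Ridge least trimmed squares: minimize the sum of the h smallest
            squared residuals plus a ridge penalty, via concentration steps."""
            rng = np.random.default_rng(rng)
            n, p = X.shape
            h = h or (n + p + 1) // 2          # ~50% breakdown subset size
            best_obj, best_beta = np.inf, None
            for _ in range(n_starts):
                idx = rng.choice(n, size=p + 1, replace=False)  # elemental start
                for _ in range(max_iter):
                    beta = ridge_fit(X[idx], y[idx], lam)
                    r2 = (y - X @ beta) ** 2
                    new_idx = np.argsort(r2)[:h]   # keep the h best-fitting points
                    if set(new_idx) == set(idx):
                        break
                    idx = new_idx
                obj = np.sort(r2)[:h].sum() + lam * beta @ beta
                if obj < best_obj:
                    best_obj, best_beta = obj, beta
            return best_beta
        ```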

      • KCI-indexed

        Robust Statistical Methods in Variable Selection

        정강모 한국자료분석학회 2008 Journal of the Korean Data Analysis Society Vol.10 No.6

        Variable selection is an important research field in linear regression modeling with high-dimensional predictors. We propose a robust penalized regression estimator which performs variable selection and estimation of the regression parameters simultaneously. It is based on the least absolute deviation loss and a non-convex penalty function, the smoothly clipped absolute deviation (SCAD) penalty suggested by Fan and Li (2001). We develop an algorithm for the proposed estimator using the local quadratic approximation and show how to choose the tuning parameter of the penalty function. Each step of the algorithm requires only the solution of a system of linear equations, so the estimates are obtained quickly. Simulation results show that the proposed estimator is robust and efficient in non-normal cases.
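        The abstract names the ingredients (LAD loss, SCAD penalty, local quadratic approximation) without the details. A minimal sketch combining them: both the loss and the penalty are locally approximated by quadratics so each iteration solves one linear system; the safeguard eps and the final thresholding rule are assumed details.

        ```python
        import numpy as np

        def scad_deriv(t, lam, a=3.7):
            """Derivative of the SCAD penalty (Fan & Li, 2001)."""
            t = np.abs(t)
            return lam * ((t <= lam)
                          + np.maximum(a * lam - t, 0) / ((a - 1) * lam) * (t > lam))

        def lad_scad_lqa(X, y, lam, max_iter=100, eps=1e-6):
            """LAD loss + SCAD penalty via local quadratic approximation:
            each step solves a weighted ridge-type linear system."""
            n, p = X.shape
            beta = np.linalg.lstsq(X, y, rcond=None)[0]     # LS start
            for _ in range(max_iter):
                r = y - X @ beta
                w = 1.0 / np.maximum(np.abs(r), eps)        # LQA of |r_i|
                d = scad_deriv(beta, lam) / np.maximum(np.abs(beta), eps)
                beta_new = np.linalg.solve(X.T @ (w[:, None] * X) + n * np.diag(d),
                                           X.T @ (w * y))
                if np.max(np.abs(beta_new - beta)) < 1e-8:
                    beta = beta_new
                    break
                beta = beta_new
            beta[np.abs(beta) < 1e-4] = 0.0   # LQA only shrinks, so threshold
            return beta
        ```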

      • KCI-indexed (Excellent)

        Support vector regression with the weighted absolute deviation error loss function

        정강모 한국데이터정보과학회 2018 한국데이터정보과학회지 Vol.29 No.6

        In this paper we propose robust support vector regression algorithms for noisy data sets. We adopt the absolute deviation error function as the loss function of the regression model, and the proposed algorithms preserve the structure of least squares support vector regression. The proposed algorithms are very fast and their procedures are much simpler than those of other support vector machine algorithms. They are robust to regression outliers because the loss function increases more slowly than the squared error function for large errors and a weight function is applied to each observation. Comparisons with other methods on simulated and benchmark datasets show that the proposed methods are more robust than least squares support vector regression when outliers exist.
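        A rough sketch of how an absolute-deviation loss can be grafted onto the least squares SVR structure by iterative reweighting, in the spirit of the abstract; the RBF kernel, the weight cap eps, and the fixed iteration count are illustrative assumptions, not the paper's algorithm.

        ```python
        import numpy as np

        def rbf_kernel(X, Z, sigma=1.0):
            d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2 * sigma ** 2))

        def lssvr_solve(K, y, gamma, v):
            """Solve the weighted LS-SVR KKT system for (alpha, b)."""
            n = len(y)
            A = np.zeros((n + 1, n + 1))
            A[0, 1:] = 1.0
            A[1:, 0] = 1.0
            A[1:, 1:] = K + np.diag(1.0 / (gamma * v))
            sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
            return sol[1:], sol[0]

        def robust_lssvr(X, y, gamma=10.0, sigma=1.0, n_iter=5, eps=1e-4):
            """Iteratively reweighted LS-SVR: weights ~ 1/|residual| turn the
            squared loss into an absolute-deviation loss at convergence."""
            K = rbf_kernel(X, X, sigma)
            v = np.ones(len(y))
            for _ in range(n_iter):
                alpha, b = lssvr_solve(K, y, gamma, v)
                e = y - (K @ alpha + b)
                v = 1.0 / np.maximum(np.abs(e), eps)   # IRLS weight for |e|
            return alpha, b
        ```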

      • KCI-indexed

        Influence Analysis of Multivariate Coefficients of Variation

        정강모 한국자료분석학회 2012 Journal of the Korean Data Analysis Society Vol.14 No.1

        Multivariate coefficients of variation are widely used in recent technologies to compare the performance of analytical equipment. We consider four multivariate coefficients of variation proposed by several authors and propose methods for detecting observations that have a large influence on the corresponding coefficients. For this purpose we derive the empirical influence function and the derivative influence of the estimators under a perturbation scheme. Furthermore, we compute the 95th percentile of the empirical influence function for the multivariate coefficients of variation by simulation. An illustrative example shows the effectiveness of the proposed methods in identifying influential observations.
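        The four definitions are not named in the abstract. As one concrete instance, a sketch using the Voinov–Nikulin multivariate CV with a case-deletion (sample influence) approximation; the paper itself derives the influence function analytically, so this is only a numerical stand-in.

        ```python
        import numpy as np

        def mcv_vn(X):
            """Voinov-Nikulin multivariate CV: (xbar' S^{-1} xbar)^{-1/2}."""
            xbar = X.mean(axis=0)
            S = np.cov(X, rowvar=False)
            return 1.0 / np.sqrt(xbar @ np.linalg.solve(S, xbar))

        def case_deletion_influence(X, stat=mcv_vn):
            """Sample influence: (n-1) * (T(full) - T(without case i)), a
            finite-sample stand-in for the empirical influence function."""
            n = len(X)
            full = stat(X)
            return np.array([(n - 1) * (full - stat(np.delete(X, i, axis=0)))
                             for i in range(n)])
        ```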

      • KCI-indexed

        Influence Analysis for Generalized Estimating Equations

        정강모 한국통계학회 2006 Journal of the Korean Statistical Society Vol.35 No.2

        We investigate the influence of subjects or observations on the regression coefficients of generalized estimating equations using the influence function and the derivative influence measures. The influence function for the regression coefficients is derived and its sample versions are used for influence analysis. The derivative influence measures under certain perturbation schemes are derived. It can be seen that the influence function method and the derivative influence measures yield the same influence information. An illustrative example in longitudinal data analysis is given, and we compare the results provided by the influence function method and the derivative influence measures.
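        A sketch of the sample-version influence analysis described above, computed by brute-force cluster deletion with statsmodels GEE; the Gaussian family, exchangeable working correlation, and formula are illustrative choices, not the paper's setup.

        ```python
        import pandas as pd
        import statsmodels.api as sm

        def gee_cluster_influence(df, formula, groups):
            """Subject-level influence on GEE coefficients via cluster deletion:
            delta_g = beta_hat - beta_hat(-g), a sample version of the
            influence function for the regression coefficients."""
            def fit(data):
                return sm.GEE.from_formula(
                    formula, groups=groups, data=data,
                    family=sm.families.Gaussian(),
                    cov_struct=sm.cov_struct.Exchangeable()).fit()
            full = fit(df)
            deltas = {g: full.params - fit(df[df[groups] != g]).params
                      for g in df[groups].unique()}
            return full, pd.DataFrame(deltas).T

        # usage (hypothetical data frame and columns):
        # full, infl = gee_cluster_influence(df, "y ~ x1 + x2", "subject")
        ```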

      • KCI-indexed

        A Combined Robust Estimator Between the Least Squares Estimator and a t-type Regression Estimator

        정강모 한국자료분석학회 2011 Journal of the Korean Data Analysis Society Vol.13 No.5

        When the distribution of the errors in linear regression is normal, the least squares estimator is most efficient. However, if the errors follow a heavy-tailed distribution such as the t distribution, the least squares estimator is no longer efficient. We propose a combined estimator between the least squares estimator and a t-type regression estimator which is efficient whether the errors have a heavy-tailed or a thin-tailed distribution. We derive the asymptotic properties of the proposed estimator. Simulation results show that the proposed estimator is robust and effective in many situations.
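        A minimal sketch, assuming the t-type estimator is the M-estimator from a t(nu) likelihood fitted by iteratively reweighted least squares and combined convexly with the LS fit; nu, the MAD scale estimate, and the combining weight alpha are assumed details.

        ```python
        import numpy as np

        def t_type_fit(X, y, nu=3.0, max_iter=100, tol=1e-8):
            """M-estimator from a t(nu) likelihood via IRLS: the weights
            (nu+1)/(nu + (r/s)^2) downweight heavy-tailed residuals."""
            beta = np.linalg.lstsq(X, y, rcond=None)[0]
            for _ in range(max_iter):
                r = y - X @ beta
                s = np.median(np.abs(r)) / 0.6745 + 1e-12   # MAD scale
                w = (nu + 1.0) / (nu + (r / s) ** 2)
                beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
                if np.max(np.abs(beta_new - beta)) < tol:
                    return beta_new
                beta = beta_new
            return beta

        def combined_fit(X, y, alpha=0.5, nu=3.0):
            """Convex combination of the LS and t-type fits."""
            beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]
            return alpha * beta_ls + (1 - alpha) * t_type_fit(X, y, nu=nu)
        ```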

      • KCI-indexed

        Local Influence in LASSO Regression

        정강모 한국자료분석학회 2016 Journal of the Korean Data Analysis Society Vol.18 No.6

        The least absolute shrinkage and selection operator (LASSO) regression has become very popular because high-dimensional data can now be obtained cheaply. However, there are few research results on influence analysis for LASSO regression. We study the local influence on the LASSO estimator in linear regression and derive a generalized Cook's distance for the LASSO estimator, which is equivalent to Cook's approach under simultaneous perturbation of the assumed model. We consider perturbations of the variance and of the explanatory variables; in particular, the perturbation of the explanatory variables yields influence information that cannot be obtained by the case-deletion method. Since the LASSO estimator has no closed form, we obtain the influence information from a ridge regression representation of the estimator, and the final local influence results can be expressed in closed form. Numerical examples applying the proposed diagnostics are given for illustration. The proposed method conveys important influence information about the LASSO estimate through a simple graphical method.
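        The paper's local-influence derivation is not reproduced in the abstract. As a hedged stand-in, here is a deletion-type generalized Cook's distance built on the ridge representation of the LASSO at its solution (the device the abstract mentions); the paper's actual perturbation schemes differ, and sigma2 is a supplied error-variance estimate.

        ```python
        import numpy as np

        def lasso_ridge_map(X, beta_lasso, lam, eps=1e-6):
            """Ridge representation of the LASSO at its solution:
            beta ~ (X'X + lam*diag(1/|beta_j|))^{-1} X'y = A y."""
            d = lam / np.maximum(np.abs(beta_lasso), eps)
            return np.linalg.inv(X.T @ X + np.diag(d)) @ X.T

        def gen_cooks_distance(X, y, beta_lasso, lam, sigma2):
            """Case-level generalized Cook's distance using the generalized
            hat matrix H = X A; a closed-form deletion-type approximation."""
            A = lasso_ridge_map(X, beta_lasso, lam)
            H = X @ A
            r = y - H @ y
            h = np.diag(H)
            p = X.shape[1]
            return (r ** 2) * h / ((1 - h) ** 2) / (p * sigma2)
        ```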

      • KCI-indexed

        A Detection Method of Multivariate Outliers using Decompositions of the Squared Mahalanobis Distance

        정강모 한국자료분석학회 2005 Journal of the Korean Data Analysis Society Vol.7 No.6

        Kim (2000) proposed two meaningful decompositions of the squared Mahalanobis distance to uncover the sources of outlyingness in multivariate observations. The decompositions are useful for identifying the component variables that dominate the Mahalanobis distance. In this article we consider the distributions of the components of the decompositions and show that each component follows a normal distribution and that the random vector consisting of two components follows a multivariate normal distribution, so outliers can be detected using the corresponding cut-off values. We propose a graphical tool for detecting multivariate outliers in a single figure, which is very useful for analyzing outlyingness in high-dimensional data. Two illustrative examples are given.
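        Kim (2000)'s exact decompositions are not spelled out in the abstract; as an illustration, a sketch of the standard regression (Cholesky) decomposition, which has the distributional properties claimed above: each component is standard normal under multivariate normality, so components exceeding normal cut-off values point to the variables driving the distance.

        ```python
        import numpy as np
        from scipy.linalg import cholesky, solve_triangular

        def mahalanobis_components(X):
            """Regression (Cholesky) decomposition of the squared Mahalanobis
            distance: D_i^2 = sum_j t_ij^2, where t_ij standardizes variable j
            given variables 1..j-1; under normality each t_ij is ~ N(0,1)."""
            xbar = X.mean(axis=0)
            S = np.cov(X, rowvar=False)
            L = cholesky(S, lower=True)                       # S = L L'
            T = solve_triangular(L, (X - xbar).T, lower=True).T
            return T, (T ** 2).sum(axis=1)                    # components, D^2
        ```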

      • KCI-indexed

        Weighted Least Absolute Deviation Regression Estimator with the SCAD Function

        정강모 한국자료분석학회 2012 Journal of the Korean Data Analysis Society Vol.14 No.5

        In a regression model, the estimator based on the absolute deviation loss function is more robust than the one based on the squared error loss function. However, the least absolute deviation estimator is sensitive to leverage points in the predictors, even though it is robust to regression outliers. We propose a penalized regression estimator that is robust to both regression outliers and leverage points and that performs variable selection automatically. It is based on the weighted least absolute deviation loss and a non-convex penalty function, the smoothly clipped absolute deviation (SCAD) function, which has the oracle property. We develop a unified algorithm for the proposed estimator, including the SCAD estimate, based on the local quadratic approximation, and discuss the tuning parameter of the penalty function. Numerical simulations show that the proposed estimator is effective for analyzing contaminated data.
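        This differs from the earlier LAD-SCAD entry by weighting the loss against leverage points. A sketch reusing the same local quadratic approximation, with hat-matrix-based leverage weights folded into the loss term; the weight formula and constant c are hypothetical, not the paper's choice.

        ```python
        import numpy as np

        def leverage_weights(X, c=2.0):
            """Downweight high-leverage rows: w_i = min(1, c*p/(n*h_i)),
            with h_i the hat-matrix leverage (one simple hedged choice)."""
            n, p = X.shape
            h = np.einsum("ij,ij->i", X @ np.linalg.pinv(X.T @ X), X)
            return np.minimum(1.0, c * p / (n * np.maximum(h, 1e-12)))

        def wlad_scad(X, y, lam, a=3.7, max_iter=100, eps=1e-6):
            """Weighted LAD loss + SCAD penalty via local quadratic
            approximation; each step solves one linear system."""
            n, p = X.shape
            w_lev = leverage_weights(X)
            beta = np.linalg.lstsq(X, y, rcond=None)[0]
            for _ in range(max_iter):
                r = y - X @ beta
                w = w_lev / np.maximum(np.abs(r), eps)   # weighted |r| via LQA
                t = np.abs(beta)
                d = lam * ((t <= lam)
                           + np.maximum(a * lam - t, 0) / ((a - 1) * lam) * (t > lam))
                d = d / np.maximum(t, eps)
                beta = np.linalg.solve(X.T @ (w[:, None] * X) + n * np.diag(d),
                                       X.T @ (w * y))
            beta[np.abs(beta) < 1e-4] = 0.0
            return beta
        ```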
