In this work, we focus on nonparametric kernel methods for estimating the probability density function (pdf). The convergence of a kernel estimator depends crucially on the choice of the smoothing parameter. In this paper, we present a new method for optimizing the bandwidth of an adaptive kernel estimator of the pdf; the optimized estimator is then used to construct a Bayes classifier. Our approach exploits the statistical properties of the probability distributions of the underlying random variables: we adopt the maximum entropy principle (MEP) to determine the optimal value of the smoothing parameter. Under the proposed criterion, the estimated pdf is optimal in the sense that it minimizes the classification error rate. Finally, we demonstrate the robustness of the optimization process on a set of DNA microarray data, showing that our approach effectively improves classification performance.
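To make the pipeline concrete, the following is a minimal sketch of the overall scheme: a Gaussian kernel density estimator whose bandwidth is selected by a data-driven criterion, plugged into a Bayes classifier. Note that the bandwidth criterion shown here is leave-one-out likelihood cross-validation, used only as an illustrative stand-in; the paper's actual criterion is based on the maximum entropy principle, whose details are not reproduced here. All function names are our own.

```python
import numpy as np

def gaussian_kde(x, data, h):
    # Gaussian kernel density estimate at points x, given a 1-D sample `data`
    # and bandwidth (smoothing parameter) h.
    diffs = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * diffs**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def loo_score(data, h):
    # Leave-one-out log-likelihood: density at each x_i estimated from the
    # remaining points, averaged on the log scale.
    n = len(data)
    diffs = (data[:, None] - data[None, :]) / h
    K = np.exp(-0.5 * diffs**2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(K, 0.0)          # exclude each point from its own estimate
    f_loo = K.sum(axis=1) / (n - 1)
    return np.log(f_loo + 1e-300).mean()

def select_bandwidth(data, grid):
    # Grid search for the bandwidth maximizing the leave-one-out score
    # (a stand-in for the paper's MEP-based criterion).
    return max(grid, key=lambda h: loo_score(data, h))

def bayes_classify(x, samples_by_class):
    # Bayes rule: assign x to the class maximizing prior * estimated density.
    n_total = sum(len(s) for s in samples_by_class.values())
    best_label, best_val = None, -np.inf
    for label, s in samples_by_class.items():
        h = select_bandwidth(s, np.linspace(0.05, 2.0, 40))
        val = (len(s) / n_total) * gaussian_kde(np.array([x]), s, h)[0]
        if val > best_val:
            best_label, best_val = label, val
    return best_label
```

For example, with two well-separated one-dimensional classes, a point near a class's center is assigned to that class; for the DNA microarray setting of the paper, the same construction would be applied per gene-expression feature or in a multivariate kernel form.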