https://www.riss.kr/link?id=A107654929
2019
SCI, SCIE, SCOPUS
Academic journal
pp. 162-174 (13 pages)
Multilingual Abstract
<P><B>Abstract</B></P> <P>Mutual information based feature selection criteria have recently gained popularity for their superior performance in various pattern recognition and machine learning applications. However, these methods do not apply a finite-sample bias correction when computing mutual information. Moreover, finding an appropriate discretization of the features is often a necessary step prior to feature selection, yet existing research rarely addresses discretization and feature selection together. To address these issues, this paper first proposes Joint Bias corrected Mutual Information (JBMI) for feature selection. Second, a framework named modified discretization and feature selection based on mutual information is proposed, which combines JBMI based feature selection with dynamic discretization, both using a <I>χ</I><SUP>2</SUP> based search method. Experimental results on thirty benchmark datasets show that, in most cases, the proposed methods outperform state-of-the-art methods.</P> <P><B>Highlights</B></P> <P> <UL> <LI> We address discretization and feature selection jointly with a single criterion. </LI> <LI> The proposed discretization method is dynamic and independent of the classification algorithm. </LI> <LI> The errors introduced in the Relevancy, Redundancy and Complementary Information terms are derived analytically. </LI> <LI> It is also shown analytically that Relevancy, Redundancy and Complementary Information follow a χ<SUP>2</SUP>-distribution. </LI> <LI> A χ<SUP>2</SUP>-based search is introduced to select a small set of features and to discretize them with a small number of intervals. </LI> </UL> </P>
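To make the general idea concrete, the sketch below shows a plain greedy mutual-information feature selector over already-discretized features. This is an illustration of the generic relevancy-minus-redundancy (mRMR-style) scheme that JBMI builds on, not the authors' JBMI criterion: it uses the uncorrected empirical mutual information and omits the finite-sample bias correction, the complementary-information term, and the χ²-based search that the paper proposes. All function names here are hypothetical.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete sequences.

    Note: this is the plug-in estimate, with no finite-sample bias
    correction (the correction is the paper's contribution)."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

def greedy_mi_selection(features, target, k):
    """Greedily pick k features, scoring each candidate by its relevancy
    to the target minus its mean redundancy with already-selected features.

    `features` maps feature name -> list of discrete values."""
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        def score(f):
            relevancy = mutual_information(features[f], target)
            redundancy = (sum(mutual_information(features[f], features[s])
                              for s in selected) / len(selected)
                          if selected else 0.0)
            return relevancy - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

For example, with a feature identical to the target, an independent feature, and a constant feature, the identical feature is selected first because it has maximal relevancy and (initially) no redundancy penalty.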