
Entropy, Vol. 21, Pages 1179: CASMI--An Entropic Feature Selection Method in Turing's Perspective (Entropy)

 
 

30 November 2019 07:03:03

 
Health data are generally complex in type and small in sample size. These domain-specific challenges make it difficult to capture information reliably and further aggravate the problem of generalization. To support the analysis of healthcare datasets, we develop a feature selection method based on the concept of coverage adjusted standardized mutual information (CASMI). The main advantages of the proposed method are: (1) it selects features more efficiently with the help of an improved entropy estimator, particularly when the sample size is small; and (2) it automatically learns the number of features to be selected from the information in the sample data. Additionally, the proposed method handles feature redundancy from the perspective of the joint distribution. The proposed method focuses on non-ordinal data, although it also works with numerical data when paired with an appropriate binning method. A simulation study comparing the proposed method with six widely cited feature selection methods shows that it performs better, as measured by the Information Recovery Ratio, particularly when the sample size is small.
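
The abstract describes CASMI only at a high level. As a rough illustration of the general idea of scoring features by a standardized mutual information with the class label, the Python sketch below ranks discrete features using a plain plug-in (empirical) estimator. It is not the authors' method: the Turing-style coverage adjustment, the improved entropy estimator, the automatic choice of how many features to keep, and the joint-distribution redundancy handling are not reproduced here, and all function and variable names are illustrative assumptions.

# Illustrative sketch only: plug-in mutual-information feature ranking.
# It stands in for, but does not implement, the coverage-adjusted
# estimator (CASMI) described in the paper.
import numpy as np
from collections import Counter


def plugin_entropy(values):
    """Empirical (plug-in) entropy of a discrete sample, in nats."""
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))


def plugin_mutual_information(x, y):
    """Empirical mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    joint = list(zip(x, y))
    return plugin_entropy(x) + plugin_entropy(y) - plugin_entropy(joint)


def rank_features(features, labels, threshold=0.1):
    """Rank discrete features by mutual information with the labels,
    standardized by the label entropy so scores lie roughly in [0, 1].

    features: dict mapping feature name -> list of discrete values
    labels:   list of discrete class labels
    Returns the feature names whose score exceeds `threshold`.
    """
    h_y = plugin_entropy(labels)
    scores = {
        name: (plugin_mutual_information(col, labels) / h_y if h_y > 0 else 0.0)
        for name, col in features.items()
    }
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, score in ranked if score > threshold]


if __name__ == "__main__":
    # Tiny toy example: one informative feature, one noisy feature.
    labels = ["a", "a", "b", "b", "a", "b"]
    features = {
        "f1": ["x", "x", "y", "y", "x", "y"],  # perfectly predicts the label
        "f2": ["p", "q", "p", "q", "p", "q"],  # carries little label information
    }
    print(rank_features(features, labels))  # expected: ['f1']

In the paper, the cutoff and the number of selected features are learned from the sample data rather than fixed by hand; the fixed threshold here is purely for illustration.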


 
Category: Informatics, Physics
 
 
 

