
Entropy, Vol. 25, Pages 245: Homogeneous Adaboost Ensemble Machine Learning Algorithms with Reduced Entropy on Balanced Data (Entropy)


29 January 2023 12:44:59


Cancer is a serious public-health problem worldwide. Breast cancer (BC) is a cancer that begins in the breast and can spread to other areas of the body; it is among the most prevalent cancers claiming women’s lives. It is also becoming clear that most cases of breast cancer are already advanced by the time the patient brings them to a doctor’s attention: the evident lesion may be removed, but the disease may already have spread, or the body’s ability to resist it may have weakened considerably, rendering treatment ineffective. Although breast cancer remains much more common in developed nations, it is also spreading quickly in less developed countries. The motivation behind this study is to use an ensemble method for the prediction of BC, since an ensemble model aims to balance the strengths and weaknesses of its component models automatically, so that the best overall decision is made. The main objective of this paper is to predict and classify breast cancer using Adaboost ensemble techniques. The weighted entropy is computed for the target column: weighting each attribute yields the weighted entropy, where the weights represent each class’s likelihood. Information gain increases as entropy decreases. Both individual classifiers and homogeneous ensemble classifiers, created by combining Adaboost with different single classifiers, are used in this work. To deal with class imbalance as well as noise, the synthetic minority over-sampling technique (SMOTE) was applied as part of the data-mining pre-processing. The suggested approach uses a decision tree (DT) and naive Bayes (NB) with Adaboost ensemble techniques. The experimental findings showed 97.95% prediction accuracy with the Adaboost-random forest classifier.
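The two core ideas in the abstract, entropy of the target column and an AdaBoost ensemble over single classifiers, can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' actual pipeline: it uses scikit-learn's bundled Wisconsin breast cancer dataset and the default decision-stump base learner, and it omits the SMOTE step (which lives in the separate imbalanced-learn package). The `weighted_entropy` helper name and all parameter choices are assumptions for illustration.

```python
import math

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split


def weighted_entropy(class_counts):
    """Shannon entropy of a class distribution, in bits.

    Each class's probability (its 'weight', in the abstract's terms)
    contributes -p * log2(p); lower entropy means higher information gain.
    """
    total = sum(class_counts)
    probs = [c / total for c in class_counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)


# Entropy of the target column (malignant vs. benign counts).
X, y = load_breast_cancer(return_X_y=True)
counts = [int((y == 0).sum()), int((y == 1).sum())]
print(f"target entropy: {weighted_entropy(counts):.3f} bits")

# A homogeneous AdaBoost ensemble: many copies of one weak learner
# (shallow decision trees by default), each reweighting the examples
# the previous learners misclassified.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)
clf = AdaBoostClassifier(n_estimators=100, random_state=42)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
```

Swapping the base learner (e.g. a deeper `DecisionTreeClassifier` or, as in the paper's best result, a random forest) changes only the estimator passed to `AdaBoostClassifier`; the boosting loop itself is unchanged.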

96 views | Category: Informatics, Physics
Entropy, Vol. 25, Pages 243: Informativeness across Interpreting Types: Implications for Language Shifts under Cognitive Load (Entropy)
Entropy, Vol. 25, Pages 244: Covariant Lyapunov Vectors and Finite-Time Normal Modes for Geophysical Fluid Dynamical Systems (Entropy)
Copyright © 2008 - 2023 Indigonet Services B.V.. Contact: Tim Hulsen. Read here our privacy notice.