Enhancement of the Heuristic Optimization Based on Extended Space Forests using Classifier Ensembles
Extended space forests are a well-known means of improving performance on classification problems.
They provide a richer feature space and outperform forests built on the original feature space. Most
contemporary studies employ the original features, as well as various combinations of them, as input vectors for the
extended space forest approach. In this study, we seek to boost the performance of classifier ensembles by integrating
them with heuristic optimization-based features. The contributions of this paper are fivefold. First, a richer feature
space is constructed from random combinations of input vectors together with features selected by the ant colony
optimization method, which have high importance and have not been combined before. Second, widely used classification
algorithms are employed as baseline classifiers. Third, three ensemble strategies, namely bagging, random subspace, and
random forests, are used to ensure diversity. Fourth, a wide range of comparative experiments is conducted on widely
used biomedicine datasets gathered from the University of California Irvine (UCI) machine learning repository to assess
the proposed study. Finally, the extended space forest approach with the proposed technique yields remarkable
experimental results compared with the original version and various extended versions from recent state-of-the-art studies.
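The core extended-space idea summarized above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the element-wise sum operator, the toy data, and the number of new features are all assumptions made for the example. Each new feature is an element-wise combination of two randomly chosen original features, appended to the original feature matrix before an ensemble is trained on the enlarged space.

```python
import numpy as np

def extend_feature_space(X, n_new, rng, op=np.add):
    """Append n_new randomly combined feature pairs to X.

    Hypothetical sketch of the extended-space construction: each new
    column is op(X[:, i], X[:, j]) for a randomly drawn pair (i, j).
    """
    n_features = X.shape[1]
    new_cols = []
    for _ in range(n_new):
        i, j = rng.choice(n_features, size=2, replace=False)
        new_cols.append(op(X[:, i], X[:, j]))
    # Original features are kept; combined features are appended.
    return np.column_stack([X] + new_cols)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))  # toy data standing in for a UCI dataset
X_ext = extend_feature_space(X, n_new=3, rng=rng)
print(X_ext.shape)  # 5 original + 3 combined columns
```

In the paper's setting, the pair selection would additionally be guided by ant colony optimization so that only high-importance, previously uncombined features are joined; the base classifiers (e.g., via bagging or random subspace) are then trained on `X_ext` instead of `X`.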
The International Arab Journal of Information Technology, Vol. 17, No. 2, March 2020
Zeynep Kilimci received the M.Sc. degree in computer and information science from Dogus University (DU), Istanbul, Turkey, in 2013. She was a software engineer in the data warehouse department at DenizBank, Istanbul, Turkey, between 2009 and 2011. She is a research assistant at DU and a Ph.D. student at the Computer Engineering Department of Kocaeli University, Kocaeli, Turkey. Her research interests are machine learning, ensemble learning, deep learning, text mining, and artificial intelligence.

Sevinç Omurca was born in Isparta, Turkey, in 1979. She received the M.Sc. degree in electronics and communications engineering and the Ph.D. degree in computer engineering from the University of Kocaeli, Turkey, in 2004 and 2009, respectively. In 2001, she joined the Department of Computer Engineering, University of Kocaeli, as a research assistant; she became an Assistant Professor in 2010 and an Associate Professor in 2016. Her current research interests include machine learning, natural language processing, and artificial intelligence.