Affiliations
Mohammad Masoud Javidi
University of Tabriz
Nasibeh Emami
Azad Univ of Ghermi
A new method to improve feature selection with meta-heuristic algorithm and chaos theory
Abstract
Finding a subset of features from a large data set is a problem that arises in many fields of study. Selecting an effective subset of features is essential for a system to achieve acceptable performance, which motivates the use of meta-heuristic algorithms to search for the optimal feature subset. The performance of evolutionary algorithms depends on many parameters that strongly influence their behavior, and these algorithms usually set such parameters through a random process. Chaos appears random and unpredictable; however, it is also deterministic, which makes it a suitable alternative to random processes in meta-heuristic algorithms.
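The substitution the abstract describes can be sketched in a few lines: a chaotic map generates a deterministic sequence that behaves like random draws, and that sequence drives the stochastic step of a meta-heuristic. The sketch below is illustrative only, not the paper's exact algorithm; the logistic map and the sigmoid-based binary-PSO bit update are assumed choices, common in this literature.

```python
import math

def logistic_map(x0=0.7, r=4.0):
    """Chaotic logistic map x_{n+1} = r * x_n * (1 - x_n).

    With r = 4.0 the orbit on (0, 1) is fully chaotic: it looks
    random, yet it is completely determined by the seed x0."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

def binary_bit_update(velocity, chaos):
    """Binary-PSO style bit update (assumed update rule): a feature
    bit is switched on when the sigmoid of its velocity exceeds the
    next chaotic value, replacing the usual random.random() draw."""
    s = 1.0 / (1.0 + math.exp(-velocity))
    return 1 if next(chaos) < s else 0

# Example: decide a feature mask for 8 candidate features from
# hypothetical particle velocities, driven by the chaotic sequence.
chaos = logistic_map()
velocities = [0.5, -1.2, 2.0, 0.1, -0.4, 1.5, -2.0, 0.8]
mask = [binary_bit_update(v, chaos) for v in velocities]
print(mask)  # deterministic given x0, r, and the velocities
```

Because the sequence is reproducible from the seed `x0`, two runs with the same seed explore the search space identically, which is the practical appeal of replacing the random process with a chaotic one.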