[1] T. Zar Phyu and N. N. Oo, “Performance comparison of feature selection methods,” MATEC Web of Conferences, vol. 42, pp. 2–5, 2016, doi: 10.1051/matecconf/20164206002.
[2] R. P. L. Durgabai, “Feature Selection using ReliefF Algorithm,” IJARCCE, vol. 3, no. 10, pp. 8215–8218, 2014, doi: 10.17148/ijarcce.2014.31031.
[3] R. J. Urbanowicz, M. Meeker, W. La Cava, R. S. Olson, and J. H. Moore, “Relief-Based Feature Selection: Introduction and Review,” Journal of Biomedical Informatics, vol. 85, pp. 189–203, 2018, doi: 10.1016/j.jbi.2018.07.014.
[4] S. Gore and V. Govindaraju, “Feature Selection Using Cooperative Game Theory and Relief Algorithm,” in Knowledge, Information and Creativity Support Systems: Recent Trends, Advances and Solutions, 2016, pp. 401–412.
[5] Y. He, J. Zhou, Y. Lin, and T. Zhu, “A class imbalance-aware Relief algorithm for the classification of tumors using microarray gene expression data,” Computational Biology and Chemistry, vol. 80, pp. 121–127, 2019, doi: 10.1016/j.compbiolchem.2019.03.017.
[6] P. Wang, C. Sanín, and E. Szczerbicki, “Prediction based on integration of decisional DNA and a feature selection algorithm Relief-F,” Cybernetics and Systems, vol. 44, no. 2–3, pp. 173–183, 2013, doi: 10.1080/01969722.2013.762246.
[7] Y. Xie, D. Li, D. Zhang, and H. Shuang, “An improved multi-label relief feature selection algorithm for unbalanced datasets,” Advances in Intelligent Systems and Computing, vol. 686, pp. 141–151, 2018, doi: 10.1007/978-3-319-69096-4_21.
[8] G. A. Tahir and C. K. Loo, “An open-ended continual learning for food recognition using class incremental extreme learning machines,” IEEE Access, vol. 8, pp. 82328–82346, 2020, doi: 10.1109/ACCESS.2020.2991810.
[9] A. Sharma and S. Dey, “A Comparative Study of Feature Selection and Machine Learning Techniques for Sentiment Analysis,” in Proceedings of the ACM Research in Applied Computation Symposium (RACS), 2012, pp. 1–7.
[10] L. Sun, T. Yin, W. Ding, Y. Qian, and J. Xu, “Multilabel feature selection using ML-ReliefF and neighborhood mutual information for multilabel neighborhood decision systems,” Information Sciences, vol. 537, pp. 401–424, 2020, doi: 10.1016/j.ins.2020.05.102.
[11] K. Deepika and N. Sathyanarayana, “Relief-F and budget tree random forest based feature selection for student academic performance prediction,” International Journal of Intelligent Engineering and Systems, vol. 12, no. 1, pp. 30–39, 2019, doi: 10.22266/ijies2019.0228.04.
[12] M. Zaffar, M. A. Hashmani, and K. S. Savita, “Comparing the performance of FCBF, Chi-Square and Relief-F filter feature selection algorithms in educational data mining,” in Advances in Intelligent Systems and Computing, vol. 843. Springer International Publishing, 2019.
[13] N. Gofar and P. Susmanto, Tracer Study Universitas Sriwijaya Tahun 2018 (Lulusan Tahun 2016). Palembang: Noer Fikri, 2018.
[14] K. Kira and L. A. Rendell, “A Practical Approach to Feature Selection,” in Proceedings of the 9th International Workshop on Machine Learning, Aberdeen, 1992, pp. 249–256.
[15] I. Kononenko, “Estimating Attributes: Analysis and Extensions of RELIEF,” in European Conference on Machine Learning, 1994, pp. 171–182.
[16] Okfalisa, I. Gazalba, Mustakim, and N. G. I. Reza, “Comparative analysis of k-nearest neighbor and modified k-nearest neighbor algorithm for data classification,” in Proceedings of the 2017 2nd International Conference on Information Technology, Information Systems and Electrical Engineering (ICITISEE), vol. 2018-January, 2018, pp. 294–298, doi: 10.1109/ICITISEE.2017.8285514.