
A New Insight on the Model of Support Vector Machine
Computational Sciences and Engineering
Volume 4, Issue 2, December 2024, Pages 297-308 | Full Text (368.14 KB)
Article Type: Original Article
DOI: 10.22124/cse.2025.31583.1118
Authors
Afsaneh Pourmoezi; Mostafa Eslami*; Ali Tavakoli
Department of Applied Mathematics, University of Mazandaran, Babolsar, Iran
Abstract
Support Vector Machine (SVM) is a powerful classification algorithm that separates samples by finding an optimal decision boundary. Its performance can degrade when feature variances differ across classes, potentially leading to suboptimal decision boundaries. A variance-weighted framework is proposed that reduces the influence of high-variance features while enhancing the impact of low-variance features, resulting in more accurate and robust decision boundaries. The method is applicable in both linear and nonlinear settings. Cross-validated evaluation on synthetic and real-world datasets, including Breast Cancer and a9a, demonstrates that the variance-weighted SVM achieves higher accuracy and F1-score than the soft-margin SVM and LDM, particularly when the variance differences between classes are large.
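
The abstract only states the idea at a high level; as a rough, non-authoritative sketch of that idea, the Python snippet below rescales each feature by the inverse square root of its pooled within-class variance and then fits an ordinary linear soft-margin SVM with scikit-learn. The load_breast_cancer loader, the 1/sqrt(variance) weighting rule, and the pipeline layout are illustrative assumptions, not the formulation proposed in the paper.

```python
# Minimal sketch (assumed weighting scheme, not the paper's exact method):
# downweight high-variance features, upweight low-variance ones,
# then train a standard soft-margin SVM.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Pooled within-class variance per feature; large values get small weights.
within_var = np.mean([X[y == c].var(axis=0) for c in np.unique(y)], axis=0)
feature_weights = 1.0 / np.sqrt(within_var + 1e-12)

weighted_svm = make_pipeline(
    FunctionTransformer(lambda Z: Z * feature_weights),  # variance-based rescaling
    SVC(kernel="linear", C=1.0),                          # ordinary soft-margin SVM
)

# Note: the weights are computed on the full dataset here to keep the sketch
# short; a leakage-free comparison would re-estimate them in each training fold.
scores = cross_val_score(weighted_svm, X, y, cv=5, scoring="f1")
print(f"5-fold CV F1: {scores.mean():.3f}")
```

In practice the weighting step could be wrapped in a custom transformer so that the within-class variances are re-estimated inside each cross-validation fold; the fixed-weight shortcut above is only for brevity.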
Keywords
Support vector machines; Classification; Variance-weighted features