JOURNAL ARTICLE

Fast training Support Vector Machines using parallel sequential minimal optimization

Abstract

One of the key factors limiting the application of support vector machines (SVMs) to large-sample problems is that the large-scale quadratic programming (QP) problem arising from SVM training cannot be easily solved by standard QP techniques. Sequential minimal optimization (SMO) is currently one of the major methods for training SVMs. Through its decomposition strategy, SMO reduces the difficulty of the QP problem to a certain extent; however, the memory savings come at the price of high training cost. In this paper, we improve the algorithm by applying parallel computing on a symmetric multiprocessor (SMP) machine. The new technique offers a substantial speed advantage on problems with large training sets and high-dimensional spaces, without reducing the generalization performance of SVMs.
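To make the decomposition idea concrete, the sketch below shows the classic SMO pairwise update for a linear soft-margin SVM on a toy dataset. This is an illustrative simplification (random second-index selection, no working-set heuristics), not the parallel algorithm of the paper; the parallel variant distributes the kernel evaluations, which dominate this loop, across SMP processors. All names and parameter values here are assumptions for the example.

```python
# Illustrative simplified SMO for a linear soft-margin SVM (toy sketch).
# Not the paper's parallel algorithm: real SMO uses working-pair selection
# heuristics, and the parallel variant spreads kernel work across processors.
import random
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-3, max_passes=20, seed=0):
    rng = random.Random(seed)
    n = len(y)
    alpha = np.zeros(n)
    b = 0.0
    K = X @ X.T  # precomputed linear kernel matrix (the costly part at scale)

    def f(i):
        # decision value for sample i under current alpha, b
        return float((alpha * y) @ K[:, i]) + b

    passes, total = 0, 0
    while passes < max_passes and total < 10000:
        total += 1
        changed = 0
        for i in range(n):
            Ei = f(i) - y[i]
            # select i only if it violates the KKT conditions
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.randrange(n - 1)
                if j >= i:
                    j += 1  # pick a random second index j != i
                Ej = f(j) - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # feasible box [L, H] for alpha[j] given the equality constraint
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if eta >= 0:
                    continue
                # analytic solution of the two-variable QP subproblem
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # bias update so the changed multipliers satisfy KKT
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                       - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                       - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0

    w = (alpha * y) @ X  # recover the primal weight vector (linear kernel only)
    return w, b

# Toy linearly separable data (hypothetical example values)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = smo_train(X, y)
preds = np.sign(X @ w + b)
```

Each inner step solves a two-variable QP subproblem in closed form, which is why SMO needs no QP library and little memory; the trade-off is many passes over the data, which is exactly the cost the SMP-parallel variant targets.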

Keywords:
Support vector machine, Sequential minimal optimization, Quadratic programming, Parallel computing, Multiprocessing, Decomposition, Generalization, Mathematical optimization, Machine learning, Artificial intelligence, Computer science

Metrics

Cited by: 130
FWCI (Field-Weighted Citation Impact): 0.00
References: 11
Citation Normalized Percentile: 0.16

Topics

Face and Expression Recognition
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Metaheuristic Optimization Algorithms Research
Physical Sciences → Computer Science → Artificial Intelligence
Machine Learning and ELM
Physical Sciences → Computer Science → Artificial Intelligence