An approximate approach for training polynomial kernel SVMs in linear time

Yu Chieh Wu, Jie Chi Yang, Yue Shi Lee

Research output: Contribution to journal › Conference article › Peer-reviewed

12 Citations (Scopus)

Abstract

Kernel methods such as support vector machines (SVMs) have attracted a great deal of popularity in the machine learning and natural language processing (NLP) communities. Polynomial kernel SVMs have shown very competitive accuracy on many NLP problems, such as part-of-speech tagging and chunking. However, these methods are usually too inefficient to be applied to large datasets or used in real-time settings. In this paper, we propose an approximate method that emulates the polynomial kernel with efficient data mining approaches. To avoid testing-time complexity that scales exponentially, we also present a new method for speeding up SVM classification that is independent of the polynomial degree d. Experimental results show that our method is 16.94 times faster than the traditional polynomial kernel in training and 450 times faster in testing.
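As background to the abstract's claim that polynomial kernel SVMs can be trained in linear time, note that a low-degree polynomial kernel can be expanded exactly into an explicit feature map, so a linear classifier over those features reproduces the kernel's decision values with per-example cost independent of the kernel trick. The sketch below is illustrative only and is not the paper's approximation method; it verifies, in pure Python, that the degree-2 kernel (x·z + 1)² equals the dot product of explicit feature vectors:

```python
import math

def poly_kernel(x, z, d=2):
    # Standard inhomogeneous polynomial kernel: (x . z + 1)^d
    return (sum(a * b for a, b in zip(x, z)) + 1) ** d

def phi(x):
    # Exact explicit feature map for (x . z + 1)^2:
    # a constant 1, sqrt(2)*x_i, x_i^2, and sqrt(2)*x_i*x_j for i < j.
    feats = [1.0]
    feats += [math.sqrt(2) * a for a in x]
    feats += [a * a for a in x]
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            feats.append(math.sqrt(2) * x[i] * x[j])
    return feats

x, z = [1.0, 2.0, 3.0], [0.5, -1.0, 2.0]
k = poly_kernel(x, z)                                  # kernel-trick value
lin = sum(a * b for a, b in zip(phi(x), phi(z)))       # linear dot product
print(abs(k - lin) < 1e-9)  # prints True: the two agree
```

Training a linear SVM directly on phi(x) is what makes linear-time solvers applicable; the trade-off is that the explicit feature space grows as O(n^d), which is why approximate or pruned expansions (as studied in this paper) matter for higher degrees.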

Original language: English
Pages (from-to): 65-68
Number of pages: 4
Journal: Proceedings of the Annual Meeting of the Association for Computational Linguistics
Publication status: Published - 2007
Event: 45th Annual Meeting of the Association for Computational Linguistics, ACL 2007 - Prague, Czech Republic
Duration: 25 Jun 2007 - 27 Jun 2007
