Copyright © 2012 Shao-Gao Lv and Jin-De Zhu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The problem of learning the kernel function as a linear combination of multiple kernels has attracted considerable attention recently in machine learning. Specifically, by imposing an lp-norm penalty on the kernel combination coefficients, multiple kernel learning (MKL) was shown to be useful and effective in both theoretical analysis and practical applications (Kloft et al., 2009, 2011). In this paper, we present a theoretical analysis of the approximation error and learning ability of lp-norm MKL. Our analysis yields explicit learning rates for lp-norm MKL and demonstrates some notable advantages over traditional kernel-based learning algorithms in which the kernel is fixed.
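To make the setting concrete, the following is a minimal sketch (not the authors' algorithm) of lp-norm MKL with least square loss: the predictor uses a combined kernel K = Σ_m θ_m K_m with the weights constrained to the lp-norm ball ‖θ‖_p ≤ 1, alternating a kernel ridge regression step with the closed-form weight update from Kloft et al. The Gaussian kernels, their width parameters, and the regularization value below are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma):
    # Gram matrix of the Gaussian (RBF) kernel with width parameter gamma
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lp_mkl_fit(X, y, gammas, p=2.0, lam=0.1, n_iter=20):
    """Alternate between (i) kernel ridge regression on the combined kernel
    and (ii) a closed-form l_p-norm update of the combination weights theta."""
    n, M = len(X), len(gammas)
    Ks = [gaussian_kernel(X, X, g) for g in gammas]      # candidate Gram matrices
    theta = np.full(M, M ** (-1.0 / p))                  # feasible start: ||theta||_p = 1
    for _ in range(n_iter):
        K = sum(t * Km for t, Km in zip(theta, Ks))      # combined kernel
        alpha = np.linalg.solve(K + lam * np.eye(n), y)  # least-square (ridge) step
        # per-kernel importance: squared RKHS norm of each component function
        s = np.array([t ** 2 * alpha @ Km @ alpha for t, Km in zip(theta, Ks)])
        theta = s ** (1.0 / (p + 1))                     # closed-form l_p update
        theta /= np.linalg.norm(theta, ord=p)            # back onto ||theta||_p = 1
    return theta, alpha

def lp_mkl_predict(Xtr, Xte, gammas, theta, alpha):
    # Evaluate the learned function on test points with the combined kernel
    K = sum(t * gaussian_kernel(Xte, Xtr, g) for t, g in zip(theta, gammas))
    return K @ alpha
```

Varying p interpolates between sparse kernel selection (p near 1) and uniform kernel weighting (large p), which is the trade-off the error bounds in the paper quantify.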

Details

Title
Error Bounds for lp-Norm Multiple Kernel Learning with Least Square Loss
Author
Lv, Shao-Gao; Zhu, Jin-De
Publication year
2012
Publication date
2012
Publisher
John Wiley & Sons, Inc.
ISSN
10853375
e-ISSN
16870409
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
1038344020