Conference Paper

Learning output kernels with block coordinate descent

MPS-Authors

Dinuzzo, F.
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society

External Resource

http://www.icml-2011.org/
(Table of contents)

Citation

Dinuzzo, F., Ong, C., Gehler, P., & Pillonetto, G. (2011). Learning output kernels with block coordinate descent. In L. Getoor, & T. Scheffer (Eds.), 28th International Conference on Machine Learning (ICML 2011) (pp. 49-56). Madison, WI, USA: International Machine Learning Society.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-BB22-D
Abstract
We propose a method to simultaneously learn a vector-valued function and a kernel between its components. The obtained kernel can be used both to improve learning performance and to reveal structures in the output space that may be important in their own right. Our method is based on the solution of a suitable regularization problem over a reproducing kernel Hilbert space (RKHS) of vector-valued functions. Although the regularized risk functional is non-convex, we show that it is invex, implying that all local minimizers are global minimizers. We derive a block-wise coordinate descent method that efficiently exploits the structure of the objective functional. Then, we empirically demonstrate that the proposed method can improve classification accuracy. Finally, we provide a visual interpretation of the learned kernel matrix for some well-known datasets.
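The abstract describes alternating optimization over two blocks: the coefficients of the vector-valued function and the output kernel. The Python sketch below is a minimal illustration under simplifying assumptions, not the paper's exact algorithm: it assumes a squared loss plus a quadratic RKHS penalty (weight lam) and a Frobenius penalty on the output kernel L (weight mu). Under that assumed objective, the C-step reduces to a Sylvester-type system solved in the eigenbases of K and L, while the L-step is handled here with a simple projected gradient step onto the positive semidefinite cone; the function name and all parameters (fit_output_kernel, lam, mu, step) are hypothetical.

    # Illustrative block coordinate descent for output kernel learning.
    # NOT the authors' exact updates: the objective is an assumed
    # simplification J(C, L) = ||Y - K C L||_F^2
    #                          + lam * tr(L C^T K C) + mu * ||L||_F^2.
    import numpy as np

    def fit_output_kernel(K, Y, lam=1.0, mu=1.0, n_iter=50, step=1e-3):
        """K: (n, n) input Gram matrix; Y: (n, m) targets.
        Returns coefficients C (n, m) and a PSD output kernel L (m, m)."""
        n, m = Y.shape
        L = np.eye(m)                # start from the identity output kernel
        C = np.zeros((n, m))
        s, U = np.linalg.eigh(K)     # spectral factors of K, reused each pass
        for _ in range(n_iter):
            # C-step: with L fixed, a sufficient stationarity condition of
            # the assumed objective is K C L + lam * C = Y, solved
            # elementwise in the eigenbases of K and L.
            t, V = np.linalg.eigh(L)
            Yt = U.T @ Y @ V
            Ct = Yt / (np.outer(s, t) + lam)
            C = U @ Ct @ V.T
            # L-step: one projected gradient step on the assumed objective,
            # then eigenvalue clipping to stay positive semidefinite.
            E = K @ C
            G = -2.0 * E.T @ (Y - E @ L) + lam * (C.T @ E) + 2.0 * mu * L
            G = 0.5 * (G + G.T)      # keep the iterate symmetric
            L = L - step * G
            w, Q = np.linalg.eigh(L)
            L = (Q * np.clip(w, 0.0, None)) @ Q.T   # PSD projection
        return C, L

Under the same assumptions, predictions for test inputs would follow as K_test @ C @ L, where K_test holds the input kernel evaluated between test and training points. The clipped eigendecomposition in the L-step is the standard Euclidean projection onto the PSD cone; the paper itself derives more specialized subproblem solutions.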