On Quantile Regression in Reproducing Kernel Hilbert Spaces with Data Sparsity Constraint.

Title: On Quantile Regression in Reproducing Kernel Hilbert Spaces with Data Sparsity Constraint
Publication Type: Journal Article
Year of Publication: 2016
Authors: Zhang, Chong, Yufeng Liu, and Yichao Wu
Journal: J Mach Learn Res
Volume: 17
Issue: 40
Pagination: 1-45
Date Published: 2016 Apr
ISSN: 1532-4435
Abstract

For spline regression, it is well known that the choice of knots is crucial for the performance of the estimator. Learning in a Reproducing Kernel Hilbert Space (RKHS), a general framework that covers smoothing splines, has a similar issue. However, the selection of training data points for the kernel functions in the RKHS representation has not been carefully studied in the literature. In this paper we study quantile regression as an example of learning in an RKHS. In this setting, the regular squared norm penalty does not perform training data selection. We propose a data sparsity constraint that imposes thresholding on the kernel function coefficients to achieve a sparse kernel function representation. We demonstrate that the proposed data sparsity method can deliver competitive prediction performance in certain situations and comparable performance in others, relative to the traditional squared norm penalty, and can therefore serve as a competitive alternative to it. Some theoretical properties of the proposed method under the data sparsity constraint are obtained. Both simulated and real data sets are used to demonstrate the usefulness of the data sparsity constraint.
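As a minimal sketch of the estimation problem described in the abstract, the Python snippet below fits the kernel expansion f(x) = b + sum_i alpha_i K(x_i, x) by minimizing the pinball (check) loss subject to an L1 bound on the coefficients, which is one way to read the data sparsity constraint. It assumes a Gaussian kernel and the cvxpy solver; the function names, the constraint level s, and the bandwidth gamma are illustrative choices, not taken from the paper.

```python
import numpy as np
import cvxpy as cp

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sparse_kernel_quantile_fit(X, y, tau=0.5, s=5.0, gamma=1.0):
    """Kernel quantile regression with a data sparsity (L1) constraint:
    minimize the pinball loss of f(x) = b + sum_i alpha_i K(x_i, x)
    subject to ||alpha||_1 <= s, so many alpha_i are driven to zero
    and only a subset of training points enters the fitted function."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    alpha = cp.Variable(n)
    b = cp.Variable()
    resid = y - (K @ alpha + b)
    # Pinball (check) loss for quantile level tau.
    pinball = cp.sum(cp.maximum(tau * resid, (tau - 1) * resid))
    prob = cp.Problem(cp.Minimize(pinball), [cp.norm1(alpha) <= s])
    prob.solve()
    return alpha.value, b.value

# Prediction at new points uses only the training points with nonzero alpha:
# f_new = rbf_kernel(X_new, X_train, gamma) @ alpha + b
```

Tightening s shrinks more coefficients exactly to zero, giving a sparser kernel representation, while a very large s approaches the unconstrained pinball-loss fit.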

DOI: 10.1145/130385.130401
Alternate Journal: J Mach Learn Res
Original Publication: On quantile regression in reproducing kernel Hilbert spaces with data sparsity constraint.
PubMed ID: 27134575
PubMed Central ID: PMC4850041
Grant List: P01 CA142538 / CA / NCI NIH HHS / United States
R01 CA149569 / CA / NCI NIH HHS / United States