Piecewise linear regression via a difference of convex functions
Files
Published version
Date
2020
DOI
Authors
Siahkamari, Ali
Gangrade, Aditya
Kulis, Brian
Saligrama, Venkatesh
Version
OA Version
Published version
Citation
Ali Siahkamari, Aditya Gangrade, Brian Kulis, Venkatesh Saligrama. 2020. "Piecewise Linear Regression via a Difference of Convex Functions." International Conference on Machine Learning. ICML. PMLR,
Abstract
We present a new piecewise linear regression methodology that is based on fitting a difference of convex functions (DC functions) to the data. These are functions f that may be represented as the difference 𝜙_1 − 𝜙_2 for a choice of convex functions 𝜙_1, 𝜙_2. The method proceeds by estimating piecewise-linear convex functions, in a manner similar to max-affine regression, whose difference approximates the data. The choice of function is regularised by a new seminorm over the class of DC functions that controls the 𝓁_∞ Lipschitz constant of the estimate.
The resulting methodology can be efficiently implemented via quadratic programming even in high dimensions, and is shown to have close to minimax statistical risk. We empirically validate the method, showing it to be practically implementable and to have comparable performance to existing regression/classification methods on real-world datasets.
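The estimation scheme the abstract describes (fit two piecewise-linear convex functions whose difference matches the data, with a penalty on their subgradients, solved as a convex program) can be written down compactly. The sketch below is an illustration under assumptions, not the authors' exact estimator: the function name fit_dc is hypothetical, the simple ℓ_1 subgradient penalty stands in for the paper's Lipschitz seminorm, and cvxpy is used only for convenience in expressing the quadratic program.

```python
# Minimal sketch of difference-of-convex (DC) piecewise-linear regression.
# NOT the authors' exact formulation: the l1 penalty on subgradients below is a
# stand-in for the paper's seminorm regularizer. Each convex piece is estimated
# via standard convex-regression (max-affine interpolation) constraints.
import numpy as np
import cvxpy as cp

def fit_dc(X, y, lam=0.1):
    n, d = X.shape
    z1 = cp.Variable(n)      # values of phi_1 at the data points
    z2 = cp.Variable(n)      # values of phi_2 at the data points
    A = cp.Variable((n, d))  # subgradients of phi_1 at each data point
    B = cp.Variable((n, d))  # subgradients of phi_2 at each data point

    cons = []
    for i in range(n):
        for j in range(n):
            # Convexity constraints: each piece lies above its tangent planes.
            cons.append(z1[j] >= z1[i] + A[i] @ (X[j] - X[i]))
            cons.append(z2[j] >= z2[i] + B[i] @ (X[j] - X[i]))

    loss = cp.sum_squares(y - (z1 - z2))
    reg = cp.sum(cp.abs(A)) + cp.sum(cp.abs(B))  # crude Lipschitz-type penalty
    cp.Problem(cp.Minimize(loss + lam * reg), cons).solve()

    def predict(Xnew):
        # Extend the fit as a difference of two max-affine functions:
        # phi(x) = max_i [ z_i + a_i^T (x - x_i) ].
        p1 = np.max(z1.value[None, :] - np.sum(X * A.value, axis=1)[None, :]
                    + Xnew @ A.value.T, axis=1)
        p2 = np.max(z2.value[None, :] - np.sum(X * B.value, axis=1)[None, :]
                    + Xnew @ B.value.T, axis=1)
        return p1 - p2

    return predict
```

With the absolute values split in the usual way, the problem above is a quadratic program with O(n²) linear constraints; the fitted model is evaluated anywhere as the difference of the two resulting max-affine functions.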
Description
License
Copyright 2020 by the author(s).