Proceedings of the ICML 2008 Workshop on Sparse Optimization and Variable Selection, Helsinki, Finland, 9 July 2008
We examine linear regression problems in which each feature may only be observed at some cost. To this end, we define a parsimonious linear regression criterion that jointly minimizes prediction error and total feature cost, assuming the two can be expressed in commensurable units. We modify the least angle regression (LARS) algorithm, commonly used for sparse linear regression with cost-free features, to obtain an algorithm that not only provides an efficient and parsimonious solution to linear regression with costly features, as we demonstrate empirically, but also comes with formal guarantees on parsimony.
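To make the objective concrete, the sketch below minimizes a cost-penalized least-squares criterion of the form ½‖y − Xw‖² + Σⱼ cⱼ|wⱼ|, where cⱼ is the acquisition cost of feature j. This is an illustrative assumption about the criterion's shape, not the paper's LARS-based algorithm: the proximal-gradient (ISTA) solver and all function names here are our own.

```python
import numpy as np

def soft_threshold(v, t):
    # Per-coordinate soft-thresholding: prox operator of the weighted L1 penalty.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def cost_weighted_regression(X, y, costs, n_iter=500):
    """Minimize 0.5*||y - X w||^2 + sum_j costs[j] * |w_j| by proximal gradient.

    Features with a high acquisition cost must contribute more to reducing
    prediction error before they enter the model; useless costly features
    are driven exactly to zero.
    """
    n, d = X.shape
    L = np.linalg.norm(X, 2) ** 2       # Lipschitz constant of the smooth part
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)        # gradient of the squared-error term
        w = soft_threshold(w - grad / L, costs / L)
    return w

# Toy usage: one cheap informative feature, two expensive noise features.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.standard_normal(200)
costs = np.array([0.1, 10.0, 10.0])
w = cost_weighted_regression(X, y, costs)
```

Here the solver keeps the cheap predictive feature and zeroes out the expensive uninformative ones, illustrating the error-versus-cost trade-off the criterion encodes.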