Journal of Computational and Graphical Statistics, Vol. 18, pp. 126–146
We describe and contrast several bootstrap procedures for penalized spline smoothers. The bootstrap methods considered are variations on existing methods, developed under two different probabilistic frameworks. Under the first framework, penalized spline regression is treated as an estimation technique for an unknown smooth function: the function is represented in a high-dimensional spline basis, with the spline coefficients estimated in penalized form. Under the second framework, the unknown function is treated as a realization of a set of random spline coefficients, which are then predicted in a linear mixed model. We describe how bootstrap methods can be implemented under both frameworks, and we show, theoretically and through simulations and examples, that bootstrapping provides valid inference in both cases. Comparing the inference obtained under the two frameworks, we conclude that the mixed-model framework generally produces better results than the penalized-estimation framework. The bootstrap ideas are extended to hypothesis testing, where parametric components in a model are tested against nonparametric alternatives.
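To make the first (penalized-estimation) framework concrete, the following is a minimal sketch, not the authors' implementation: a penalized spline fit on a truncated-power basis combined with a residual bootstrap to obtain pointwise bands for the fitted curve. All function names, the choice of basis, the ridge-type penalty matrix, and the tuning values (`lam`, knot count, `B`) are illustrative assumptions.

```python
import numpy as np

def pspline_fit(x, y, knots, lam=0.1, degree=3):
    """Penalized spline fit with a truncated-power basis (illustrative sketch).

    The design matrix holds a polynomial part plus truncated-power terms at
    the knots; only the truncated-power coefficients are penalized, mirroring
    the penalized-estimation framework described in the abstract.
    """
    X = np.hstack([
        np.vander(x, degree + 1, increasing=True),           # polynomial part
        np.maximum(x[:, None] - knots[None, :], 0.0) ** degree,  # spline part
    ])
    D = np.diag([0.0] * (degree + 1) + [1.0] * len(knots))   # penalize splines only
    beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)       # ridge-type solve
    return X @ beta

def residual_bootstrap(x, y, knots, B=200, lam=0.1, rng=None):
    """Residual bootstrap: resample centered residuals, refit, collect curves."""
    rng = rng if rng is not None else np.random.default_rng(0)
    fhat = pspline_fit(x, y, knots, lam)
    resid = y - fhat
    resid -= resid.mean()                                    # center residuals
    curves = np.empty((B, len(x)))
    for b in range(B):
        y_star = fhat + rng.choice(resid, size=len(x), replace=True)
        curves[b] = pspline_fit(x, y_star, knots, lam)
    return fhat, curves

# Usage: pointwise 95% bootstrap band around a fit to noisy sine data.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, 200)
knots = np.quantile(x, np.linspace(0, 1, 17)[1:-1])          # 15 interior knots
fhat, curves = residual_bootstrap(x, y, knots, B=200, rng=rng)
lower, upper = np.percentile(curves, [2.5, 97.5], axis=0)
```

Under the second (mixed-model) framework the same spline coefficients would instead be treated as random effects and predicted, with the bootstrap resampling adapted accordingly; that variant is not sketched here.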