Title:
Degrees-of-freedom penalized piecewise regression.
Authors:
Volz, Stefan; Storath, Martin; Weinmann, Andreas
Source:
Information & Inference: A Journal of the IMA. Mar 2025, Vol. 14, Issue 1, p. 1-32. 32 pp.
Database:
Academic Search Index


Many popular piecewise regression models rely on minimizing a cost function on the model fit with a linear penalty on the number of segments. However, this penalty does not account for the varying complexities of the model functions on the segments, which can lead to overfitting when models of varying complexity, such as polynomials of different degrees, are used. In this work, we improve on this approach by instead penalizing the sum of the degrees of freedom over all segments, which we call degrees-of-freedom penalized piecewise regression. We show that the solutions of the resulting minimization problem are unique for almost all input data in a least squares setting. We develop a fast algorithm that not only computes a minimizer but also determines an optimal hyperparameter (in the sense of rolling cross validation with the one standard error rule) exactly. This eliminates manual hyperparameter selection. Our method supports optional user parameters for incorporating domain knowledge. We provide open-source Python/Rust code for the piecewise polynomial least squares case, which can be extended to further models. We demonstrate the practical utility through a simulation study and through applications to real data. A constrained variant of the proposed method gives state-of-the-art results on the Turing benchmark for unsupervised changepoint detection. [ABSTRACT FROM AUTHOR]
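To illustrate the penalty structure described in the abstract, the following is a minimal Python sketch of a dynamic-programming solver for the piecewise polynomial least squares case with a fixed penalty weight gamma. The function name, the brute-force O(n^2) recursion, and the use of numpy.polyfit are assumptions made for illustration only; they are not the authors' implementation, which additionally determines the hyperparameter exactly via rolling cross validation with the one standard error rule.

import numpy as np

def dof_penalized_fit(x, y, gamma, max_degree=3):
    # Illustrative sketch (not the authors' code): piecewise polynomial least
    # squares with a penalty of gamma per degree of freedom; a degree-d
    # polynomial segment contributes d+1 degrees of freedom.
    n = len(x)

    def seg_cost(i, j):
        # Best penalized cost of fitting one polynomial to the points x[i..j].
        best = np.inf
        for d in range(max_degree + 1):
            if j - i < d:                      # need at least d+1 points for degree d
                break
            coeffs = np.polyfit(x[i:j + 1], y[i:j + 1], d)
            resid = y[i:j + 1] - np.polyval(coeffs, x[i:j + 1])
            best = min(best, np.sum(resid ** 2) + gamma * (d + 1))
        return best

    # Bellman recursion over right endpoints: B[j] is the best penalized
    # cost of a piecewise fit to the first j data points.
    B = np.full(n + 1, np.inf)
    B[0] = 0.0
    split = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):                     # last segment covers x[i..j-1]
            c = B[i] + seg_cost(i, j - 1)
            if c < B[j]:
                B[j], split[j] = c, i

    # Backtrack the segment start indices (the changepoints).
    cps, j = [], n
    while j > 0:
        if split[j] > 0:
            cps.append(split[j])
        j = split[j]
    return sorted(cps), B[n]

For example, dof_penalized_fit(x, y, gamma=1.0) returns the recovered changepoint indices and the total penalized cost for that fixed gamma, whereas the method in the paper selects the penalty weight automatically.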