Strong consistency of an estimator by the truncated singular value decomposition for an errors-in-variables regression model with collinearity
29 November 2023
Proves strong consistency for solutions to rank-constrained total least squares (TLS) regression
Extends prior analysis that only covered minimal norm TLS solutions
Allows some rows of data matrices to be error-free
Uses matrix perturbation theory and Rayleigh-Ritz projections
Generalizes consistency proofs for standard unconstrained TLS
Strong consistency of rank-constrained total least squares regression
This paper proves that rank-constrained total least squares regression yields estimators that converge almost surely to the true parameter values, even when the explanatory variables are observed with errors. This establishes strong (almost sure) consistency for a broader set of solutions beyond the minimal norm solution typically analyzed. The proof relies on matrix perturbation theory and properties of orthogonal projections derived from the Rayleigh-Ritz procedure.
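In the unconstrained single-response case, the TLS estimator the paper generalizes can be computed from the SVD of the augmented data matrix [X | y]: the coefficient vector comes from the right singular vector associated with the smallest singular value (equivalently, a truncated-SVD fit). A minimal NumPy sketch under illustrative assumptions (the dimensions, noise levels, and coefficient values below are invented for the example, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Errors-in-variables setup: both X and y are observed with noise.
n_samples, n_vars = 20000, 3
beta_true = np.array([1.5, -2.0, 0.5])
X_true = rng.normal(size=(n_samples, n_vars))
y_true = X_true @ beta_true
X_obs = X_true + 0.1 * rng.normal(size=X_true.shape)
y_obs = y_true + 0.1 * rng.normal(size=n_samples)

# TLS via the SVD of the augmented matrix Z = [X_obs | y_obs]:
# the estimator is read off the right singular vector belonging to
# the smallest singular value, then normalized so the y-component is -1.
Z = np.column_stack([X_obs, y_obs])
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
v = Vt[-1]                        # right singular vector for sigma_min
beta_tls = -v[:n_vars] / v[n_vars]

print(beta_tls)
```

With this sample size the estimate lands close to `beta_true`, illustrating the consistency behavior the paper proves; the rank-constrained variants studied in the paper restrict the truncation further, which this sketch does not cover.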