Keywords: Hawkes processes, Lasso procedure, multivariate counting process

1 Introduction

The Lasso, proposed by Tibshirani (1996), is a well-established method that achieves sparsity of an estimated parameter vector via \(\ell_1\)-penalization.

References

Arslan O (2012) Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression. Comput Stat Data Anal 56:1952–1965
Bai J (1995) Least absolute deviation estimation of a shift. Econom Theory 11(3):403–436
Bai J (1998) Estimation of multiple-regime regressions with least absolutes deviation. J Stat Plan Inference 74(1):103–134
Boysen L, Kempe A, Liebscher V, Munk A, Wittich O (2009) Consistencies and rates of convergence of jump penalized least squares estimators. Ann Stat 37(1):157–183
Chan NH, Yau CY, Zhang R (2014) Group LASSO for structural break time series. J Am Stat Assoc 109(506):590–599
Chow GC (1960) Tests of equality between sets of coefficients in two linear regressions. Econometrica 28(3):591–605
Ciuperca G (2011) Penalized least absolute deviations estimation for nonlinear model with change-points. Stat Pap 52(2):371–390
Ciuperca G (2014) Model selection by LASSO methods in a change-point model. Stat Pap 55(4):349–374
Donoho D, Johnstone I (1995) Adapting to unknown smoothness via wavelet shrinkage. J Am Stat Assoc 90(432):1200–1224
Efron B, Hastie T, Johnstone I, Tibshirani R (2004) Least angle regression. Ann Stat 32(2):407–451
Fan J, Li R (2001) Variable selection via nonconcave penalized likelihood and its oracle properties. J Am Stat Assoc 96(456):1348–1360
Fan J, Lv J (2008) Sure independence screening for ultrahigh dimensional feature space.
Gao X, Huang J (2010a) Asymptotic analysis of high-dimensional LAD regression with lasso.
Gao X, Huang J (2010b) A robust penalized method for the analysis of noisy DNA copy number data.
Harchaoui Z, Lévy-Leduc C (2008) Catching change-points with lasso.
Harchaoui Z, Lévy-Leduc C (2010) Multiple change-point estimation with a total variation penalty.
Hawkins DM (1977) Testing a sequence of observations for a shift in location.
Huang T, Wu B, Lizardi P, Zhao H (2005) Detection of DNA copy number alterations using penalized least squares regression.
Knight K, Fu WJ (2000) Asymptotics for Lasso-type estimators.
Lavielle M, Moulines E (2000) Least-squares estimation of an unknown number of shifts in a time series.
Little MA, Jones NS (2011a) Generalized methods and solvers for noise removal from piecewise constant signals.
Little MA, Jones NS (2011b) Generalized methods and solvers for noise removal from piecewise constant signals.
Tibshirani R (1996) Regression shrinkage and selection via the Lasso.
Wang H, Li G, Jiang G (2007) Robust regression shrinkage and consistent variable selection through the LAD-Lasso.
Wang L (2013) The \(L_1\) penalized LAD estimator for high dimensional linear regression.
Xu J, Ying Z (2010) Simultaneous estimation and variable selection in median regression using Lasso-type penalty.
Yao Y, Au ST (1989) Least-squares estimation of a step function.
Zhang B, Geng J, Lai L (2015) Multiple change-points estimation in linear regression models via sparse group Lasso.
Zhang C (2010) Nearly unbiased variable selection under minimax concave penalty.
Zhao P, Yu B (2006) On model selection consistency of Lasso.
Zou H (2006) The adaptive Lasso and its oracle properties.
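As a concrete illustration of how \(\ell_1\)-penalization produces exact zeros (this sketch is not part of the original article): in the special case of an orthonormal design, the Lasso estimate is obtained by soft-thresholding the least-squares coefficients, so any coefficient whose magnitude falls below the penalty level is set exactly to zero. A minimal NumPy sketch, with hypothetical coefficient values:

```python
import numpy as np

def soft_threshold(z, lam):
    """Elementwise soft-thresholding: S_lam(z) = sign(z) * max(|z| - lam, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

# For an orthonormal design X (X'X = I), the Lasso minimizer of
# (1/2)||y - Xb||^2 + lam*||b||_1 is the soft-thresholded OLS estimate;
# entries with |b_ols| <= lam become exactly zero (sparsity).
b_ols = np.array([3.0, -0.2, 0.5, -4.0, 0.05])  # hypothetical OLS coefficients
b_lasso = soft_threshold(b_ols, lam=0.6)

print(b_lasso)                     # three of the five coefficients are exactly 0
print(np.count_nonzero(b_lasso))   # 2
```

The surviving coefficients are also shrunk toward zero by \(\lambda\), which is the bias that motivates the adaptive Lasso (Zou 2006) and concave penalties (Zhang 2010) cited above.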