Software
SOFAR
Implements the sparse orthogonal factor regression (SOFAR) method (Uematsu, Fan, Chen, Lv and Lin, 2019) for learning large-scale response-predictor association network structures via layers of sparse latent factors ranked by importance. [R package]
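Schematically (the exact penalties and tuning are as in the paper), SOFAR writes the coefficient matrix of the multivariate regression model Y = XC + E through a sparse singular value decomposition C = U D V^T and solves an orthogonality-constrained, sparsity-penalized least-squares problem:
\[
\min_{U, D, V} \ \tfrac{1}{2}\,\| Y - X U D V^{\top} \|_F^2
+ \lambda_d \| D \|_1 + \lambda_u\, \rho(U D) + \lambda_v\, \rho(V D)
\quad \text{subject to} \quad U^{\top} U = I_r,\ V^{\top} V = I_r,
\]
where D is diagonal and collects the factor importances, the columns of U and V are the sparse left and right loadings defining the layers of the association network, and \rho(\cdot) denotes an entrywise sparsity-inducing penalty (e.g., the entrywise L1 norm).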
Reference:
Uematsu, Y., Fan, Y., Chen, K., Lv, J. and Lin, W. (2019).
SOFAR: large-scale association network learning.
IEEE Transactions on Information Theory 65, 4924-4939.

ISEE
Implements the innovated scalable efficient estimation (ISEE) procedure (Fan and Lv, 2016) for scalable estimation of both the innovated data matrix and the precision matrix at ultra-large scale; the key identity behind the construction is sketched after the list below. [R code]
The ISEE framework has a range of applications involving the precision matrix (equivalently, the innovated data matrix), including:
Linear and nonlinear classifications;
Dimension reduction;
Portfolio management;
Feature screening and simultaneous inference.
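A brief sketch of the key identity: for rows x ~ N(0, \Sigma) with precision matrix \Omega = \Sigma^{-1}, the innovated data matrix satisfies
\[
\tilde{X} = X\,\Omega, \qquad \operatorname{cov}(\Omega x) = \Omega\,\Sigma\,\Omega = \Omega,
\]
so recovering \tilde{X} (which ISEE does scalably, roughly by regressing small blocks of variables on the remaining ones) immediately yields a precision matrix estimate such as \hat{\Omega} = n^{-1} \tilde{X}^{\top} \tilde{X}, up to centering and thresholding; the exact construction is given in the paper.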
Reference:
Fan, Y. and Lv, J. (2016).
Innovated scalable efficient estimation in ultra-large Gaussian graphical models.
The Annals of Statistics 44, 2098-2126.

RTPS
Fits high-dimensional sparse generalized linear models (linear, logistic, and Poisson) with regularization methods in thresholded parameter space (RTPS, Fan and Lv, 2013), where the connections and differences among a spectrum of regularization methods are established. [Matlab code]
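Schematically, the estimators covered are penalized likelihood estimators restricted to a thresholded parameter space, in which every nonzero coefficient must exceed a threshold (the precise space and regularity conditions are as in the paper):
\[
\hat{\beta} \in \arg\min_{\beta \in \mathcal{B}_{\tau}} \Big\{ -n^{-1} \ell_n(\beta) + \sum_{j=1}^{p} p_{\lambda}(|\beta_j|) \Big\},
\qquad
\mathcal{B}_{\tau} = \big\{ \beta \in \mathbb{R}^{p} : \beta_j = 0 \ \text{or}\ |\beta_j| \ge \tau \ \text{for each } j \big\},
\]
where \ell_n is the GLM log-likelihood (linear, logistic, or Poisson) and p_{\lambda} ranges over the spectrum of penalties, from L0 through concave penalties to L1, whose asymptotic equivalence is established on this space.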
Reference:
Fan, Y. and Lv, J. (2013).
Asymptotic equivalence of regularization methods in thresholded parameter space.
Journal of the American Statistical Association 108, 1044-1061.

SAHR
Fits high-dimensional sparse additive hazards regression (SAHR, Lin and Lv, 2013) with popular regularization methods. [R package]
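For orientation, the additive hazards model takes the conditional hazard to be a baseline plus a linear term in the covariates, and the sparse estimators minimize a least-squares-type loss derived from the pseudo-score equation plus a penalty (schematically; the exact quantities are in the paper):
\[
\lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z(t),
\qquad
\hat{\beta} \in \arg\min_{\beta} \Big\{ \tfrac{1}{2}\, \beta^{\top} V_n\, \beta - b_n^{\top} \beta + \sum_{j=1}^{p} p_{\lambda}(|\beta_j|) \Big\},
\]
where V_n and b_n are sample quantities built from the observed counting processes and at-risk indicators.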
Reference:
Lin, W. and Lv, J. (2013).
High-dimensional sparse additive hazards regression.
Journal of the American Statistical Association 108, 247-264.

SICA/ICA
Fits high-dimensional sparse linear regression and also produces solutions to the sparse recovery problem using the iterative coordinate ascent (ICA) algorithm (Fan and Lv, 2011), with the smooth integration of counting and absolute deviation (SICA) method (Lv and Fan, 2009). SICA is a regularization method for high-dimensional sparse modeling and sparse recovery. It provides a family of concave penalties that give a smooth homotopy between the L0- and L1-penalties, where the former is the target penalty for sparse recovery and the latter is used in L1-regularization methods such as the Lasso. [Matlab code]
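Concretely, with shape parameter a > 0 (and up to the regularization parameter), the SICA penalty is
\[
\rho_a(t) = \frac{(a+1)\,t}{a + t}, \qquad t \ge 0,
\]
which tends to the L0 penalty I(t \neq 0) as a \to 0+ and to the L1 penalty t as a \to \infty, giving the smooth homotopy between the two extremes.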
References:
Lv, J. and Fan, Y. (2009).
A unified approach to model selection and sparse recovery using regularized least squares.
The Annals of Statistics 37, 3498-3528.
Fan, J. and Lv, J. (2011).
Nonconcave penalized likelihood with NP-dimensionality.
IEEE Transactions on Information Theory 57, 5467-5484.

SIRS
Implements the sequentially and iteratively reweighted squares (SIRS) algorithm (Lv and Fan, 2009) for sparse recovery with SICA. [Matlab code]
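A schematic of one reweighting step in this style, for the noiseless recovery setting y = X\beta (the exact weight updates and the sequential screening of detected signals are detailed in the paper): given current weights d_j^{(k)} > 0,
\[
\beta^{(k+1)} = \arg\min_{\beta :\, X\beta = y} \ \sum_{j=1}^{p} \frac{\beta_j^{2}}{d_j^{(k)}}
= D_k X^{\top} \big( X D_k X^{\top} \big)^{-1} y,
\qquad D_k = \operatorname{diag}\big(d_1^{(k)}, \ldots, d_p^{(k)}\big),
\]
assuming X D_k X^{\top} is invertible, with the weights then updated from \beta^{(k+1)} so that coordinates resembling strong signals are shrunk less, steering the iterates toward the sparse solution favored by SICA.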
Reference:
Lv, J. and Fan, Y. (2009).
A unified approach to model selection and sparse recovery using regularized least squares.
The Annals of Statistics 37, 3498-3528.

Note: Please contact me if you are interested in optimizing these codes or developing packages based on them.