Yingying Fan  
 
 

Software

ISEE
Implements the innovated scalable efficient estimation (ISEE) procedure (Fan and Lv, 2016) for estimating both the innovated data matrix and the precision matrix at ultra-large scale. [R code]

The ISEE framework has a range of applications involving the precision matrix (equivalently, the innovated data matrix), including:

Linear and nonlinear classification;
Dimension reduction;
Portfolio management;
Feature screening and simultaneous inference.
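
As a rough illustration of the idea behind ISEE, and not the interface of the R code above, the sketch below estimates a sparse precision matrix by nodewise lasso regressions with glmnet, used here as a stand-in for the blockwise scaled-Lasso fits of the actual procedure, and then forms the innovated data matrix. The simulation setup and all variable names are illustrative assumptions.

  ## Illustrative sketch only: nodewise lasso regressions as a stand-in for
  ## the blockwise scaled-Lasso fits used by the actual ISEE procedure.
  library(MASS)     # mvrnorm
  library(glmnet)   # lasso fits

  set.seed(1)
  n <- 100; p <- 50
  ## Tridiagonal precision matrix (sparse Gaussian graphical model)
  Omega <- diag(p); Omega[abs(row(Omega) - col(Omega)) == 1] <- 0.4
  X <- mvrnorm(n, mu = rep(0, p), Sigma = solve(Omega))

  Omega_hat <- matrix(0, p, p)
  for (j in 1:p) {
    fit  <- cv.glmnet(X[, -j], X[, j])
    beta <- as.matrix(coef(fit, s = "lambda.min"))[-1, 1]   # drop intercept
    res  <- X[, j] - predict(fit, X[, -j], s = "lambda.min")
    Omega_hat[j, j]  <- n / sum(res^2)                      # 1 / residual variance
    Omega_hat[j, -j] <- -beta * Omega_hat[j, j]
  }
  ## Symmetrize and form the innovated data matrix X %*% Omega_hat
  Omega_hat <- (Omega_hat + t(Omega_hat)) / 2
  X_tilde   <- X %*% Omega_hat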

Reference:
Fan, Y. and Lv, J. (2016).
Innovated scalable efficient estimation in ultra-large Gaussian graphical models.
The Annals of Statistics 44, 2098-2126.
 
IIS
Implements innovated interaction screening (IIS) followed by sparse quadratic discriminant analysis (SQDA), the two-step IIS-SQDA procedure (Fan, Kong, Li and Zheng, 2015) for interaction screening and selection in high-dimensional nonlinear classification. The innovated transformation of the feature vector is estimated by ISEE (Fan and Lv, 2016). [R code]
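
The sketch below is only a loose illustration of the two-step idea, not the IIS-SQDA code itself: it ranks innovated coordinates by the gap between their class-specific sample variances (an assumed stand-in for the actual IIS screening statistic) and then fits an ordinary QDA on the retained variables in place of the penalized SQDA step. The function screen_iis, the use of the class-0 population precision matrix, and the simulation setup are all hypothetical; in practice the precision matrix would be estimated by ISEE.

  ## Illustrative sketch only, not the IIS-SQDA code.
  library(MASS)   # mvrnorm, qda

  set.seed(1)
  n <- 100; p <- 30
  Omega0 <- diag(p); Omega0[abs(row(Omega0) - col(Omega0)) == 1] <- 0.3
  Omega1 <- Omega0
  Omega1[1:2, 1:2] <- matrix(c(2, 0.8, 0.8, 2), 2)   # classes differ in this block
  X <- rbind(mvrnorm(n, rep(0, p), solve(Omega0)),
             mvrnorm(n, rep(0, p), solve(Omega1)))
  y <- rep(0:1, each = n)

  screen_iis <- function(X, y, Omega, d = 5) {
    X_tilde <- X %*% Omega                # innovated transformation
    v0 <- apply(X_tilde[y == 0, ], 2, var)
    v1 <- apply(X_tilde[y == 1, ], 2, var)
    order(abs(v1 - v0), decreasing = TRUE)[1:d]   # keep top-d coordinates
  }

  keep <- screen_iis(X, y, Omega0, d = 5)
  fit  <- qda(X[, keep], grouping = factor(y))      # plain QDA as a stand-in
  mean(predict(fit, X[, keep])$class == y)          # in-sample accuracy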

References:
Fan, Y., Kong, Y., Li, D. and Zheng, Z. (2015).
Innovated interaction screening for high-dimensional nonlinear classification.
The Annals of Statistics 43, 1243-1272.

Fan, Y. and Lv, J. (2016).
Innovated scalable efficient estimation in ultra-large Gaussian graphical models.
The Annals of Statistics 44, 2098-2126.
 
RTPS
Fits high-dimensional sparse generalized linear models (linear, logistic, and Poisson regression) with regularization methods in the thresholded parameter space (RTPS; Fan and Lv, 2013), where the connections and differences among a spectrum of regularization methods are established. [Matlab code]
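
The sketch below is only a crude way to mimic working in the thresholded parameter space, where nonzero coefficients are bounded away from zero: it hard-thresholds a penalized logistic fit and refits on the retained support. It is not the RTPS algorithm implemented in the Matlab code, and the threshold level and simulation setup are illustrative assumptions.

  ## Illustrative sketch only: hard-threshold a lasso-type logistic fit and
  ## refit on the support; not the RTPS algorithm itself.
  library(glmnet)

  set.seed(2)
  n <- 200; p <- 100
  X <- matrix(rnorm(n * p), n, p)
  beta <- c(rep(1.5, 5), rep(0, p - 5))              # sparse true signal
  y <- rbinom(n, 1, plogis(drop(X %*% beta)))

  fit  <- cv.glmnet(X, y, family = "binomial")
  bhat <- as.matrix(coef(fit, s = "lambda.min"))[-1, 1]
  tau  <- 0.1                                        # threshold level (assumed)
  supp <- which(abs(bhat) >= tau)                    # coefficients kept
  refit <- glm(y ~ X[, supp], family = binomial)     # refit on the support
  coef(refit)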

Reference:
Fan, Y. and Lv, J. (2013).
Asymptotic equivalence of regularization methods in thresholded parameter space.
Journal of the American Statistical Association 108, 1044-1061.
 
SICA
Fits high-dimensional sparse linear regression models and produces solutions for sparse recovery using the iterative coordinate ascent (ICA) algorithm (Fan and Lv, 2011) with the smooth integration of counting and absolute deviation (SICA) method (Lv and Fan, 2009). SICA is a regularization method for high-dimensional sparse modeling and sparse recovery. It provides a family of concave penalties that form a smooth homotopy between the L0 and L1 penalties, where the former is the target penalty for sparse recovery and the latter is used in L1-regularization methods such as the Lasso. [Matlab code]
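
To make the homotopy concrete, here is a short R snippet (illustrative only, not part of the Matlab code) that evaluates the SICA penalty rho_a(t) = (a + 1)|t| / (a + |t|) for several values of the shape parameter a: small a mimics the L0 penalty I(t != 0), while large a approaches the L1 penalty |t|.

  ## SICA penalty family: a smooth homotopy between L0 and L1
  sica <- function(t, a) (a + 1) * abs(t) / (a + abs(t))

  t <- seq(-2, 2, length.out = 401)
  plot(t, abs(t), type = "l", lty = 2, ylab = "penalty", main = "SICA homotopy")
  for (a in c(0.01, 0.1, 1, 10)) lines(t, sica(t, a))   # small a ~ L0, large a ~ L1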

References:
Lv, J. and Fan, Y. (2009).
A unified approach to model selection and sparse recovery using regularized least squares.
The Annals of Statistics 37, 3498-3528.

Fan, J. and Lv, J. (2011).
Nonconcave penalized likelihood with NP-dimensionality.
IEEE Transactions on Information Theory 57, 5467-5484.
 
SIRS
Implements the sequentially and iteratively reweighted squares (SIRS) algorithm (Lv and Fan, 2009), which applies SICA for sparse recovery. [Matlab code]
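
The sketch below is an iteratively reweighted ridge scheme whose weights come from a local quadratic approximation of the SICA penalty; it is only in the spirit of reweighted-squares sparse recovery and is not the SIRS update of Lv and Fan (2009) implemented in the Matlab code. The function name sirs_like, the tuning constants, and the simulation setup are illustrative assumptions.

  ## Illustrative sketch only: reweighted ridge with SICA-based LQA weights,
  ## a stand-in for (not a reproduction of) the SIRS algorithm.
  sica_deriv <- function(t, a) a * (a + 1) / (a + t)^2       # rho_a'(t), t >= 0

  sirs_like <- function(X, y, lambda = 0.1, a = 1, n_iter = 50, eps = 1e-6) {
    XtX <- crossprod(X); Xty <- crossprod(X, y)
    beta <- rep(1, ncol(X))                                  # flat starting value
    for (k in 1:n_iter) {
      w <- sica_deriv(abs(beta), a) / (abs(beta) + eps)      # LQA weights
      beta <- drop(solve(XtX + lambda * diag(w), Xty))
    }
    beta[abs(beta) < 1e-4] <- 0                              # report sparsity
    beta
  }

  ## Example: noisy sparse linear model with p > n
  set.seed(3)
  n <- 100; p <- 200
  X <- matrix(rnorm(n * p), n, p)
  beta0 <- c(3, -2, 1.5, rep(0, p - 3))
  y <- drop(X %*% beta0) + 0.1 * rnorm(n)
  round(sirs_like(X, y)[1:6], 2)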

Reference:
Lv, J. and Fan, Y. (2009).
A unified approach to model selection and sparse recovery using regularized least squares.
The Annals of Statistics 37, 3498-3528.
 
Note: Please contact me if you are interested in optimizing these codes or developing packages based on them.