Research Output by Year: 2020

2020
Huang, H., Gao, Y., Zhang, H., & Li, B. (2020). Weighted Lasso Estimates for Sparse Logistic Regressions: Non-asymptotic Properties with Measurement Error. Acta Mathematica Scientia.
Abstract: When we are interested in high-dimensional systems and focus on classification performance, the $\ell_{1}$-penalized logistic regression becomes important and popular. However, the Lasso estimates can be problematic when the penalties on different coefficients are all the same and unrelated to the data. We propose two types of weighted Lasso estimates, with covariate-dependent weights derived via the McDiarmid inequality. Given sample size $n$ and covariate dimension $p$, the finite-sample behavior of our proposed methods with a diverging number of predictors is characterized by non-asymptotic oracle inequalities for the $\ell_{1}$-estimation error and the squared prediction error of the unknown parameters. We compare the performance of our methods with previously proposed weighted estimates on simulated data, and then apply these methods to real data analysis.
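A minimal sketch of the general idea of a covariate-dependent weighted Lasso for logistic regression, assuming Python with NumPy and scikit-learn. The weight vector used here is a placeholder choice and is not the McDiarmid-inequality-based weighting constructed in the paper; the weighted $\ell_{1}$ penalty is absorbed into an ordinary one by column-wise rescaling of the design matrix.

```python
# Minimal sketch: generic covariate-dependent weighted Lasso for logistic
# regression via column rescaling. The weights below are a placeholder choice,
# NOT the McDiarmid-inequality-based weights constructed in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

def weighted_lasso_logistic(X, y, weights, lam):
    """Minimize -loglik(beta)/n + lam * sum_j weights[j] * |beta_j|.

    Dividing column j of X by weights[j] turns the weighted L1 penalty into an
    ordinary one; the fitted coefficients are rescaled back afterwards.
    """
    n = X.shape[0]
    X_scaled = X / weights                        # column-wise rescaling
    # scikit-learn's L1 objective is ||beta||_1 + C * sum_i loss_i, so C = 1/(n*lam)
    model = LogisticRegression(penalty="l1", C=1.0 / (n * lam),
                               solver="liblinear")
    model.fit(X_scaled, y)
    return model.coef_.ravel() / weights          # undo the rescaling

# Toy usage with a hypothetical weight choice (column scales of X)
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta_true = np.r_[np.ones(5), np.zeros(p - 5)]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(int)
weights = np.sqrt((X ** 2).mean(axis=0))          # placeholder, data-dependent
beta_hat = weighted_lasso_logistic(X, y, weights, lam=0.05)
print("selected coordinates:", np.flatnonzero(np.abs(beta_hat) > 1e-6))
```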
Yu, J., Wang, H., Ai, M., & Zhang, H. (2020). Optimal Distributed Subsampling for Maximum Quasi-Likelihood Estimators with Massive Data. Journal of the American Statistical Association. Link
Abstract: Nonuniform subsampling methods are effective in reducing computational burden while maintaining estimation efficiency for massive data. Existing methods mostly focus on subsampling with replacement due to its high computational efficiency. If the data volume is so large that nonuniform subsampling probabilities cannot be calculated all at once, then subsampling with replacement is infeasible to implement. This paper solves this problem using Poisson subsampling. We first derive optimal Poisson subsampling probabilities in the context of quasi-likelihood estimation under the A- and L-optimality criteria. For a practically implementable algorithm with approximated optimal subsampling probabilities, we establish the consistency and asymptotic normality of the resultant estimators. To deal with the situation where the full data are stored in different blocks or at multiple locations, we develop a distributed subsampling framework, in which statistics are computed simultaneously on smaller partitions of the full data. Asymptotic properties of the resultant aggregated estimator are investigated. We illustrate and evaluate the proposed strategies through numerical experiments on simulated and real data sets.
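As a rough illustration of the one-pass Poisson subsampling idea (not the A-/L-optimal probabilities or the distributed aggregation developed in the paper), the sketch below keeps each observation independently with probability proportional to a placeholder importance score computed from a pilot estimate, then fits an inverse-probability-weighted logistic regression on the retained subsample; it assumes Python with NumPy and scikit-learn.

```python
# Minimal sketch: one-pass Poisson subsampling with inverse-probability weights.
# The importance score below is a placeholder, NOT the A-/L-optimal subsampling
# probabilities derived in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

def poisson_subsample_fit(X, y, r, pilot_beta, rng):
    """Keep row i independently with probability pi_i = min(1, r * p_i), then
    fit a weighted logistic regression with weights 1 / pi_i."""
    mu = 1.0 / (1.0 + np.exp(-X @ pilot_beta))            # pilot fitted probabilities
    scores = np.abs(y - mu) * np.linalg.norm(X, axis=1)   # placeholder importance score
    p = scores / scores.sum()
    pi = np.minimum(1.0, r * p)
    keep = rng.random(X.shape[0]) < pi                    # single pass over the data
    w = 1.0 / pi[keep]                                    # inverse inclusion probabilities
    model = LogisticRegression(C=1e8, solver="lbfgs",     # large C ~ no regularization
                               fit_intercept=False, max_iter=2000)
    model.fit(X[keep], y[keep], sample_weight=w)
    return model.coef_.ravel(), int(keep.sum())

# Toy usage: pilot estimate from a small uniform subsample, then Poisson subsampling
rng = np.random.default_rng(1)
n, d = 100_000, 10
X = rng.normal(size=(n, d))
beta_true = np.linspace(0.5, -0.5, d)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(int)
pilot_idx = rng.choice(n, size=1000, replace=False)
pilot = LogisticRegression(C=1e8, fit_intercept=False,
                           max_iter=2000).fit(X[pilot_idx], y[pilot_idx])
beta_hat, m = poisson_subsample_fit(X, y, r=5000,
                                    pilot_beta=pilot.coef_.ravel(), rng=rng)
print(f"kept {m} rows;", np.round(beta_hat, 2))
```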
Fan, Y., Zhang, H., & Yan, T. (2020). Asymptotic Theory for Differentially Private Generalized β-models with Parameters Increasing. Statistics and Its Interface, 13(3), 385–398. Link
Ai, M., Yu, J., Zhang, H., & Wang, H. (2020). Optimal Subsampling for Big Data Regressions. Statistica Sinica. Link