Publications by Year: 2020

2020
Ai, M., Wang, F., Yu, J., & Zhang, H. (2020). Optimal subsampling for large-scale quantile regression. Journal of Complexity. Abstract:
Subsampling is an effective method for dealing with massive data sets, as it can significantly reduce the computational cost of estimating model parameters. In this article, an efficient subsampling method is developed for large-scale quantile regression via the Poisson sampling framework, which resolves the memory constraint imposed by big data. Under mild conditions, large-sample properties of the estimator, including weak and strong consistency and asymptotic normality, are established. Furthermore, the optimal subsampling probabilities are derived according to the A-optimality criterion. It is shown that the estimator based on the optimal subsampling asymptotically achieves a smaller variance than that obtained by uniform random subsampling. The proposed method is illustrated and evaluated through numerical analyses on both simulated and real data sets.
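As a rough illustration of the Poisson subsampling scheme described in the abstract, the sketch below draws a pilot estimate from a small uniform subsample, forms pilot-based scores of the shape |τ − 1{y ≤ x′β̃}|·‖x‖ (a common form for A-optimality-motivated weights; the paper's exact probabilities may differ), and refits a weighted quantile regression on the retained rows. This is a minimal sketch on simulated data using statsmodels' QuantReg as the solver, not the authors' implementation.

```python
import numpy as np
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
n, d, tau, r = 100_000, 5, 0.5, 2_000

# Simulated full data (stand-in for a massive data set).
X = rng.standard_normal((n, d))
y = X @ np.ones(d) + rng.standard_normal(n)

# Step 1: pilot estimate from a small uniform subsample.
pilot = rng.choice(n, size=1_000, replace=False)
beta_pilot = QuantReg(y[pilot], X[pilot]).fit(q=tau).params

# Step 2: pilot-based scores |tau - 1{y <= x'b}| * ||x|| (an assumed,
# simplified form of the optimal weights), scaled so the expected
# Poisson subsample size is about r, and capped at 1.
score = np.abs(tau - (y <= X @ beta_pilot)) * np.linalg.norm(X, axis=1)
prob = np.minimum(r * score / score.sum(), 1.0)

# Step 3: Poisson subsampling -- each row is kept independently, so the
# probabilities could also be computed block by block in a streaming
# pass (done all at once here only for brevity).
keep = rng.random(n) < prob
w = 1.0 / prob[keep]

# Step 4: inverse-probability-weighted quantile fit. The check loss is
# positively homogeneous, so the weights fold into the data by rescaling
# the rows of (y, X); no weighted quantile solver is needed.
beta_sub = QuantReg(y[keep] * w, X[keep] * w[:, None]).fit(q=tau).params
print(beta_sub)
```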
Yu, J., Wang, H., Ai, M., & Zhang, H. (2020). Optimal Distributed Subsampling for Maximum Quasi-Likelihood Estimators with Massive Data. Journal of the American Statistical Association. Abstract:
Nonuniform subsampling methods are effective to reduce computational burden and maintain estimation efficiency for massive data. Existing methods mostly focus on subsampling with replacement due to its high computational efficiency. If the data volume is so large that nonuniform subsampling probabilities cannot be calculated all at once, then subsampling with replacement is infeasible to implement. This paper solves this problem using Poisson subsampling. We first derive optimal Poisson subsampling probabilities in the context of quasi-likelihood estimation under the A- and L-optimality criteria. For a practically implementable algorithm with approximated optimal subsampling probabilities, we establish the consistency and asymptotic normality of the resultant estimators. To deal with the situation that the full data are stored in different blocks or at multiple locations, we develop a distributed subsampling framework, in which statistics are computed simultaneously on smaller partitions of the full data. Asymptotic properties of the resultant aggregated estimator are investigated. We illustrate and evaluate the proposed strategies through numerical experiments on simulated and real data sets.
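A minimal sketch of the distributed variant, under assumptions: the full data sit in separate blocks; each block computes pilot-based scores (here the L-optimality-style form |y − p(x)|·‖x‖ for logistic regression, one member of the quasi-likelihood family) and performs Poisson subsampling locally. The paper aggregates block-level statistics; pooling the small raw subsamples into one weighted GLM fit, as done below, is a simplification used only to keep the example short.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, d, r, n_blocks = 200_000, 5, 4_000, 4

# Simulated full data, treated as n_blocks separate partitions.
X = rng.standard_normal((n, d))
y = rng.random(n) < 1.0 / (1.0 + np.exp(-X @ (0.5 * np.ones(d))))
blocks = np.array_split(np.arange(n), n_blocks)

# Pilot estimate from a small uniform subsample of one block.
pilot = rng.choice(blocks[0], size=1_000, replace=False)
beta_pilot = sm.GLM(y[pilot].astype(float), X[pilot],
                    family=sm.families.Binomial()).fit().params

sub_y, sub_X, sub_w = [], [], []
for idx in blocks:
    # L-optimality-style scores, computed block by block so no pass
    # over the full data is ever needed in one place.
    p = 1.0 / (1.0 + np.exp(-X[idx] @ beta_pilot))
    score = np.abs(y[idx] - p) * np.linalg.norm(X[idx], axis=1)
    prob = np.minimum((r / n_blocks) * score / score.sum(), 1.0)
    keep = rng.random(idx.size) < prob        # Poisson subsampling
    sub_y.append(y[idx][keep].astype(float))
    sub_X.append(X[idx][keep])
    sub_w.append(1.0 / prob[keep])            # inverse-probability weights

# Combine the block subsamples and fit one weighted quasi-likelihood
# (here: logistic) model on the pooled subsample.
yy, XX, ww = map(np.concatenate, (sub_y, sub_X, sub_w))
beta_hat = sm.GLM(yy, XX, family=sm.families.Binomial(),
                  freq_weights=ww).fit().params
print(beta_hat)
```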
Huang, H., Gao, Y., Zhang, H., & Li, B. (2020). Weighted Lasso Estimates for Sparse Logistic Regressions: Non-asymptotic Properties with Measurement Error. Acta Mathematica Scientia. Abstract:
When the system of interest is high-dimensional and classification performance is the focus, $\ell_{1}$-penalized logistic regression has become important and popular. However, Lasso estimates can be problematic when the penalties on different coefficients are all identical and unrelated to the data. We propose two types of weighted Lasso estimates whose weights depend on the covariates, derived via the McDiarmid inequality. Given sample size $n$ and covariate dimension $p$, the finite-sample behavior of the proposed methods with a diverging number of predictors is characterized by non-asymptotic oracle inequalities, such as bounds on the $\ell_{1}$-estimation error and the squared prediction error of the unknown parameters. We compare the performance of our methods with earlier weighted estimates on simulated data, and then apply these methods to real data analysis.
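The weighted-Lasso idea can be illustrated with a standard solver, because a penalty $\sum_j w_j |\beta_j|$ reduces to a plain $\ell_1$ penalty after rescaling each column of $X$ by its weight. The column-scale weights below are a hypothetical stand-in: the paper derives its weights from the McDiarmid inequality, which this sketch does not reproduce.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, p, s = 500, 200, 5

# Sparse logistic model: only the first s coefficients are nonzero,
# and the columns of X have deliberately unequal scales.
X = rng.standard_normal((n, p)) * rng.uniform(0.5, 3.0, size=p)
beta_true = np.zeros(p)
beta_true[:s] = 1.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(int)

# Covariate-dependent penalty weights (a column-scale proxy; the paper's
# McDiarmid-based weights are not reproduced here).
w = np.sqrt((X ** 2).mean(axis=0))

# A weighted l1 penalty sum_j w_j |beta_j| equals a plain Lasso after
# rescaling each column by its weight: fit on X_j / w_j, then map the
# coefficients back via beta_j = tilde_beta_j / w_j.
model = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
model.fit(X / w, y)
beta_hat = model.coef_.ravel() / w

print("selected predictors:", np.flatnonzero(beta_hat != 0))
```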
Fan, Y., Zhang, H., & Yan, T. (2020). Asymptotic Theory for Differentially Private Generalized β-models with Parameters Increasing. Statistics and Its Interface, 13(3), 385–398.
Ai, M., Yu, J., Zhang, H., & Wang, H. (2020). Optimal Subsampling for Big Data Regressions. Statistica Sinica.
Zhang, H., & Jia, J. (2020). Elastic-net Regularized High-dimensional Negative Binomial Regression: Consistency and Weak Signals Detection. Statistica Sinica.