Conquer: Convolution-type smoothed quantile regression

Posted by: Ji Jie · Date: 2020-05-21 · Views: 380

Speaker: Wen-Xin Zhou, UC San Diego
Host: Fengnan Gao, Fudan University
Time: 10:00-11:00, May 28, 2020
Zoom meeting ID: 698 7134 2188
Code: 925404
Abstract

In this work, we provide a comprehensive study of a convolution-type smoothing approach to achieving an adequate approximation for computation and inference in quantile regression. The proposed estimator, which we refer to as conquer, turns the non-differentiable quantile loss function into a twice-differentiable and locally strongly convex surrogate, which admits a fast and scalable Barzilai-Borwein gradient-based algorithm to perform optimization, and a Rademacher multiplier bootstrap method for statistical inference. We establish non-asymptotic error bounds on the Bahadur-Kiefer linearization, from which we show that the asymptotic normality of the conquer estimator holds under a weaker requirement on the dimension of the predictors than needed for the exact quantile regression estimator. Numerical studies confirm the conquer estimator as a practical and reliable approach to large-scale inference for quantile regression. This talk is based on joint work with Xuming He, Xiaoou Pan, and Kean Ming Tan. The R package 'conquer' is available on CRAN.
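The core idea described above — convolving the check loss with a kernel so that it becomes smooth, then minimizing it with Barzilai-Borwein gradient steps — can be sketched as follows. This is a minimal NumPy illustration assuming a Gaussian smoothing kernel and a heuristic bandwidth choice; it is not the implementation in the CRAN 'conquer' package, and the function name `conquer_fit` and its defaults are hypothetical.

```python
import math
import numpy as np

_erf = np.vectorize(math.erf)

def norm_cdf(z):
    # Standard normal CDF via the error function (avoids a SciPy dependency).
    return 0.5 * (1.0 + _erf(z / math.sqrt(2.0)))

def conquer_fit(X, y, tau=0.5, h=None, max_iter=2000, tol=1e-7):
    # Convolution-smoothed quantile regression with a Gaussian kernel.
    # Smoothing the check loss rho_tau(u) = u*(tau - 1{u<0}) with a Gaussian
    # kernel of bandwidth h gives a twice-differentiable surrogate whose
    # derivative is Phi(u/h) - (1 - tau).  A sketch only, not the reference
    # implementation from the CRAN 'conquer' package.
    n, p = X.shape
    if h is None:
        # Heuristic bandwidth of order ((p + log n) / n)^{2/5} (an assumption).
        h = max(0.05, ((p + math.log(n)) / n) ** 0.4)

    def smoothed_grad(beta):
        r = y - X @ beta  # residuals
        # Gradient of (1/n) * sum of smoothed check losses over the sample.
        return X.T @ ((1.0 - tau) - norm_cdf(r / h)) / n

    beta = np.zeros(p)
    g = smoothed_grad(beta)
    alpha = 1.0
    for _ in range(max_iter):
        beta_new = beta - alpha * g
        g_new = smoothed_grad(beta_new)
        s, d = beta_new - beta, g_new - g
        sd = float(s @ d)
        # Barzilai-Borwein step size (inverse local curvature), clipped for safety.
        alpha = min(max(float(s @ s) / sd, 1e-3), 10.0) if sd > 1e-14 else 1.0
        beta, g = beta_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return beta

# Demo: median regression (tau = 0.5) with symmetric noise, so the
# conditional median equals the conditional mean and the fit should
# recover the true slopes.
rng = np.random.default_rng(0)
n, beta_true = 2000, np.array([1.0, 2.0, -1.0])
X = rng.standard_normal((n, 3))
y = X @ beta_true + rng.standard_normal(n)
beta_hat = conquer_fit(X, y, tau=0.5)
print(beta_hat)
```

For inference, the abstract's Rademacher multiplier bootstrap would rerun the minimization with randomly perturbed per-observation weights; the CRAN 'conquer' package provides both the estimator and bootstrap confidence intervals.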

Bio

Wenxin Zhou is an Assistant Professor in the Department of Mathematics at the University of California, San Diego. Prior to joining UCSD, Dr. Zhou was a postdoctoral researcher at the University of Melbourne and then at Princeton University, working with Aurore Delaigle and Jianqing Fan. Dr. Zhou’s research uses tools and ideas from probability theory, functional and geometric analysis, and numerical optimization to understand high-dimensional and large-scale estimation and inference problems, with a particular focus on issues such as robustness, heterogeneity, model uncertainty, and statistical and computational trade-offs.