Smooth quasi-Newton methods for nonsmooth optimization
Speaker: Jiayi Guo, Shanghai University of Finance and Economics
Host: Rujun Jiang, School of Data Science, Fudan University
Time: 16:00-17:00, Dec 2, 2019
Location: Room 102, Zibin Building, Fudan University
Abstract: Sporadic informal observations over several decades (most recently in Lewis-Overton, 2013) suggest that quasi-Newton methods for smooth optimization can also work surprisingly well on nonsmooth functions. This talk explores the phenomenon from several perspectives. First, we show how Powell's original 1976 BFGS convergence proof for smooth convex functions in fact extends to some nonsmooth settings. Second, we study how repeated BFGS updating at a single fixed point can serve as a separation oracle for the subdifferential. Lastly, we compare experimentally the two most popular quasi-Newton updates, BFGS and SR1, in the nonsmooth setting.
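The phenomenon the abstract refers to is easy to reproduce. Below is a minimal sketch (an illustration, not material from the talk) that applies SciPy's smooth BFGS implementation to the nonsmooth convex function f(x) = ||x||_1; the choice of test function, and the use of sign(x) as a subgradient, are our own illustrative assumptions.

    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        # Nonsmooth convex test function: the l1 norm,
        # nondifferentiable wherever any coordinate is 0.
        return np.abs(x).sum()

    def subgrad(x):
        # A subgradient of f; it equals the true gradient
        # wherever f is differentiable.
        return np.sign(x)

    x0 = np.random.default_rng(0).standard_normal(10)
    res = minimize(f, x0, jac=subgrad, method="BFGS")
    print(res.x)    # typically close to the minimizer x* = 0
    print(res.fun)  # objective value near the minimum, 0

Even though f is nondifferentiable at its minimizer, the BFGS iterates typically drive the objective close to its minimum value before the line search stalls, which is the kind of behavior the talk analyzes.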
Bio: Jiayi Guo is an assistant professor at Shanghai University of Finance and Economics. Dr. Guo received his Ph.D. from the School of Operations Research and Information Engineering at Cornell University (2018). Before coming to Cornell, he received dual B.S. degrees in Mathematics and Computer Science (2012) from the University of Illinois Urbana-Champaign.

Broadly conceived, Dr. Guo's research area is optimization. His current work explores variants of iterative methods for continuous optimization of nonsmooth functions and their applications in machine learning. More generally, he is interested in the interplay between optimization, simulation, and numerical solver development.

Dr. Guo also has work experience at eBay and Argonne National Laboratory, and consulting experience with several leading enterprises.