Speaker: Wenqing Hu (Missouri University of Science and Technology)
Time: June 19, 2018, 16:00–17:00
Venue: Room LA106, 理科楼 (Science Building)
Abstract: Many large-scale learning problems in modern statistics and machine learning reduce to stochastic optimization problems, i.e., the search for (local) minimum points of the expectation of a random objective (loss) function. These problems are usually solved by stochastic approximation algorithms: recursive update rules that take a random input at each iteration. In this talk, we consider several such algorithms, including stochastic gradient descent, stochastic composite gradient descent, and the stochastic heavy-ball method. By introducing diffusion processes that approximate these discrete recursive schemes, we analyze the convergence of the algorithms via their diffusion limits, using delicate techniques from stochastic analysis and asymptotic methods, in particular the theory of random perturbations of dynamical systems. This talk is based on a series of joint works with Chris Junchi Li (Princeton), Weijie Su (UPenn) and Haoyi Xiong (Missouri S&T).
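To make the diffusion-approximation idea concrete, the following minimal Python sketch compares plain SGD on a toy quadratic loss with an Euler–Maruyama simulation of the standard small-step-size diffusion limit dX_t = -∇f(X_t) dt + √η σ dW_t. The toy loss, the step size η, and the choice σ = 1 are illustrative assumptions for this sketch, not taken from the talk.

import numpy as np

rng = np.random.default_rng(0)

# Toy objective: f(x) = E[0.5 * (x - Z)^2] with Z ~ N(0, 1),
# minimized at x = 0, with grad f(x) = x and gradient-noise std sigma = 1.
eta = 0.01       # step size (learning rate); assumed value
n_steps = 1000

# --- Stochastic gradient descent (the discrete recursive scheme) ---
x = 2.0
for _ in range(n_steps):
    z = rng.standard_normal()
    grad = x - z             # unbiased stochastic gradient of f at x
    x -= eta * grad

# --- Euler-Maruyama simulation of the approximating diffusion ---
# dX_t = -grad f(X_t) dt + sqrt(eta) * sigma dW_t, run with time step
# dt = eta so that k SGD iterations correspond to diffusion time k * eta.
y = 2.0
for _ in range(n_steps):
    dw = np.sqrt(eta) * rng.standard_normal()   # Brownian increment, dt = eta
    y += -y * eta + np.sqrt(eta) * dw

print(f"SGD final iterate: {x:+.4f}")
print(f"SDE final state:   {y:+.4f}")

Both runs contract toward the minimizer x = 0 and fluctuate around it at the same O(√η) scale, which is the sense in which the diffusion serves as a continuous-time proxy for the discrete recursion.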
Speaker Bio: Wenqing Hu is an Assistant Professor of Mathematics at the Department of Mathematics and Statistics, Missouri University of Science and Technology (formerly University of Missouri–Rolla). His research interests lie primarily in probability theory and statistical methodology.
EDUCATION
Ph.D. Mathematics. University of Maryland, College Park. Advisor: Mark Freidlin
B.S. Mathematics. Peking University.
For more information, see his personal webpage: http://web.mst.edu/~huwen/
Contact: 李寒宇
All faculty and students are warmly welcome to attend!