Speaker: Lihu Xu (University of Macau)
Time: June 8, 2022, starting at 15:45
Tencent Meeting ID: 467 670 718
Abstract: The stochastic variance reduced gradient (SVRG) algorithm was proposed by Johnson and Zhang in NIPS (2013) and has been extensively used in training neural networks. We shall rigorously prove that SVRG can be approximated by a family of stochastic differential delay equations (SDDEs) under conditions that include non-convex examples. It is well known that SDDEs exhibit strong dissipation and variance-reduction effects. Our result gives a new interpretation of SVRG. This is joint work with Peng Chen and Jianya Lu.
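For readers unfamiliar with the algorithm, below is a minimal sketch of SVRG as introduced by Johnson and Zhang (2013). The interface (a user-supplied grad_i, and the parameters eta, outer_iters, inner_iters) is an illustrative assumption, not taken from the talk.

```python
import numpy as np

def svrg(grad_i, w0, n, eta=0.1, outer_iters=20, inner_iters=100, rng=None):
    """Minimal SVRG sketch for minimizing (1/n) * sum_i f_i(w).

    grad_i(w, i) must return the gradient of the i-th component f_i at w.
    """
    rng = np.random.default_rng() if rng is None else rng
    w_tilde = np.asarray(w0, dtype=float).copy()
    for _ in range(outer_iters):
        # Full-gradient snapshot at the anchor point w_tilde.
        mu = sum(grad_i(w_tilde, i) for i in range(n)) / n
        w = w_tilde.copy()
        for _ in range(inner_iters):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: unbiased for the
            # full gradient, with variance shrinking as w approaches w_tilde.
            v = grad_i(w, i) - grad_i(w_tilde, i) + mu
            w = w - eta * v
        w_tilde = w  # "Option I" update of the snapshot in Johnson & Zhang
    return w_tilde
```

Note that the inner update depends on the lagged snapshot w_tilde as well as the current iterate w; this delay-like dependence is what connects SVRG to the stochastic differential delay equations studied in the talk.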
Bio: Professor Lihu Xu received his bachelor's degree from Shandong University in 2001, his master's degree from Peking University in 2004, and his PhD from Imperial College London, UK, in 2008. He is currently a professor at the University of Macau, working mainly on stochastic partial differential equations and probability limit theory. He has published more than 40 papers in journals such as Probab. Theory Related Fields, Ann. Statist., Ann. Appl. Probab., and J. Funct. Anal.
Host: Guoli Zhou
All faculty and students are welcome to attend!