
Efficient and stable SAV-based methods for the training in deep learning

Published: 2023-06-27

Speaker: Zhiping Mao (毛志平), Xiamen University

Time: June 29, 2023, 10:00

Venue: Room LA106, Science Building (理科楼)


Abstract: The optimization algorithm plays an important role in deep learning: it significantly affects the stability and efficiency of the training process and, consequently, the accuracy of the neural network approximation. A suitable (initial) learning rate is crucial for the optimization algorithm in deep learning; however, a small learning rate is usually needed to guarantee convergence, resulting in a slow training process. In this work, we develop efficient and energy-stable optimization methods for function approximation problems in deep learning. In particular, we consider the gradient flows arising from deep learning from the continuous point of view, and employ the particle method (PM) and the smoothed particle method (SPM) for the space discretization, while we adopt SAV-based schemes, combined with the adaptive strategy used in the Adam algorithm, for the time discretization. To illustrate the effectiveness of the proposed methods, we present a number of numerical tests demonstrating that the SAV-based schemes significantly improve the efficiency and stability of the training as well as the accuracy of the neural network approximation. We also show that the SPM approach gives slightly better accuracy than the PM approach, and we further demonstrate the advantage of using an adaptive learning rate for more complex problems.
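As a rough illustration of the SAV (scalar auxiliary variable) idea behind the talk, the following is a minimal Python sketch of one common first-order SAV-type discretization of the gradient flow θ' = -∇L(θ), using the auxiliary variable r = sqrt(L(θ) + C). It is not the scheme of the talk (the PM/SPM space discretization and the Adam-style adaptive learning rate are not reproduced), and all function names and the toy problem are placeholders introduced here for illustration.

```python
# Hedged sketch of an SAV-type gradient descent step (not the talk's exact scheme).
# Modified gradient flow with auxiliary variable r = sqrt(L(theta) + C):
#   theta' = -(r / sqrt(L+C)) * grad L,   r' = grad L . theta' / (2 sqrt(L+C))
# A semi-implicit first-order discretization gives a closed-form update for r^{n+1}.
import numpy as np

def sav_gd(loss, grad, theta0, lr=0.1, C=1.0, steps=1000):
    """SAV-style gradient descent; r**2 acts as a non-increasing modified energy."""
    theta = np.asarray(theta0, dtype=float)
    r = np.sqrt(loss(theta) + C)              # r^0
    for _ in range(steps):
        g = grad(theta)                       # grad L(theta^n)
        E = np.sqrt(loss(theta) + C)          # sqrt(L(theta^n) + C)
        # r^{n+1} = r^n / (1 + dt * |g|^2 / (2 E^2))  (closed-form solve)
        r = r / (1.0 + lr * np.dot(g, g) / (2.0 * E**2))
        # theta^{n+1} = theta^n - dt * (r^{n+1} / E) * g
        theta = theta - lr * (r / E) * g
    return theta

if __name__ == "__main__":
    # Toy stand-in for a network loss: an ill-conditioned quadratic.
    A = np.diag([1.0, 10.0])
    loss = lambda th: 0.5 * th @ A @ th
    grad = lambda th: A @ th
    # With lr=0.5, plain gradient descent on this quadratic diverges;
    # the SAV-scaled iteration stays bounded because r is monotonically decreasing.
    print(sav_gd(loss, grad, theta0=[1.0, 1.0], lr=0.5))
```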


Bio: Zhiping Mao is a professor at the School of Mathematical Sciences, Xiamen University, and a recipient of a national-level young talent program. He received his bachelor's degree from 太阳成集团 in 2009 and his Ph.D. in computational mathematics from Xiamen University in 2015, and from October 2015 to September 2020 he was a postdoctoral researcher in the Division of Applied Mathematics at Brown University. His research focuses on spectral methods and machine learning, and he has published more than 30 papers in leading international journals such as SIREV, JCP, SISC, SINUM, and CMAME.


Inviter: Kun Wang (王坤)


All faculty and students are welcome to attend!


