Speaker: Xiaoqun Zhang (Shanghai Jiao Tong University)
Time: July 13, 2021, starting at 11:00
Venue: Science Building, Room LA106
Abstract: Many problems in signal processing, image processing, and machine learning involve minimizing a sum of convex and nonconvex functionals. The first algorithm is the three-operator splitting proposed by Davis and Yin (DYS), for which we develop a convergence theory in the nonconvex case. By defining a new decreasing energy function associated with the DYS method, we establish the global convergence of the whole sequence and a local convergence rate under the additional assumption that this energy function is a Kurdyka-Lojasiewicz (KL) function. The second class of algorithms we study is based on the alternating direction method of multipliers (ADMM) for nonconvex composite problems. In particular, we study ADMM combined with a class of variance-reduced gradient estimators and establish the global convergence of the sequence and a convergence rate under the KL assumption. Moreover, we show that the popular SAGA and SARAH gradient estimators satisfy the variance-reduction property. Finally, the efficiency of the algorithms is verified on statistical learning examples and L0-based sparse regularization for 3D image reconstruction. This is joint work with Fengmiao Bian and Jingwei Liang.
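For readers unfamiliar with the Davis-Yin scheme, the following is a minimal sketch of the standard three-operator splitting iteration for min_x f(x) + g(x) + h(x) with h smooth; the toy problem, proximal maps, and step-size choice below are illustrative assumptions only and do not reflect the specific nonconvex setting, energy function, or implementation studied in the talk.

import numpy as np

# Minimal sketch of the standard Davis-Yin splitting (DYS) iteration for
#   min_x  f(x) + g(x) + h(x),
# where h is smooth with Lipschitz gradient and f, g have easy proximal maps.
# The problem data (A, b, weight mu, box) are illustrative placeholders.

def prox_l1(v, t):
    # proximal map of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_box(v, lo=-1.0, hi=1.0):
    # proximal map of the indicator of the box [lo, hi]^n (projection)
    return np.clip(v, lo, hi)

def dys(grad_h, prox_f, prox_g, z0, gamma, lam=1.0, iters=500):
    # One DYS sweep per iteration:
    #   x_g = prox_{gamma g}(z)
    #   x_f = prox_{gamma f}(2 x_g - z - gamma * grad_h(x_g))
    #   z   = z + lam * (x_f - x_g)
    z = z0.copy()
    for _ in range(iters):
        x_g = prox_g(z, gamma)
        x_f = prox_f(2.0 * x_g - z - gamma * grad_h(x_g), gamma)
        z = z + lam * (x_f - x_g)
    return x_g

# Toy instance: min 0.5*||A x - b||^2 + mu*||x||_1 + indicator of [-1, 1]^n
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
mu = 0.1

grad_h = lambda x: A.T @ (A @ x - b)      # gradient of the smooth term h
prox_f = lambda v, t: prox_l1(v, t * mu)  # prox of the l1 term
prox_g = lambda v, t: prox_box(v)         # prox of the box indicator

gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L, L = Lipschitz constant of grad_h
x_hat = dys(grad_h, prox_f, prox_g, np.zeros(100), gamma)
print("nonzeros in solution:", np.count_nonzero(np.abs(x_hat) > 1e-6))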
Biography: Xiaoqun Zhang is a professor at the Institute of Natural Sciences and the School of Mathematical Sciences, Shanghai Jiao Tong University. Prof. Zhang received bachelor's and master's degrees from Wuhan University, a Ph.D. in applied mathematics from the Université de Bretagne-Sud (France), and was a visiting assistant professor (postdoctoral researcher) at the University of California, Los Angeles. Research interests include mathematical models and computational methods for imaging science, medical image processing, and data science. Prof. Zhang serves on the editorial board of Inverse Problems and Imaging and is a member of the CSIAM Special Committee on Big Data and Artificial Intelligence and the Special Committee on Mathematics and Medicine.
Host: Kun Wang
All faculty and students are welcome to attend!