Title: Fast convergence rates of splitting algorithms and their applications in data science
Speaker: Zhang Jin, Assistant Professor, Southern University of Science and Technology
Host: Lin Guihua, Professor, 意昂2
Time: Tuesday, July 2, 2019, 2:30-3:30 p.m.
Venue: Room 420, 意昂2官网, East Area of the main campus
Organizers: 意昂2; 意昂2 Young Faculty Association
About the speaker:
Zhang Jin is an Assistant Professor in the Department of Mathematics at the Southern University of Science and Technology. He received a Bachelor of Arts from the School of Humanities and Social Sciences of Dalian University of Technology in 2007, a Master of Science from the School of Mathematical Sciences of Dalian University of Technology in 2010, and a Ph.D. in Applied Mathematics from the Department of Mathematics and Statistics of the University of Victoria, Canada, in December 2014. From April 2015 to January 2019 he worked at Hong Kong Baptist University. His research focuses on optimization and its applications, and he has published more than 20 papers in journals including Mathematical Programming, SIAM Journal on Optimization, SIAM Journal on Numerical Analysis, and European Journal of Operational Research.
Abstract:
Despite the rich literature, the linear convergence of the alternating direction method of multipliers (ADMM) is not yet fully understood, even in the convex case. For example, linear convergence of ADMM is empirically observed in a wide range of applications, yet existing theoretical results are often too stringent to be satisfied or too ambiguous to be checked, so it remains unclear why ADMM converges linearly in these applications. In this work, we systematically study the linear convergence of ADMM for convex optimization through the lens of variational analysis. We show that linear convergence of ADMM can be guaranteed without the strong convexity of the objective functions together with the full-rank assumption on the coefficient matrices, or the full polyhedricity assumption on their subdifferentials; and that linear convergence can be discerned for various concrete applications, especially for some representative models arising in statistical learning. Our analysis makes sophisticated use of variational analysis techniques and is conducted for the most general proximal version of ADMM with Fortin and Glowinski's larger step size, so that all major variants of ADMM known in the literature are covered.
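For readers unfamiliar with the method, here is a minimal sketch of the two-block proximal ADMM scheme the abstract refers to, written for the standard formulation min_{x,y} f(x) + g(y) subject to Ax + By = b. The proximal matrices P and Q, the penalty parameter beta, and the dual step-size factor gamma below are generic illustrations of "the most general proximal version of ADMM with Fortin and Glowinski's larger step size," not necessarily the speaker's exact notation.

% Augmented Lagrangian with penalty beta > 0:
%   L_beta(x, y, lambda) = f(x) + g(y) + <lambda, Ax + By - b> + (beta/2) ||Ax + By - b||^2
\begin{align*}
x^{k+1} &\in \operatorname*{arg\,min}_{x}\; L_\beta(x, y^{k}, \lambda^{k}) + \tfrac{1}{2}\,\|x - x^{k}\|_{P}^{2}, \\
y^{k+1} &\in \operatorname*{arg\,min}_{y}\; L_\beta(x^{k+1}, y, \lambda^{k}) + \tfrac{1}{2}\,\|y - y^{k}\|_{Q}^{2}, \\
\lambda^{k+1} &= \lambda^{k} + \gamma\,\beta\,\bigl(A x^{k+1} + B y^{k+1} - b\bigr), \qquad \gamma \in \Bigl(0, \tfrac{1+\sqrt{5}}{2}\Bigr).
\end{align*}
% P, Q are positive semidefinite proximal matrices (P = Q = 0 recovers the classical ADMM);
% gamma in (0, (1+sqrt(5))/2) is the Fortin--Glowinski step-size range.

Setting P = Q = 0 and gamma = 1 gives the plain ADMM; the talk's convergence analysis is stated to cover this whole family of variants.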
All faculty and students are welcome to attend!