Talk Title: A Unified Framework of Subgradient Methods for Convex or Quasi-convex Programming
Speaker: Dr. Yaohua Hu, Associate Professor and master's supervisor at the College of Mathematics and Statistics, Shenzhen University. He received his bachelor's and master's degrees from Zhejiang University and his Ph.D. from The Hong Kong Polytechnic University. His research focuses on optimization theory, algorithms, and applications. He has published a number of papers in leading optimization journals, including SIAM Journal on Optimization, European Journal of Operational Research, Journal of Global Optimization, Numerical Algorithms, Journal of Machine Learning Research, and Inverse Problems.
Mathematical optimization provides a unified framework for a wide variety of important problems in many disciplines and application fields. Convex optimization plays a central role in mathematical optimization, and quasi-convex optimization is fundamental to the modeling of many practical problems in fields such as economics, finance, and industrial organization. Subgradient methods are practical iterative algorithms for solving large-scale convex or quasi-convex optimization problems. In this talk, we develop an abstract convergence theorem for a class of sequences that satisfy a general basic inequality, under suitable assumptions on the parameters. Convergence properties are discussed both in function values and in the distances of the iterates from the optimal solution set. The abstract convergence theorem covers relevant results for many types of subgradient methods studied in the literature, for either convex or quasi-convex optimization.
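As background for the talk, the classical subgradient iteration x_{k+1} = x_k - α_k g_k with diminishing step sizes is the basic scheme that such convergence theorems generalize. A minimal sketch (the function names and the toy problem below are illustrative, not taken from the talk) might look like:

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=2000):
    """Plain subgradient method with diminishing steps a_k = 1/(k+1).

    The best function value seen so far is tracked, since f(x_k)
    need not decrease monotonically for nonsmooth f.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        g = subgrad(x)                 # any element of the subdifferential
        x = x - g / (k + 1.0)          # diminishing, non-summable step sizes
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Toy convex example: f(x) = ||x - c||_1, minimized at x = c.
c = np.array([1.0, -2.0])
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)     # a subgradient of the l1 distance
x_star, f_star = subgradient_method(f, subgrad, x0=np.zeros(2))
```

With the non-summable, diminishing steps 1/(k+1), the best recorded value converges to the optimum for convex f; the abstract framework of the talk recovers results of this kind as special cases.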