PhD Qualifying Examination


Title: "A Survey of Fast Convex Optimization Methods in Machine Learning"

by

Mr. Wenliang Zhong


Abstract:

With the growth of the Internet and advances in storage technology, large 
datasets have become increasingly common in machine learning research. How 
to efficiently solve convex optimization problems over such datasets is an 
important topic that has attracted many researchers' interest. Traditional 
gradient methods, though highly scalable and easy to implement, are known 
to converge slowly. More sophisticated algorithms, such as Newton's 
method, converge quickly with respect to the number of iterations; 
however, it is impractical to compute or even store the Hessian matrix for 
a single iteration when the data has millions of dimensions. To overcome 
these obstacles, several fast convex optimization methods have been 
proposed recently. This paper gives a general introduction to these 
algorithms and a review of the literature. Specifically, both 
deterministic and stochastic, plain and accelerated gradient descent 
methods are presented. Another fast optimization approach, called 
coordinate descent, is also included. These algorithmic frameworks cover a 
wide range of convex optimization problems in machine learning, e.g., SVM, 
logistic regression, LASSO, elastic-net regression, and convex multi-task 
learning. Moreover, a brief comparison, convergence rate analysis, 
application examples, and some empirical evidence are also provided.
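
As a rough illustration of the gap between plain and accelerated gradient 
descent mentioned above, here is a minimal sketch (not taken from the 
survey itself; the least-squares objective, step size 1/L, and iteration 
counts are illustrative assumptions):

    # Sketch: plain vs. Nesterov-accelerated gradient descent on
    # min_x (1/2) ||Ax - b||^2, a smooth convex problem.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 50))
    b = rng.standard_normal(200)
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient

    def grad(x):
        return A.T @ (A @ x - b)

    # Plain gradient descent: O(1/k) convergence on smooth convex problems.
    x = np.zeros(50)
    for _ in range(500):
        x = x - grad(x) / L

    # Nesterov's accelerated gradient: O(1/k^2) convergence.
    y, x_acc, t = np.zeros(50), np.zeros(50), 1.0
    for _ in range(500):
        x_new = y - grad(y) / L
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x_acc)
        x_acc, t = x_new, t_new

    # The accelerated iterate typically reaches a much smaller gradient norm.
    print(np.linalg.norm(grad(x)), np.linalg.norm(grad(x_acc)))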


Date:                   Friday, 7 January 2011

Time:                   10:00am - 12:00noon

Venue:                  Room 3501 (lifts 25/26)

Committee Members:      Dr. James Kwok (Supervisor)
                        Prof. Dit-Yan Yeung (Chairperson)
                        Dr. Raymond Wong
                        Prof. Nevin Zhang


**** ALL are Welcome ****