MPhil Thesis Defence


Title: "Smooth and locally linear semi-supervised metric learning"

By

Mr. Yang Ruan


Abstract

Many algorithms in pattern recognition and machine learning make use of 
some distance function, explicitly or implicitly, to characterize the 
relationships between data instances. Choosing a suitable distance 
function for the problem at hand thus plays a crucial role in delivering 
satisfactory performance. The goal of metric learning is to automate the 
design of the distance function (in particular, a metric or pseudometric) 
by learning it from data. In this thesis, we study a metric learning 
problem in which some supervisory information is available for the data 
in a semi-supervised setting, and propose a metric learning method called 
constrained moving least squares (CMLS). Specifically, CMLS performs a 
locally linear transformation that varies smoothly across the instance 
space, as guaranteed by the moving least squares approach. Learning the 
transformation can be cast as a convex optimization problem with an 
optimality guarantee, and the transformation thus obtained induces a 
pseudometric space. We demonstrate the effectiveness of CMLS on an 
illustrative synthetic problem as well as on classification and 
clustering tasks using UCI data sets and real-world image databases.
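
For readers unfamiliar with moving least squares, the short sketch below 
illustrates the basic idea the method builds on: a linear map fitted by 
weighted least squares around a query point, so that the map varies 
smoothly as the query moves across the instance space. The Gaussian 
weights, the regularizer, and all names here are illustrative assumptions 
only, not the CMLS formulation presented in the thesis.

    # Illustrative sketch of a moving-least-squares (MLS) local linear map.
    # Weights, regularizer, and names are assumptions for illustration only;
    # the CMLS formulation in the thesis may differ.
    import numpy as np

    def mls_transform(query, anchors, targets, sigma=1.0, reg=1e-6):
        """Fit a linear map A at `query` by weighted least squares, where
        anchor points closer to `query` get larger weights, so the map
        changes smoothly as `query` moves."""
        d = anchors.shape[1]
        # Gaussian weights centred at the query point (one common MLS choice).
        w = np.exp(-np.sum((anchors - query) ** 2, axis=1) / (2 * sigma ** 2))
        # Weighted normal equations:  A = (Y^T W X)(X^T W X + reg*I)^{-1}
        WX = anchors * w[:, None]
        A = targets.T @ WX @ np.linalg.inv(anchors.T @ WX + reg * np.eye(d))
        return A  # locally linear transformation at `query`

    # Toy usage: anchor points mapped (noisily) to rotated targets.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 2))
    R = np.array([[0.0, -1.0], [1.0, 0.0]])          # 90-degree rotation
    Y = X @ R.T + 0.05 * rng.normal(size=X.shape)
    A = mls_transform(np.array([0.5, 0.5]), X, Y)
    print(np.round(A, 2))                            # close to R near the query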


Date:			Thursday, 14 May 2009

Time:			10:00am – 12:00noon

Venue:			Room 3416
 			Lifts 17-18

Committee Members:	Prof. Dit-Yan Yeung (Supervisor)
 			Dr. Nevin Zhang (Chairperson)
 			Dr. Raymond Wong



**** ALL are Welcome ****