A Survey of CrowdSourcing: Query Processing with People

PhD Qualifying Examination


Title: "A Survey of CrowdSourcing: Query Processing with People"

by

Mr. Zhou ZHAO


Abstract:

Some complex problems, such as image tagging and sentiment analysis, are 
challenging for computers; even state-of-the-art technologies are far 
from perfect. For instance, given a set of hotels with their locations 
and photos, a user may want to know "which are the good hotels?"
Fortunately, Crowdsourcing markets like Amazon's Mechanical Turk (MTurk) 
make it possible to divide these complex jobs into small tasks and assign 
them to workers (humans) to judge. However, workers are obviously different 
from machines: their performance is uncertain in latency, expected reward, 
and the quality of their work. Furthermore, users may impose time or 
monetary budgets on their crowdsourced jobs. 
Recently, considerable research has been conducted on Crowdsourcing 
problems in the database research community, because subjective database 
operators are involved in Crowdsourcing applications. In this survey, we 
first introduce different state-of-the-art Crowdsourcing databases. Then 
we review different Crowdsourcing techniques that tackle database problems 
such as join and max. In addition, we study Crowdsourcing problems in the 
social network area. Finally, we conclude this survey by giving some 
further research directions drawn from recent work.


Date:                   Wednesday, 22 August 2012

Time:                   10:30am - 12:30pm

Venue:                  Room 3501 (lifts 25/26)

Committee Members:      Dr. Wilfred Ng (Supervisor)
                        Prof. Dik-Lun Lee (Chairperson)
                        Dr. Qiong Luo
                        Dr. Raymond Wong


**** ALL are Welcome ****