Human-Computer Interaction FYP Report Examples

Below are links to previous FYP reports related to human-computer interaction (HCI). Links to videos of the associated oral presentations are also available.

Project Title | Project Description | Link(s)
2016 PAN3
Real-time Emotion Sensing with Google Glass
An emotion sensing system built on Google Glass that allows users to infer the emotions of others by analyzing their facial expressions in an automated yet unobtrusive way. Inference is based on support vector machine (SVM) emotion classifiers covering the six basic emotions of Ekman's model plus one additional class, seven in all. The classifiers were painstakingly devised by first manually selecting 1,806 representative images of Asian faces from 5,553 images in eight different open-source facial expression datasets. Then, using OpenCV's face detector and dlib's face landmark extractor, 18 key landmark points were identified on each face and 17 displacement vector features were calculated for each of the seven emotions. These features were fed into the LIBSVM library for training and emotion prediction. The combined system enables real-time emotion sensing on Google Glass with good accuracy. The levels of sensed emotions are overlaid with augmented reality (AR) on the Android phone's user interface and audibly conveyed via the small headset speaker on the Google Glass. report
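The classification pipeline described above can be sketched in a few lines. The snippet below is an illustration only, not the report's code: it substitutes dlib's stock 68-point landmark model for the report's 18 hand-picked landmarks, a simplified displacement feature for the 17 described vectors, and scikit-learn's SVC (which wraps libsvm) for a direct LIBSVM call.

```python
# Illustrative sketch of a landmark-displacement + SVM emotion pipeline.
# Assumptions (not from the report): dlib's 68-point predictor, a simplified
# displacement feature, and scikit-learn's SVC in place of LIBSVM directly.
import dlib
import numpy as np
from sklearn.svm import SVC

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmark_features(gray_img):
    """Return a displacement-style feature vector for the first detected face."""
    faces = detector(gray_img)
    if not faces:
        return None
    shape = predictor(gray_img, faces[0])
    pts = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)], float)
    # Normalize by face size, then measure each point's displacement from the
    # face centre -- a stand-in for the report's 17 displacement vectors.
    centre = pts.mean(axis=0)
    scale = pts.std()
    return ((pts - centre) / scale).ravel()

# Training and prediction, given feature matrix X and emotion labels y (0..6):
# clf = SVC(kernel="rbf").fit(X, y)
# emotion = clf.predict([landmark_features(frame_gray)])[0]
```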
2015 MA1
Leap Sense: Turn Any Computer Screen into a Gesture-Assisted Touch Screen Using a Leap Motion Controller
Utilizes a Leap Motion Controller to detect hand movement and enable gesture control and virtual touch screen functionality; distinguished project; 2015 HKUST President's Cup Winner! report | video
2015 PAN2
Indoor Mobile Augmented Reality Navigation System
A small-scale indoor mobile augmented reality (MAR) navigation system that can recognize certain routes and, in some cases, align virtual paths with them; it also provides the precise location of the user on a map; FYT report
2015 PAN3
USTAR: Augmented Reality HKUST Navigation App on Android
A practical AR application that helps users navigate the indoor areas of the HKUST campus; it relies on manual input of the user's location and then uses the Android device's optical character recognition (OCR) and pedometer readings to support indoor localization, tracking, and navigation; bonus feature: handy course information for all classrooms and lecture theaters to help users find vacant rooms. report
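The pedometer-plus-OCR localization idea can be illustrated with a small sketch. Everything below is assumed for illustration (stride length, room-number table, class and method names); it is not the app's code.

```python
# Illustrative pedometer dead reckoning: starting from a manually entered
# position, each detected step advances the estimate by a fixed stride along
# the current heading; an OCR "fix" (e.g. reading a room number on a door
# sign) snaps the estimate back to a known map coordinate.
import math

STRIDE_M = 0.7                          # assumed average stride length, metres
ROOM_COORDS = {"2465": (120.0, 35.0)}   # hypothetical room-number -> map position

class IndoorTracker:
    def __init__(self, x, y):
        self.x, self.y = x, y           # manually entered start position (metres)

    def on_step(self, heading_rad):
        """Advance the position estimate by one stride along the compass heading."""
        self.x += STRIDE_M * math.cos(heading_rad)
        self.y += STRIDE_M * math.sin(heading_rad)

    def on_ocr_fix(self, room_number):
        """Snap to a known location when OCR recognizes a room number."""
        if room_number in ROOM_COORDS:
            self.x, self.y = ROOM_COORDS[room_number]
```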
2015 PAN4
Presentation Tools with Gesture Recognition and Augmented Reality
Cool and dynamic presentation software that uses a Leap Motion controller and adds augmented reality (AR) to presentations; includes an add-in for MS PowerPoint. report | video
2015 PSAN1
AirTennis: A Web-Based, Mobile Motion Controlled Console Game
A cool game that allows two players to play virtual tennis via computer displays and "tennis racquets" that are actually smartphones with accelerometers and gyroscopes. report | video
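As a rough illustration of how a smartphone can act as a racquet, the sketch below turns an accelerometer sample stream into swing events with a simple magnitude threshold and refractory period; the threshold values and names are assumptions, not taken from the report.

```python
# Illustrative swing detection from phone accelerometer samples: a swing is
# registered when the acceleration magnitude crosses a threshold, with a short
# refractory period so a single swing is not counted twice.
import math
import time

SWING_THRESHOLD = 18.0   # m/s^2, assumed; gravity alone is ~9.8
REFRACTORY_S = 0.4       # assumed minimum time between swings

class SwingDetector:
    def __init__(self):
        self.last_swing = 0.0

    def on_sample(self, ax, ay, az, t=None):
        """Return True when an accelerometer sample looks like a racquet swing."""
        t = time.time() if t is None else t
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > SWING_THRESHOLD and t - self.last_swing > REFRACTORY_S:
            self.last_swing = t
            return True
        return False
```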
2014 RO3
Multi-stage Human-computer Interaction for Command Refining on an Intelligent Personal Assistant
A study comparing the user experience of multi-stage human-computer interaction for refining commands on digital voice assistants with that of the current single-stage, verbal-command-based approach; well-done FYT report
2013 MA1
Kinect-Based Windows Control Using Finger Gestures
Hand tracking integrated with traditional Windows control. report | video
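A hypothetical sketch of mapping a tracked hand to the Windows cursor; the normalized hand coordinates and pinch flag are assumed inputs, and pyautogui stands in for whatever the project actually used to synthesize mouse events.

```python
# Illustrative hand-to-cursor mapping: the Kinect side is assumed to supply a
# hand position normalized to [0, 1] and a pinch/click flag.
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()

def hand_to_cursor(nx, ny, pinch):
    """Move the cursor to the normalized hand position; click on a pinch gesture."""
    pyautogui.moveTo(int(nx * SCREEN_W), int(ny * SCREEN_H))
    if pinch:
        pyautogui.click()
```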
2012 MA1
SixthSense - Wearable Gestural Interface
A creative DIY way to interact with a computer using a pocket projector, an ordinary web camera, a mirror, and a frame binding them all together. report | video
2010 MA1
Camera Based Interactive Wall Display With Hand Detection Using LED Lights
Innovative DIY hand detection technique that uses a simple, low-budget web camera and three LED lights; distinguished project! report | video
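One plausible way such LED-based hand detection can work is brightness thresholding in HSV: the LEDs show up as small, very bright blobs that even a cheap camera picks out reliably. The sketch below assumes OpenCV 4 and made-up threshold values; it is not the project's code.

```python
# Illustrative LED spot detection: threshold very bright pixels in HSV, then
# take the centre of each sufficiently large blob as an LED position.
import cv2
import numpy as np

def find_led_points(frame_bgr, min_area=5):
    """Return (x, y) pixel positions of bright LED-like blobs in a BGR frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Keep only very bright pixels (high V); hue range left wide on purpose.
    mask = cv2.inRange(hsv, (0, 0, 230), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:
            (x, y), _ = cv2.minEnclosingCircle(c)
            points.append((int(x), int(y)))
    return points
```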

Copyright HKUST CSE Dept. 2017