The Hong Kong University of Science and Technology
Department of Computer Science and Engineering

PhD Thesis Defence

Title: "Towards Futuristic Visual StoryTelling: Authoring Data-Driven 
Infographics in Augmented Reality"


Mr. Zhutian CHEN


An increasingly large amount of data from the physical world has been 
collected, digitized, and stored. To communicate such data to the general 
public effectively, visual data-driven storytelling has been widely used. 
Yet, the vast majority of data communication occurs on desktop computers, 
separated from the physical world in which the data originates and to 
which it refers. Recent advances in Augmented Reality (AR) have shed new 
light on data-driven storytelling, offering exciting possibilities for 
telling engaging, in-situ, and immersive stories by embedding data in its 
real-world context. However, creating such AR data stories is demanding 
and requires considerable knowledge and skills from different fields 
(e.g., data visualization, computer graphics, computer vision, and 
human-machine interaction), and prior research has rarely investigated 
effective ways to create visual data-driven stories in AR.

This thesis aims to fill this gap by exploring approaches to facilitate 
the authoring of infographics, a popular format for data-driven 
storytelling, in AR. Given that AR devices are still evolving rapidly, 
this thesis focuses on the interaction between reality and virtuality, an 
essential characteristic of AR that is independent of specific devices. 
The first system, MARVisT, leverages physical properties (e.g., size, 
position) of real-world objects to assist non-experts in creating 3D 
infographics in mobile AR. Given the limited interaction capabilities of 
mobile devices, the second work proposes LassoNet, which uses a deep 
neural network to facilitate the selection of 3D objects through a 2D 
screen. Beyond physical properties, the third work explores leveraging 
the semantic properties (i.e., the content) of real-world infographics to 
support 2D infographic creation in AR, developing a deep-learning-based 
method to automate the creation of timeline infographics. Finally, the 
fourth work studies augmenting real-world static 2D infographics with 
virtual content in AR and introduces PapARVis Designer, which allows 
designers to create real and virtual visualizations together.

The core idea of this thesis is to allow visualization designers to go 
beyond the desktop to the engaging, immersive, and promising AR platform, 
widely regarded as the next-generation platform for human-machine 
interaction. The resulting systems and techniques blaze a trail toward 
futuristic visual storytelling.

Date:			Tuesday, 17 December 2019

Time:			10:00am - 12:00noon

Venue:			Room 2132B
 			Lift 19

Chairman:		Prof. Ross MURCH (ECE)

Committee Members:	Prof. Huamin QU (Supervisor)
 			Prof. Pan HUI
 			Prof. Brian MAK
 			Prof. Ravindra GOONETILLEKE (ISD)
 			Prof. Tim DWYER (Monash Univ.)

**** ALL are Welcome ****