Visual Style Transfer and Neural Rendering: A Survey

PhD Qualifying Examination


Title: "Visual Style Transfer and Neural Rendering: A Survey"

by

Miss Yingshu CHEN


Abstract:

With the advent of deep learning techniques, neural networks have been used to 
automatically generate visual designs and render visual scenes with controllable 
attributes. Data-driven intelligent design with fast inference via neural 
network models inspires people to blend artistic and technical elements in 
artwork creation and industrial production. Such neural computational design 
and data-driven stylization of digital elements assist non-professional and 
even amateur artists in creating original works of art or re-creating existing 
content from their imagination.

Exemplar-based style transfer provides a convenient and flexible tool for 
stylization, taking straightforward and intuitive visual references as targets 
for creation. Over the years, visual style transfer has become a popular topic, 
with a wealth of excellent works [51, 138, 72, 197, 66] on image style 
transfer, color transfer, image-to-image translation, 3D shape and texture 
translation, texture stylization, 3D novel-view stylization, etc.

In this survey, I review related works on visual style transfer and neural 
rendering. The survey covers and discusses three topics: techniques of visual 
style transfer, neural rendering for 2D and 3D elements, and challenges in 
integrating style transfer into neural rendering together with relevant 
directions for future research.


Date:  			Monday, 18 July 2022

Time:                  	2:00pm - 4:00pm

Venue:			Room 1410
 			Lifts 25/26

Zoom Meeting: 
https://hkust.zoom.us/j/98007347048?pwd=SzNUWC9MQjdqcTZrTnl2U2lVRVB1dz09

Committee Members:	Dr. Sai-Kit Yeung (Supervisor)
 			Prof. Ajay Joneja (Supervisor, ISD)
 			Prof. Pedro Sander (Chairperson)
 			Prof. Huamin Qu


**** ALL are Welcome ****