

Collaborating with the XRlab at City University of Hong Kong, I developed an Augmented Reality (AR) system for the 360 Gallery that integrates HoloLens 2 and iOS devices.

The project blends a 360-degree stereoscopic screen setup with HoloLens 2 headsets and motion capture sensors using Unreal Engine 4. The goal is to seamlessly merge holograms with screen content into a single immersive experience.

In collaboration with the XRlab at City University of Hong Kong, I led the development of an Augmented Reality (AR) authoring system built on the nDisplay infrastructure of the 360 Gallery. Generously shared by Professor Jeffrey Shaw, the 360 Gallery is a cylindrical projection environment with stereoscopic capabilities, offering a shared VR experience in which multiple users can engage with an immersive environment simultaneously.

The project integrated the 360 Gallery with HoloLens headsets and iOS devices concurrently. Because the environment generates live content that is inspected from constantly changing viewpoints, the central challenge was devising mechanisms to align all devices and points of view into a cohesive experience.
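Aligning devices like this typically means expressing every tracked pose in one shared coordinate frame. A minimal sketch of that idea, assuming a calibration fiducial whose pose is known in both the gallery (motion capture) frame and a headset's local frame; the function names and the use of plain 4x4 homogeneous matrices are illustrative, not from the project's codebase:

```python
# Sketch: mapping a device's local coordinates into the gallery's shared frame.
# Assumes a shared fiducial observed in both frames; matrices are row-major
# 4x4 rigid transforms (rotation + translation), represented as nested lists.

def mat_mul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(m):
    """Invert a rigid transform: transpose the rotation, negate the translation."""
    r = [[m[j][i] for j in range(3)] for i in range(3)]            # R^T
    t = [-sum(r[i][j] * m[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [t[0]], r[1] + [t[1]], r[2] + [t[2]], [0, 0, 0, 1]]

def device_to_gallery(fiducial_in_gallery, fiducial_in_device):
    """Transform that maps device-local coordinates into gallery space."""
    return mat_mul(fiducial_in_gallery, invert_rigid(fiducial_in_device))
```

Once this transform is known for each headset, any pose it reports can be re-expressed in gallery space before being sent to the screen cluster.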

Technical Exploration Idea:
Addressing the limitations of shared holographic experiences, we combined a 360-degree stereoscopic screen setup with HoloLens 2 devices and motion capture sensors. The exploration aimed to refine the usability, languages, and conventions needed for practical, real-life applications. Building on Unreal Engine 4 and its nDisplay technology, we incorporated HoloLens 2 streaming tools, Open Sound Control (OSC), and motion capture using the OptiTrack system.
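OSC carries data between the devices as small UDP datagrams with a simple binary layout. As a sketch of what sending a tracked position over OSC involves (the `/hololens/1/position` address and port are hypothetical, not the project's actual address space), a message can be encoded per the OSC 1.0 format:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message carrying float32 arguments (big-endian)."""
    type_tags = "," + "f" * len(floats)
    return (osc_pad(address.encode()) +
            osc_pad(type_tags.encode()) +
            b"".join(struct.pack(">f", f) for f in floats))

# e.g. broadcast a tracked position to the screen cluster:
packet = osc_message("/hololens/1/position", 0.5, 1.2, -3.0)
# sock.sendto(packet, (node_address, 8000))   # one UDP datagram per cluster node
```

Each datagram is self-describing (address plus type tags), which is what makes OSC convenient as a bridge between heterogeneous devices like headsets, iOS clients, and the projection cluster.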

The 360 Gallery:
Situated in the School of Creative Media at City University of Hong Kong, the 360 Gallery runs on six computers driving five Barco projectors with frame-sequential stereoscopic shutters.

Building upon our previous development for the 360 Gallery, which used Unreal Engine's nDisplay technology, we established a synchronized system for running stereoscopic real-time content. The addition of OSC enabled specialized interactions, serving as the communication bridge between the HoloLens 2 devices and the screens.
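On the receiving side, each node needs to decode incoming OSC datagrams and route them to the right handler. A minimal sketch, assuming only float32 arguments and a flat address-to-handler table (both simplifications; the addresses are hypothetical):

```python
import struct

def read_padded_string(data: bytes, offset: int):
    """Read a null-terminated, 4-byte-padded OSC string; return (string, next offset)."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode()
    return s, offset + ((end - offset) // 4 + 1) * 4

def handle_packet(data: bytes, handlers: dict):
    """Decode one OSC message and invoke the handler registered for its address."""
    address, off = read_padded_string(data, 0)
    tags, off = read_padded_string(data, off)
    args = []
    for tag in tags[1:]:                     # skip the leading ','
        if tag == "f":                       # float32 arguments only, for brevity
            args.append(struct.unpack(">f", data[off:off + 4])[0])
            off += 4
    handler = handlers.get(address)
    if handler:
        handler(*args)

# A receive loop would wrap this, e.g.:
#   sock.bind(("0.0.0.0", 8000))
#   data, _ = sock.recvfrom(4096)
#   handle_packet(data, handlers)
```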

HoloLens and Motion Capture:
Controlled by one or multiple HoloLens 2 devices, the system incorporates absolute positional tracking for manipulating objects and scenes.
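Manipulation via absolute positional tracking can be sketched as a grab-and-follow pattern: on "grab", record the offset between the tracked device and the object; while held, the object follows the device's absolute position. The class and method names below are illustrative, not from the project's codebase:

```python
# Sketch of object manipulation driven by absolute positional tracking.
# Positions are [x, y, z] lists in the shared gallery coordinate frame.

class GrabManipulator:
    def __init__(self):
        self.offset = None          # None means nothing is currently held

    def grab(self, device_pos, object_pos):
        """Record the object's offset from the tracked device at grab time."""
        self.offset = [o - d for o, d in zip(object_pos, device_pos)]

    def update(self, device_pos):
        """Return the object's new position while held, or None if idle."""
        if self.offset is None:
            return None
        return [d + o for d, o in zip(device_pos, self.offset)]

    def release(self):
        self.offset = None
```

Because the offset is captured in the shared frame, the manipulated object stays consistent between the hologram and the surrounding screen content.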

The primary objective of this project was to explore interaction design and technological infrastructure: studying ways to synchronize and interact across the two mediums within a single space, seamlessly blending holograms with screen content to create shared, immersive, and cohesive experiences.



Development under the direction of Prof Alvaro Casinelli and Prof Christian Sandor.

Extended Reality Labs. 

City University of Hong Kong

Special thanks to:
Dávid Maruscsák
Prof Jeffrey Shaw
Applied Computer and Interactive Media (ACIM)
Joe Leung
The 360 Gallery
The amazing AR Lab team

