Kaleidoscope: A Multicultural Mosaic is still in development. Click the picture above to visit the development blog!

Overview
Role: Technical Artist, Technical Producer
Tools Used: Unity3D + HDRP, Intel RealSense
Collaborators: Byungju Lee, Weizhang Lee, Vicky Lin, Jehan Sandhu, Amber Zheng
Intent
Our client for the project is the director of Carnegie Mellon University's Askwith Kenner Global Languages and Cultures Room. The room is located in CMU's new Tepper Quadrangle and is intended to serve as an interactive, immersive learning space on campus for language and culture. Our team was tasked with creating an experience that reflects the intention behind the room and encourages students and other members of the CMU community to visit it. One idea our client stressed that we include was "cultural competency": the ability to interact effectively and sensitively with people of other cultures.
The Kaleidoscope experience explores how everyone carries some level of implicit cultural bias when encountering people they don't know, and it aims to make guests aware of this as a first step toward learning cultural competency. The guest is presented with an obscured point cloud of a stranger and a recording of them saying a greeting, then prompted to guess character traits about this stranger. Once the guest has built an assumption-based persona of the stranger, they are shown a visual comparison of their assumptions against the answers the stranger actually provided, so they can see where their assumptions were wrong and what they can learn from that.
Process
One large technical challenge of this project is developing an effective way to render the Intel RealSense data of the "stranger" with an obscuring effect that slowly reveals more visual information to the guest as they answer each question. My initial attempt used the default Intel point cloud renderer, but I found that it offered very little control over how the individual points were drawn, which made it difficult to build the obscuring effect we wanted.

Initial test of the point cloud obscuring effect using the Intel renderer.
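To give a sense of the kind of per-point control I was after, here is a minimal sketch of deprojecting a depth frame into 3D points so each point can later be obscured or revealed individually. This is standard pinhole-camera math, not the Intel SDK's actual API; the parameter names (depth array, fx/fy/cx/cy intrinsics) are my own assumptions for illustration.

```csharp
using UnityEngine;

// Minimal sketch (not the Intel SDK's actual API): turning a depth frame
// into world-space points that a custom renderer can obscure or reveal.
public static class DepthToPoints
{
    // depth: per-pixel depth in meters; fx, fy, cx, cy: assumed camera intrinsics.
    public static Vector3[] Deproject(float[] depth, int width, int height,
                                      float fx, float fy, float cx, float cy)
    {
        var points = new Vector3[width * height];
        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                int i = y * width + x;
                float z = depth[i];
                // Standard pinhole deprojection: pixel coordinate + depth -> 3D point.
                points[i] = new Vector3((x - cx) / fx * z, (y - cy) / fy * z, z);
            }
        }
        return points;
    }
}
```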

Next, I tested the VFX Graph inside Unity's High Definition Render Pipeline (offered in version 2018.3). Because it is a GPU-based particle system that can put millions of particles on screen at once, I can position particles from the RealSense point cloud, which gives a much more robust way to represent the volumetric data.

Examples of point clouds using the VFX Graph, and a demo of the obscure-to-clear effect.
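Below is a minimal sketch of how the point positions could be handed to a VFX Graph: bake them into a floating-point texture each frame, bind it to an exposed texture property, and drive the obscure-to-clear transition with a float parameter as the guest answers questions. The property names ("PositionMap", "PointCount", "Reveal") and the map size are assumptions for illustration, not values from our final graph.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Minimal sketch, assuming the VFX Graph exposes a Texture2D "PositionMap",
// an int "PointCount", and a float "Reveal" on its Blackboard.
public class PointCloudToVFX : MonoBehaviour
{
    public VisualEffect vfx;
    public int mapSize = 512;          // 512 x 512 = up to ~262k points
    private Texture2D positionMap;
    private Color[] pixels;

    void Start()
    {
        positionMap = new Texture2D(mapSize, mapSize, TextureFormat.RGBAFloat, false);
        positionMap.filterMode = FilterMode.Point;
        pixels = new Color[mapSize * mapSize];
    }

    // Call once per RealSense frame with the deprojected points.
    public void UploadPoints(Vector3[] points)
    {
        int count = Mathf.Min(points.Length, pixels.Length);
        for (int i = 0; i < count; i++)
            pixels[i] = new Color(points[i].x, points[i].y, points[i].z, 1f);

        positionMap.SetPixels(pixels);
        positionMap.Apply(false);

        vfx.SetTexture("PositionMap", positionMap);
        vfx.SetInt("PointCount", count);
    }

    // Called by the experience flow as the guest answers each question,
    // easing the particles from obscured to clear.
    public void SetReveal(float amount01)
    {
        vfx.SetFloat("Reveal", Mathf.Clamp01(amount01));
    }
}
```

In the 2018.3-era VFX Graph, sampling positions from a texture (much like a point cache's attribute map) was a common way to feed external data to particles, so that is the pattern this sketch assumes; the graph itself would read "PositionMap" to place particles and use "Reveal" to blend between the obscured and clear looks.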

My next steps in the project involve migrating the current Kaleidoscope project to Unity's HDRP and testing that SteamVR and our existing code are compatible with this pipeline. Once I confirm that this render pipeline fits our project, I will work with the team's designers to playtest and finalize the point cloud effect used in the experience.