
Introduction

Overview

This website outlines our final project submission for the CSC 2524 - Topics in Interactive Computing: Graphics, Interaction and Performance in Immersive Environments course. Our project was designed as an interactive installation in which audience members experience the theatre piece Flight Over the Ocean by Bertolt Brecht.

 

Our final project allows audience members to traverse a hallway environment where, as they walk and interact in the space, different visual and auditory effects are triggered. The intention of the installation is to allow audience members to experience the piece from the pilot’s perspective. The general pipeline of the system goes as follows, with a rough code sketch after the list:

 

  1. The Kinect sensor generates a point cloud. 

  2. Using the point cloud, the audience member’s presence in the space is tracked.

  3. As this presence interacts with voxel triggers, the system outputs audio files according to the trigger.

  4. The voxel triggers also output a signal to Isadora to set the appropriate visual output.
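 

To make the flow concrete, here is a minimal sketch of one frame of that pipeline in Python (the real system is built in Max, so everything here, including the play_audio and send_to_isadora callbacks and the threshold value, is an illustrative assumption):

    import numpy as np

    ACTIVITY_THRESHOLD = 20  # assumed minimum point count for a voxel to read as "active"

    def pipeline_frame(points, trigger_map, play_audio, send_to_isadora):
        """points: Nx3 Kinect point cloud; trigger_map: 3D array of sound-object IDs."""
        # Steps 1-2: bin the point cloud into voxels; counts stand in for density.
        density, _ = np.histogramdd(points, bins=trigger_map.shape)
        # Steps 3-4: fire the audio and visual outputs for each active trigger voxel.
        for idx in zip(*np.nonzero(density > ACTIVITY_THRESHOLD)):
            sound_id = trigger_map[idx]
            if sound_id:                      # 0 means no trigger painted here
                play_audio(sound_id)          # step 3: audio output
                send_to_isadora(sound_id)     # step 4: signal for the visuals

    # e.g. pipeline_frame(np.random.rand(200000, 3), np.ones((16, 8, 32), int), print, print)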

Group Members

Clare and Nicole

Clare and Nicole were primarily responsible for the creative design of the system, the inception of the overall concept, and the design of the audience experience. Specifically, they worked together to set the virtual map of voxel triggers and their associated effects, to compose each of the visual and auditory elements, and to design the final space of the theatre piece, including the lighting and projection systems.

Jacob and Sijie

Jacob and Sijie were primarily responsible for the technical development of the system. Working with David Rokeby and software he had previously developed throughout his career, they designed the tool and user interface that Clare and Nicole used to create the virtual space. Their role was to ensure that the tool was easy to use, offered intuitive features, and remained clear throughout operation and testing.

Scene Description

We were inspired by the piece Flight Over the Ocean by Bertolt Brecht, an account of Charles Lindbergh’s daring flight over the Atlantic Ocean in 1927. The piece follows the pilot as he crosses the ocean in a small plane, interacting through dialogue with different elements such as sleep, fog, snow, and water. The piece was presented as a radio play, and so a significant focus of our installation was the auditory experience.

Technical Setup

Max

General Pipeline
General pipeline.png

The majority of the system was developed in the Max software using packages developed and provided by David Rokeby. The general pipeline is shown in the image above.

 

The data is collected from the Kinect and transformed to be useful in the space, starting from the r kinect_1_on node through to v.change. This data is converted into a point cloud and then into voxel space by the p.convert_to_voxels node, which gives the density of the point cloud within each voxel; this density represents the presence of an individual, or activity, within that part of the space. As an individual walks through the space and the activity in the voxels changes, the custom map of triggers is referenced and the appropriate effects are output by the p.map_sounds and p.calc_triggers nodes.
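
One plausible analogue of what p.calc_triggers computes (an assumption on our part; the actual object is David Rokeby's) is a rising-edge test, so that a sound fires when a voxel becomes active rather than on every frame a person occupies it:

    import numpy as np

    def rising_edge_triggers(density, prev_density, threshold=20):
        """Return the voxel indices that just became active this frame."""
        active_now = density > threshold
        active_before = prev_density > threshold
        # Fire only on the inactive -> active transition of a voxel.
        return np.argwhere(active_now & ~active_before)

Keeping the previous frame's density around is what keeps a stationary visitor from retriggering the same sound on every frame.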

Kinect and Voxels
Kinect.png

The image above is the patch for the p.convert_to_voxels node. Within this patch, the raw Kinect data is converted into the point cloud and then into the voxels (or bins). The patch is where the dimensions of the space and of the voxels are set, and it displays the different views after the appropriate transformations.
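
The binning itself reduces to normalizing each point against the room extents; a sketch, where the room dimensions and bin counts are assumed values rather than the ones used in the patch:

    import numpy as np

    SPACE_MIN = np.array([-2.0, 0.0, 0.5])   # assumed hallway extents (metres)
    SPACE_MAX = np.array([ 2.0, 2.5, 6.5])
    BINS      = np.array([16, 8, 32])        # voxel (bin) counts per axis

    def point_to_voxel(p):
        """Map a Kinect-space point to its integer voxel (bin) index."""
        frac = (p - SPACE_MIN) / (SPACE_MAX - SPACE_MIN)   # normalize to [0, 1)
        return tuple(np.clip((frac * BINS).astype(int), 0, BINS - 1))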

Maps
maps.png

The image above shows the patch for the p.map_sounds node. This patch was the interface given to Clare and Nicole to develop the custom map of triggers. 

 

In the top left of the patch are the functions that open and save the different custom maps.
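
Since a custom map is just a 3D array of sound-object IDs, opening and saving amount to (de)serializing that array; a sketch using NumPy's file format (the patch uses its own format in Max):

    import numpy as np

    def save_map(trigger_map, path):
        np.save(path, trigger_map)    # one sound-object ID per voxel

    def load_map(path):
        return np.load(path)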

 

In the top right of the patch is the 2D representation of the currently selected custom map, along with the functions used to paint it. It should be noted that this 2D representation does not correspond to any physical orientation; it was only used as a reference to confirm that a painting operation was successful. The painting functions were split into slice painting and region painting, as shown: the user can specify a sound object and a slice (x, y, or z) to paint, or the specific coordinates in the map to paint with the sound object.
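
Functionally, the two painting modes reduce to array assignments over the map; a sketch, with the axis naming as an assumption:

    def paint_slice(trigger_map, axis, index, sound_id):
        """Fill an entire x, y, or z slice of the map with one sound object."""
        sl = [slice(None)] * 3
        sl["xyz".index(axis)] = index
        trigger_map[tuple(sl)] = sound_id

    def paint_region(trigger_map, lo, hi, sound_id):
        """Fill the box of voxels between corner coordinates lo and hi."""
        x0, y0, z0 = lo
        x1, y1, z1 = hi
        trigger_map[x0:x1, y0:y1, z0:z1] = sound_id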

 

The middle and bottom left of the patch show the interface for the sound objects, where the user was able to test the sounds and specify parameters including which speakers to use, volume, pitch, retrigger rate, and decay rate.
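
That parameter set maps naturally onto a small record; the field names below follow the labels in the patch, while the types, defaults, and units are assumptions:

    from dataclasses import dataclass, field

    @dataclass
    class SoundObject:
        sample_file: str                                         # audio file for this trigger
        speakers: list = field(default_factory=lambda: [1, 2])   # output channels to use
        volume: float = 0.8                                      # 0.0 .. 1.0
        pitch: float = 1.0                                       # playback-rate multiplier
        retrigger_rate: float = 0.5                              # min. seconds between retriggers
        decay_rate: float = 2.0                                  # seconds for the sound to fade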

 

It should be noted that the patches and system responsible for appropriately outputting the sound were entirely developed by David Rokeby.

 

The bottom right of the patch was developed to visualize the different views of the custom map as an individual interacted in the space. The visualization outputs are shown in the images below, which are the side, top, and front views respectively. As an individual interacts with a region that has a sound object associated with it, the appropriate voxels are outlined in white to represent activity in that region.

visualization of maps 1.png
visualization of maps 3.png
visualization of maps 2.png
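
Conceptually, each of those views is a projection of the 3D map along one axis, with the currently active voxels marked; a sketch of the top view, assuming axis 1 is vertical:

    import numpy as np

    def top_view(trigger_map, density, threshold=20):
        """Project the map along the vertical axis for the top-down view."""
        painted = (trigger_map > 0).any(axis=1)      # cells with a sound object
        active  = (density > threshold).any(axis=1)  # cells with current activity
        return painted, active  # the interface outlines the active cells in white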

Isadora

Isadora.png

The visualizations were controlled using the Isadora software, and the patch shown was developed by Nicole. The basic system receives a signal through defined channels from the software developed in Max. Based on the channel and signal, Isadora increases or decreases the mixing of a particular visual effect, so as the signal for one visual effect increases, the patch increases the dominance of that effect relative to the others.
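
A rough model of that mixing behaviour, assuming each channel carries a 0-1 intensity (the actual routing lives in Nicole's Isadora patch):

    def mix_levels(channel_signals):
        """Turn per-effect channel signals into relative mix levels."""
        total = sum(channel_signals.values()) or 1.0
        # Each effect's level is its share of the incoming signal, so raising
        # one channel lowers the relative dominance of all the others.
        return {effect: sig / total for effect, sig in channel_signals.items()}

    # e.g. mix_levels({"fog": 0.9, "snow": 0.2, "water": 0.1}) -> fog dominates the mix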

Final Presentation

P1260572.JPG
P1260559.JPG
P1260584.JPG
P1260582.JPG

