ETHERIAL – Quantum Form from the Virtual to the Material.



Quantum physics, the nature and behavior of matter and energy at the atomic and subatomic level, lends itself to the ethereal: gossamer wings, substances that are immutable yet untouchable. To touch the untouchable; to understand, know, and experience what is real but cannot be seen; to truly experience immateriality as substance, form, and shape that is dynamic, transformative, and truly alive, constantly changing yet continually unchanged: the vibration of waveforms intermingling as one form, one shape, one spirit, into a myriad of forms.

ETHERIAL will bring the quantum form into the material through virtual reality, spatial augmented reality, and material form. The work consists of both virtual reality and spatial augmented reality: a completely immersive VR space that allows viewers to interact with the virtual world of quantum mechanics in real time through a 3D immersive, multi-user projection space with an interactive floor projection system. An HMD system facilitates viewing the installation disembodied, while physically immersed viewers interact through the floor projection system. A virtual sculpture of the probability wave functions of the electron can be viewed in the HMD, with the further possibility of a physically rendered sculpture that could be tracked with gestural sensors. These are two windows into a completely immersive VR space: one interactive, multi-user, and physically embodied; the other disembodied, physically ungrounded, offering a view of the narrative from a completely different perspective.

In keeping with the narrative of light, the quantum revealed: the hydrogen-like atom combinations feature light-emitting wave-function combinations that move toward the science of the phenomenon, while the quantum suggests the ethereal nature of spirit in the form of light, ETHEREAL/IMMUTABLE, to touch the untouchable.

Type of project:

Virtual Reality Interactive Installation

Etherial, project documentation. June 2019.

Year the project was created:

2019

Many thanks to the members of the AlloSphere Research Group for the design and development of the AlloLib software.

Support received from:

This research was possible thanks to the support of Robert W. Deutsch Foundation, The Mosher Foundation, the National Science Foundation under Grant numbers 0821858, 0855279, and 1047678, and the U.S. Army Research Laboratory under MURI Grant number W911NF-09-1-0553.


immersive multi-modal installation, interactive immersive multimedia art work, immersive multi-modal multimedia system installation


The AlloLib software system is open-source software designed by the AlloSphere Research Group and licensed by the Regents of the University of California, and is available on GitHub.

The current AlloLib libraries provide a toolkit for writing cross-platform audio/visual applications in C++, with tools to assist the synchronization of sound with graphics rendering required for distributed performance, interactive control, and a rich set of audio spatializers. The AudioScene infrastructure allows the use of vector-based amplitude panning (VBAP), distance-based amplitude panning (DBAP), or higher-order ambisonics as the backend diffusion technique (McGee 2016).

The Gamma library provides a set of C++ unit-generator classes with a focus on the design of audiovisual systems (Putnam 2014). Gamma allows synchronization of generators and processors to different “domains,” permitting them to be used at frame rate or audio rate in different contexts; that is, there is no need to downsample signals for the visuals, as the signals and processes can run at video rate within the graphics functions. The Gamma application programming interfaces (APIs) are also consistent with the rest of the graphics software, which makes learning and integrating the systems simpler.

Generic spatializers and panners designed around the concept of scenes with sources and listeners are employed. Similar to the way a 3D graphics scene is described to the graphics renderer, which can then render through different “cameras” with different perspectives, an audio scene can be heard by multiple “listeners” using different spatialization techniques and perspectives to render the sound.

The AlloLib library, which is designed to support distributed graphics rendering, can also be readily used for distributed audio rendering, taking advantage of the rendering cluster’s computing power for computationally intensive audio processing. This is potentially useful for sonification of extremely large data sets, where the synthesis of each sound agent can be performed in parallel on separate machines, allowing for rendering that is both more complex and more nuanced.
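The DBAP backend mentioned above can be illustrated with a minimal, self-contained C++ sketch of a DBAP-style gain computation: each loudspeaker's gain falls off with its distance from the virtual source, and the gains are normalized to constant total power. This is an illustration of the technique only, not AlloLib's actual implementation; the function and type names are hypothetical.

```cpp
#include <cmath>
#include <vector>

struct Vec2 { double x, y; };

// Distance-based amplitude panning (DBAP): compute one gain per speaker,
// falling off with distance from the source, normalized so that the sum
// of squared gains equals one (constant total power).
std::vector<double> dbapGains(const Vec2& src,
                              const std::vector<Vec2>& speakers,
                              double rolloffDb = 6.0) {
    // Convert rolloff (dB per doubling of distance) to a distance exponent.
    double a = rolloffDb / (20.0 * std::log10(2.0));
    std::vector<double> gains(speakers.size());
    double norm = 0.0;
    for (std::size_t i = 0; i < speakers.size(); ++i) {
        double dx = src.x - speakers[i].x;
        double dy = src.y - speakers[i].y;
        double d = std::sqrt(dx * dx + dy * dy) + 1e-9; // avoid divide-by-zero
        gains[i] = 1.0 / std::pow(d, a);
        norm += gains[i] * gains[i];
    }
    norm = std::sqrt(norm);
    for (double& g : gains) g /= norm;
    return gains;
}
```

With two speakers at (-1, 0) and (1, 0), a source at the origin yields equal gains whose squares sum to one; moving the source toward one speaker raises that speaker's gain while total power stays constant.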
The system is completely multi-user and interactive.
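The “domains” idea described above, where one unit generator can tick at either audio rate or frame rate depending on context, can be sketched as follows. This is an illustrative example and not Gamma's actual API; the class name is hypothetical.

```cpp
#include <cmath>

// Illustrative sketch (not Gamma's actual API): a phase-accumulating sine
// unit generator that can be bound to any "domain" by setting that domain's
// sample rate, e.g. the audio rate in a sound callback or the frame rate
// in a draw loop, with no resampling between the two.
class SineGen {
public:
    // Bind the generator to a domain by giving it that domain's sample rate.
    void domain(double sampleRate) { sr = sampleRate; recompute(); }
    void freq(double hz) { f = hz; recompute(); }
    // Produce one sample and advance the phase.
    double operator()() {
        double s = std::sin(phase);
        phase += inc;
        if (phase > kTwoPi) phase -= kTwoPi;
        return s;
    }
private:
    static constexpr double kTwoPi = 6.283185307179586;
    void recompute() { inc = kTwoPi * f / sr; }
    double sr = 44100.0;
    double f = 440.0;
    double phase = 0.0;
    double inc = kTwoPi * 440.0 / 44100.0;
};
```

The same instance could be ticked in an audio callback after `domain(audioRate)` or in a graphics update after `domain(frameRate)`; only the phase increment changes, so the signal stays consistent across domains.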


Etherial 2019 Installation Technical Equipment & Space Requirements

I. Technical Equipment Requirements:

II. Space Requirements

III. Installation Graphic Document Summary:

IV. Artists