Dr. JoAnn Kuchera-Morin, chief designer of the facility, a composer and media artist with over thirty-five years of experience in media systems engineering, outlines the vision for our research.
MYRIOI The Shared Immersive VR Experience: Intuitive, Intrinsic, Instinctive
The AlloLib software system is the underlying media infrastructure that drives the instrument: an integrated collection of C++ libraries for interactive immersive computation and visualization/sonification, together with a reactive system connected to the Python scripting language and the Jupyter notebook. The system supports a complete design workflow from the HPC cluster or the cloud to the desktop. Licensed by the Regents of the University of California, AlloLib is the open-source software core for distributed high-performance multimedia computing.
MYRIOI is an interactive, immersive, shared narrative that drives the further evolution of our AlloPortal/AlloLib instrument/environment for artistic/scientific content exploration. The AlloPortal seamlessly ties full-FOV, unencumbered, projection-based multi-user VR to HMDs, so that groups of people immersed in the same world share an embodied/dis-embodied experience. That world is a further evolution of our hydrogen-like-atom application: evolving this quantum mechanical system through the interactive/immersive composition MYRIOI drove the extension of AlloPortal to intricate SAR control for a robust multi-user shared experience and to an expanded FOV encompassing the environment.
What would it be like to have a shared VR experience and to be present, to really feel presence together, in immersive worlds unimagined, from the atomic to the cosmic? AlloPortal is an immersive instrument/installation built on our research in designing the AlloSphere instrument and AlloLib software. We implement various versions of the AlloSphere instrument according to the installation environment provided and demonstrate our narrative, MYRIOI ("innumerable," "myriad particles"): to share the experience of being immersed in and interacting with myriads of particles that create currents and become waveforms, and to understand and viscerally experience the quantum. This shared experience allows a group of users to see themselves and each other and to passively experience or interact with the world of the quantum: waveforms, light, the pure essence of form and shape. The distributed projection system also connects to an HMD for a dis-embodied shared experience of the same narrative. The HMD becomes the intersection of the flow of dynamic form, dynamically moving virtual sculpture, and the fabrication of the prominent theme in material form.
AlloPortal will be constructed for this immersive, shared experience, and MYRIOI will extend the SAR construction of the installation to include robust tracking for multi-user manipulation of the narrative. The FOV will be expanded for full peripheral immersion, essential for "feeling present" in the immersive world: it must be expansive enough to convince a group that they are experiencing, and are "in," the same world. The dis-embodied experience can be viewed by several people if more than one HMD is connected to the distributed system, and it lets the HMD user explore each form and shape intricately, from a very different vantage than the group experiencing the projection-based narrative. We present our studies in composing elementary wavefunctions of a hydrogen-like atom and identify several relationships between physical phenomena and musical composition that helped guide the process. The hydrogen-like atom accurately describes some of the fundamental quantum mechanical phenomena of nature and supplies the composer with a set of well-defined mathematical constraints that can create a wide variety of complex spatiotemporal patterns. We explore the visual appearance of time-dependent combinations of two and three eigenfunctions of an electron with spin in a hydrogen-like atom, highlighting the resulting symmetries and symmetry changes.
MYRIOI will take these wavefunction combinations to the highest level of counterpoint: myriads of particles forming waves of light, interactively and immersively visualized and experienced by a shared community, present and active in a world they could only experience in an instrument/environment built solely for the unencumbered group-user experience of a virtual world.
Type of the project:
Virtual Reality Interactive Installation including an HMD Version for SIGGRAPH 2020
MYRIOI: Project Documentation, 2019–2020.
Year the project was created:
2019-2020
Credits:
Many thanks to the members of the AlloSphere Research Group for the design and development of the AlloLib software.
Support received from:
This research was possible thanks to the support of the Robert W. Deutsch Foundation, The Mosher Foundation, the National Science Foundation under Grant numbers 0821858, 0855279, and 1047678, and the U.S. Army Research Laboratory under MURI Grant number W911NF-09-1-0553.
Keywords:
immersive multi-modal installation, interactive immersive multimedia artwork, immersive multi-modal multimedia system installation
Software:
The AlloLib software system is open-source software designed by the AlloSphere Research Group, licensed by the Regents of the University of California, and available on GitHub (https://github.com/AlloSphere-Research-Group/allolib).
The current AlloLib libraries provide a toolkit for writing cross-platform audio/visual applications in C++, with tools to assist the synchronization of sound with graphics rendering required for distributed performance, interactive control, and a rich set of audio spatializers. The AudioScene infrastructure allows the use of vector-based amplitude panning (VBAP), distance-based amplitude panning (DBAP), or higher-order Ambisonics as the backend diffusion techniques (McGee 2016).

The Gamma library provides a set of C++ unit-generator classes with a focus on the design of audiovisual systems (Putnam 2014). Gamma allows synchronization of generators and processors to different "domains," permitting them to be used at frame or audio rate in different contexts; there is no need to downsample signals for the visuals, as the signals and processes can run at video rate within the graphics functions. The Gamma application programming interfaces (APIs) are also consistent with the rest of the graphics software, which makes learning and integrating the systems simpler.

Generic spatializers and panners designed around the concept of scenes with sources and listeners are employed. Just as a 3-D graphics scene, once described to the renderer, can be rendered through different "cameras" with different perspectives, an audio scene can be listened to by multiple "listeners" using different spatialization techniques and perspectives to render the sound.

AlloLib, which is designed to support distributed graphics rendering, can also be readily used for distributed audio rendering, taking advantage of the rendering cluster's computing power for computationally intensive audio processing. This is potentially useful for sonification of extremely large data sets, where the synthesis of each sound agent can be performed in parallel on separate machines, allowing rendering that is both more complex and more nuanced.
The system is completely multi-user and interactive.
Hardware:
MYRIOI Installation Technical Equipment & Space Requirements (Abbreviated)
I. Space Requirements
A. Room
- Minimum 5m x 10m
- Dark
Required to ensure immersiveness of the Main display
Exterior windows need to be blocked
Minimal to no lighting (Barely enough to move around)
Light needs to be blocked from adjacent installations
- Adequate Sound Insulation
Piece may get loud at times
Audio portion of the piece needs to not interfere with audio from nearby pieces or vice versa
High enough walls on all sides of the installation
B. Main Screens (3)
- White wall/panel
- Minimum size 5m x 3m (16:9 aspect ratio, but can be adjusted)
- Side to side (Fill the whole wall)
The screen also needs to be high enough that visual distractions above it do not break immersion. Preferably floor to ceiling.
C. Floor Projection Surfaces (3)
- Non-reflective. Might require a tarp or similar to be put on the floor if the floor is reflective
II. Installation Graphic Document Summary:
Final Installation Design for SIGGRAPH 2020 – (6) Surfaces
III. Technical Equipment Requirements:
A. 3D Glasses & Emitter & Sync cable
- Example Product: XPAND X105-RF-X1 glasses and XPAND AD025 RF X1 emitters
- RF Signal (IR variants can interfere with HTC Vive and Kinects)
- 20+ glasses ready for viewers (It might be important to have backups in case batteries run out or glasses break)
- Sync cable connecting to the main stereo projector (10m, Cable type may vary depending on projector/3D glass system, but usually a 3-pin DIN or BNC)
- Small table at the entrance to place/manage glasses
B. Six (6) Projectors & Display Cables
- One Stereo-capable Projector (Main projection onto the wall, requires a sync signal output to connect to the 3D glass system)
1920 x 1200, 60Hz Stereo Rendering = 120Hz Scanrate
Elevated or Ceiling mounted
Needs to be high enough to avoid casting shadows on the display
High lumens (at least 5000, preferably 8000+)
Projection must cover the whole screen
C. Five (5) computers (including the main simulator, HTC Vive renderer, and interaction server)
- Needs to be encased with ventilation to block off access and reduce noise during show. If cabling allows, they can be located in an adjacent room.
- Placement of the machines might need to change depending on the projector/cable situation
- Wireless Keyboard and Mouse
D. Network System
- Ethernet cable connections between the computers (1Gbps)
- All connected to the same network switch
E. Audio system
2- or 8-channel high-quality loudspeakers (ceiling mounted or on appropriate pedestals)
F. HTC Vive
- HTC Vive Pro preferred
G. Microsoft Kinect
IV. Artists
Lead Artist – Composer, Dr. JoAnn Kuchera-Morin, creator of MYRIOI, is Director and Chief Scientist of the AlloSphere Research Facility http://www.allosphere.ucsb.edu/ and Professor of Media Arts and Technology and Music in the California NanoSystems Institute at the University of California, Santa Barbara (UCSB). Her research focuses on creative computational systems, content, and facilities design. Her 35 years of experience in digital media research led to the creation of a multi-million-dollar sponsored research program for the University of California, the Digital Media Innovation Program, of which she was Chief Scientist from 1998 to 2003. The culmination of her creativity and research is the AlloSphere, a 30-foot-diameter, 3-story-high metal sphere inside an echo-free cube, designed for immersive/interactive scientific/artistic investigation of multi-dimensional data sets. https://allosphere.ucsb.edu/kuchera-morin/
Andrés Cabrera – Distributed Multimedia Software Design; AlloSphere Media Systems Engineer, AlloSphere Research Facility, University of California, Santa Barbara; Ph.D. in Music Technology, Queen’s University Belfast, Belfast, Ireland. Cabrera’s expertise includes 3D spatial audio and multimedia systems design.
http://www.allosphere.ucsb.edu/html/people.html.
Kon Hyong Kim – Graphics Researcher/Artist, Calibration, Spatial Augmented Reality (SAR) Research, Creator of the SAR Installation Environment. Kon Hyong Kim is currently a Ph.D. Candidate in the MAT and a member of the AlloSphere Research Group. Kim’s research focuses on the analysis and application of advanced graphics rendering and calibration techniques, including the integration of Spatial Augmented Reality and Virtual Reality technologies. https://konhyong.wordpress.com/.
Tim Wood – Human Computer Interaction Design, Spatial Augmented Reality (SAR) Research, & User Interface. Tim Wood is currently a Ph.D. Candidate in MAT and a member of the AlloSphere Research Group. Wood’s research focuses on full-body interaction and immersive systems. http://fishuyo.com
Gustavo Rincon – Architectural Design Researcher/Media Artist, Fabricator, Graphics Immersive Artist, Creator of the material-rendered sculpture. Gustavo Alfonso Rincon was educated as an architect and visual artist and is currently a Ph.D. student in the Graduate Program in Media Arts and Technology (MAT) at UCSB and a member of the AlloSphere Research Group. He holds a Master’s degree in Architecture/Urban Design from UCLA as well as a Master of Fine Arts from the California Institute of the Arts. http://w2.mat.ucsb.edu/grincon
Exhibitions
2020
International Exhibitions
SIGGRAPH 2020 – Think Beyond, Washington, D.C., USA, Virtual Exhibition
National Exhibitions
AlloPortal & AlloSphere - California NanoSystems Institute - UCSB, USA
Publications
Putnam, Lance Jonathan; Kuchera-Morin, JoAnn; Peliti, Luca. "Studies in Composing Hydrogen Atom Wavefunctions." Leonardo: Journal of the International Society for the Arts, Sciences and Technology. In press, 2014.
Kuchera-Morin, JoAnn. "Performing in Quantum Space: A Creative Approach to N-Dimensional Computing." Leonardo 44.5 (2011): 462-463.
Putnam, Lance Jonathan; Wakefield, Graham; Ji, Haru; Alper, Basak; Adderton, Dennis; Kuchera-Morin, JoAnn. "Immersed in Unfolding Complex Systems" Beautiful Visualization: Looking at Data through the Eyes of Experts. ed. / Julie Steele; Noah Iliinsky. O'Reilly Media, Inc., 2010. p. 291-309.