Total immersion

From oemagazine, July 2001
By Chrysostomos (Max) Nikias, Alexander Sawchuk, Ulrich Neumann, Dennis McLeod, Roger Zimmermann, and C. C. (Jay) Kuo

By now, the Internet has become an inescapable part of everyday life. What office could run without e-mail, for instance? But imagine adding a new optical component to the system—a three-dimensional, real-time visual display that puts the user in the middle of a scene being transmitted from somewhere else. As high-bandwidth optical communications makes transmission of such scenes possible, our group at the Integrated Media Systems Center (IMSC) at the University of Southern California (USC; Los Angeles, CA) is working on technologies to create total immersion in the scenes.

Panoramic video technology creates a wide-screen view of a large event, such as this USC homecoming game at the Los Angeles Coliseum.

At IMSC, we think a total-immersion technology called Immersipresence will dramatically change our world within this decade, transforming our two-dimensional world of computers, TV, and film into three-dimensional immersive environments in our own living rooms—or practically anywhere else. Through Immersipresence, remote live scenes will be transmitted in 3-D over the Internet to augment the real environment. Transmissions will include graphics and animation. In essence, the real environment will become immersed in the remote environment.

For example, within 10 years, people will shop from their living rooms via an Internet home-shopping channel that will allow them to see and talk to lifelike, full-bodied human representations of store clerks. Within 15 years, they will be able to get a realistic sense of touching and feeling the products. Three-dimensional audio will make it sound as if they are really shopping in a mall. These immersive environments could be a living room, factory floor, office, or classroom. Aided by unobtrusive screens or special glasses, Immersipresence will touch the lives of everyone.

For this new immersive Internet, IMSC is developing such multimedia technologies as advanced interactivity features, video compression, 3-D facial modeling and animation, tracking technology, panoramic video technology, and audio. IMSC has continued to enhance the Media Immersion Environment (MIE), the center's software and hardware infrastructure, as a national test bed for the development and fundamental integration of new multimedia and Internet technologies. The MIE has a robust system infrastructure for the implementation and demonstration of multiple, diverse system applications in an operational environment. Using the MIE, the center is conducting experiments involving the use of IMSC's panoramic video technology and audio for Internet transmission of a concert at the USC Alfred Newman Recital Hall and a USC homecoming game at the Los Angeles Coliseum (see figure).

The MIE test bed spans a wide range of efforts and technologies. Other research efforts that overlap partially with MIE functionalities can be classified broadly into the following categories: tele-immersion, network media experiments, and simulated virtual environments. These categories address subsets of the research issues that the MIE encompasses, and they are sometimes loose collections of related projects that do not address the important integration issues.

tele-immersion

The vision of tele-immersion is to have people interacting with each other over a distance as three-dimensional human representations, or avatars. The vision has been pursued in various ways and forms for more than a decade. In the late 1980s, the Advanced Telecommunications Research Institute International (Kyoto, Japan) created a virtual meeting place where each participant's avatar was visible to the group. In the mid-nineties, a project at the University of California, San Diego modeled a dynamic scene from multiple camera images.

Efforts are underway in a joint project of the University of North Carolina (UNC; Chapel Hill, NC) and the University of Pennsylvania (Penn; Philadelphia, PA), and in a separate project at Carnegie Mellon University (Pittsburgh, PA). The UNC/Penn project is a component of the National Tele-immersion Initiative funded by Advanced Networks and Services (Armonk, NY). Penn developed a six-camera array focused on a user at a desk/display system, while UNC handles the stereo display system and the acquisition of a laser-scanned office background model, as well as haptics (touch-related technologies). To date, one or more users can observe prestored 3-D models of remote offices with live models of people in the foreground.

IMSC's vision differs from these projects because the center pursues the development of an integrated media system, focusing on a much greater range of issues in tele-immersion, including audio. All prior experimental systems have lacked immersive audio, which is a critical component to realizing the full benefits of sensory immersion. Another key part of IMSC's vision of tele-immersion is the integration of haptics technologies, and the center has been making progress in that area. We are developing panoramic video technology for scenes that are not practical to acquire with any 3-D scanner or camera technology. For example, our recent experiments at a football game and blues concert demonstrate the ability to acquire both immersive imagery and audio that allow a remote participant to experience the event. We integrate storage formats, synchronized streaming, and compression into a single system and provide real-time playback through a head-tracked and head-worn viewer.
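As a rough illustration of what head-tracked panoramic playback involves, the sketch below extracts the slice of a 360-degree equirectangular panorama that a viewer facing a given direction would see. It is a minimal sketch in Python; the function name and parameters are illustrative assumptions, not IMSC's actual playback code.

```python
# A minimal sketch of head-tracked panoramic playback, assuming the
# panorama is an equirectangular image held in a NumPy array. The
# function and its parameters are illustrative, not IMSC's real API.
import numpy as np

def extract_viewport(panorama: np.ndarray, yaw_deg: float,
                     fov_deg: float = 90.0) -> np.ndarray:
    """Return the horizontal slice a head-worn viewer pointed at
    `yaw_deg` would see from a 360-degree panoramic frame."""
    height, width, _ = panorama.shape
    # Map the head yaw (0-360 degrees) to a center pixel column.
    center = int((yaw_deg % 360.0) / 360.0 * width)
    half = int(fov_deg / 360.0 * width) // 2
    # Modular indexing wraps the view across the panorama's seam.
    cols = np.arange(center - half, center + half) % width
    return panorama[:, cols, :]

# Example: a 90-degree field of view looking directly "behind".
frame = np.zeros((512, 2048, 3), dtype=np.uint8)  # one panoramic frame
view = extract_viewport(frame, yaw_deg=180.0)
print(view.shape)  # (512, 512, 3)
```

As head-tracking data arrives, the player re-extracts the viewport from each incoming frame, so the full panorama is streamed once regardless of where the viewer looks.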

using the network

An important component of remote immersion is the ability to send and receive session data from all participants via long-distance, high-speed networks. Various research groups have developed and tested network-centered applications within the past few years. The focus of these projects is generally the exploitation of next-generation networking hardware and software. For example, a musical performance at McGill University (Montreal, Quebec, Canada) was transmitted in real time over the Internet to an audience at New York University (New York, NY) in September 1999.1 Researchers at the University of Washington (Seattle, WA) produced the first live high-definition television newscast over the Internet in April 2000. In November 2000, a group of universities, in partnership with Optivision (Palo Alto, CA), recorded a music video at multiple locations using real-time streaming over Internet2.

IMSC's MIE differs from these networking experiments in significant ways. These experiments addressed synchronization issues manually. In contrast, we are developing a system that combines Global Positioning System (GPS) timing with a real-time operating system to overcome synchronization problems. Given the best-effort nature of the Internet, the Network Time Protocol (NTP) can only achieve synchronization to within several hundred milliseconds for participants separated by moderate physical distances, producing noticeable delays. GPS, on the other hand, can maintain microsecond synchronization between any number of locations anywhere in the world.
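To illustrate why a shared, precise clock matters, the following sketch schedules frame presentation against a common session start time: with microsecond-accurate clocks, every site computes essentially the same release instant for each frame. This is a minimal sketch in Python; gps_now() is a hypothetical stand-in for reading a GPS-disciplined clock, not a real library call.

```python
# A minimal sketch of clock-driven stream synchronization. Assumes each
# site can read a GPS-disciplined clock with microsecond accuracy;
# gps_now() is a hypothetical placeholder for that hardware clock.
import time

def gps_now() -> float:
    # On real hardware this would read the GPS-disciplined clock; we
    # fall back to the system clock purely for illustration.
    return time.time()

FRAME_PERIOD = 1.0 / 30.0  # 30 frames per second

def presentation_time(session_epoch: float, frame_index: int) -> float:
    """Absolute time at which every site should present this frame,
    derived from a session start time agreed on in advance."""
    return session_epoch + frame_index * FRAME_PERIOD

def wait_and_present(session_epoch: float, frame_index: int) -> None:
    deadline = presentation_time(session_epoch, frame_index)
    delay = deadline - gps_now()
    if delay > 0:
        time.sleep(delay)  # all sites release the frame together
    # ...hand the frame to the renderer here...
```

Because every site derives the same deadline from the same epoch, the residual skew is bounded by the clocks' accuracy (microseconds with GPS) rather than by network round-trip estimates.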

The MIE architecture also includes specialized storage repositories for all different media types. IMSC has developed a real-time file server as a distributed, scalable continuous media server that addresses support for multiple media streams with different modalities and bandwidth requirements. It delivers media data over standard IP-based networks in real time, and synchronizes multiple media streams, such as panoramic video and audio. In short, the MIE provides a complete media processing architecture to build applications that send and receive the high-speed media streams.
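To make the multi-stream synchronization concrete, here is one way timestamped units from a panoramic video stream and an audio stream could be merged into a single delivery order. This is a minimal sketch; the MediaUnit structure and stream names are illustrative assumptions, not the server's actual data model.

```python
# A minimal sketch of keeping multiple media streams in step: every
# unit carries a presentation timestamp, and delivery is ordered by
# that timestamp. Field names here are illustrative assumptions.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class MediaUnit:
    timestamp: float                     # presentation time in seconds
    stream: str = field(compare=False)   # e.g., "panoramic-video"
    payload: bytes = field(compare=False, default=b"")

def interleave(*streams):
    """Merge several already time-ordered streams into one
    timestamp-ordered delivery sequence."""
    return heapq.merge(*streams)

video = [MediaUnit(t / 30.0, "panoramic-video") for t in range(3)]
audio = [MediaUnit(t / 50.0, "audio") for t in range(5)]
for unit in interleave(video, audio):
    print(f"{unit.timestamp:.3f}s  {unit.stream}")
```

In a real server the merged sequence would feed the network scheduler rather than a print loop, but the principle is the same: a single time base orders heterogeneous streams.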

virtually real

Simulated virtual environments share similarities with immersive environments. SIMNET (an acronym formed from "simulator networking") is a large-scale networked simulation system developed by the U.S. Army and the Defense Advanced Research Projects Agency.2 It links simulators of tanks, helicopters, and airplanes into a realistic cyberspace battlefield. The SIMNET group developed the first standard protocol for distributed interactive simulations.

In another example, game engines offer a high level of realism, interaction, and cooperation in creating virtual 3-D worlds. At IMSC, we are exploring the use of the game engine 3-D GameStudio A4 for an application research project known as BioSIGHT, which is developing an interactive high-school biology curriculum as a prototype for curricula across kindergarten through grade 12. At the same time, we are taking the opportunity to identify integration issues that we can transfer to the MIE architecture as the component software modules mature. In general, IMSC's MIE differs markedly from simulation systems such as SIMNET and from game engines, because the MIE is being developed as a generic development and execution platform, while those systems are constrained by specific requirements and designs.

In order to deal with all of the issues facing researchers in tele-immersion, networking, and virtual environment simulation, the MIE focuses on integration. In addition to acquiring and displaying representations of 3-D scenes with visual and aural content, it addresses data storage, retrieval, indexing, streaming, compression, and network transport issues. Also, because of its broad scope and extensibility, IMSC's MIE allows for the integration of technologies developed by third parties at other universities and commercial enterprises. Addressing all these natural elements of any integrated media system is a truly novel approach for research in media immersion, and it provides a path to the immersive environment of the future. oe

References

1. Xu, A., et al., Journal of the Audio Engineering Society, July-August 2000.

2. Pope, A., The SIMNET Network and Protocols, Technical Report 7102, BBN Systems and Technologies, July 1989.


Chrysostomos (Max) Nikias, Alexander Sawchuk, Ulrich Neumann, Dennis McLeod, Roger Zimmermann, and C. C. (Jay) Kuo

The authors are with the Integrated Media Systems Center at the University of Southern California, Los Angeles, CA. 

